WorldWideScience

Sample records for analysis techniques applied

  1. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature of cyberspace and the abuse of the anonymity it affords, tracing criminal identities is difficult in cybercrime investigations. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG-seeded GA-based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author-level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed features and techniques are evaluated on a real-world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50 authors. Compared with the baseline technique (Support Vector Machine), IGAE achieves higher performance, with a 2% and 8% improvement over SVM for 20 and 50 authors, respectively. Moreover, IGAE has been shown to be more scalable in the number of authors than methods based on author-group-level features.
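
    For orientation, a minimal sketch of the SVM baseline named above: authorship attribution with variable-length character n-gram features in scikit-learn. The toy corpus, labels and parameter choices are illustrative assumptions, not the paper's IGAE implementation.

```python
# Sketch of the SVM baseline: character n-gram writeprint features + linear SVM.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Hypothetical corpus: review texts paired with author identifiers.
texts = ["Great product, arrived quickly and works fine.",
         "Would not buy again, the packaging was awful.",
         "Five stars, works exactly as described.",
         "Terrible packaging, item arrived damaged."]
authors = ["author_a", "author_b", "author_a", "author_b"]

# Character n-grams (here lengths 2-4) approximate the writeprint features.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LinearSVC())
model.fit(texts, authors)
print(model.predict(["Arrived quickly, five stars from me."]))
```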

  2. Image analysis technique applied to lock-exchange gravity currents

    Science.gov (United States)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with a horizontal smooth bed, and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image, relating the amount of dye uniformly distributed in the tank to the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
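
    The per-pixel calibration idea lends itself to a compact sketch: fit a greyscale-versus-concentration line for every pixel from calibration images, invert new frames, and rescale so the total dye mass is conserved. The array shapes, the linear fit and the synthetic data are assumptions for illustration.

```python
# Sketch: per-pixel calibration of greyscale to dye concentration (density).
import numpy as np

# Calibration stack: k images of the tank, each with a uniform known
# dye concentration (arbitrary units).
known_conc = np.array([0.0, 0.25, 0.5, 1.0])
rng = np.random.default_rng(0)
calib = (0.8 - 0.5 * known_conc[:, None, None]
         + 0.01 * rng.standard_normal((4, 240, 320)))

# Per-pixel linear fit grey = a*conc + b, vectorised over all pixels.
a, b = np.polyfit(known_conc, calib.reshape(4, -1), 1)

# Invert a new frame to a concentration field.
frame = 0.8 - 0.5 * 0.4 + 0.01 * rng.standard_normal((240, 320))
conc = ((frame.reshape(-1) - b) / a).reshape(240, 320)

# Global correction: rescale so the total dye mass matches the known
# amount, i.e. enforce mass conservation within the tank.
total_mass = 0.4 * conc.size          # hypothetical true total
conc *= total_mass / conc.sum()
print(conc.mean())                    # ~0.4
```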

  3. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis can play in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  4. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)]

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  5. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Mário Jorge Rodrigues Pereira da

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  6. Applying contemporary statistical techniques

    CERN Document Server

    Wilcox, Rand R

    2003-01-01

    Applying Contemporary Statistical Techniques explains why traditional statistical methods are often inadequate or outdated when applied to modern problems. Wilcox demonstrates how new and more powerful techniques address these problems far more effectively, making these modern robust methods understandable, practical, and easily accessible. * Assumes no previous training in statistics * Explains how and why modern statistical methods provide more accurate results than conventional methods * Covers the latest developments on multiple comparisons * Includes recent advances...

  7. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT raised in a recent media article, in order to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skill development focused on includes: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made.

  8. Applied ALARA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Waggoner, L.O.

    1998-02-05

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. After the last reactor plants and processing facilities were shut down, Hanford was given a new mission to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes clean-up of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that, in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  9. Wavelets, Curvelets and Multiresolution Analysis Techniques Applied to Implosion Symmetry Characterization of ICF Targets

    CERN Document Server

    Afeyan, Bedros; Starck, Jean Luc; Cuneo, Michael

    2012-01-01

    We introduce wavelets, curvelets and multiresolution analysis techniques to assess the symmetry of X-ray driven imploding shells in ICF targets. After denoising images produced by X-ray backlighting, we determine the Shell Thickness Averaged Radius (STAR) of maximum density, r*(N, θ), where N is the percentage of the shell thickness over which to average. The non-uniformities of r*(N, θ) are quantified by a Legendre polynomial decomposition in angle, θ. Undecimated wavelet decompositions outperform decimated ones in denoising, and both are surpassed by the curvelet transform. In each case, hard thresholding based on noise modeling is used. We have also applied combined wavelet and curvelet filter techniques with variational minimization as a way to select the significant coefficients. Gains are minimal over curvelets alone in the images we have analyzed.
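
    The two processing steps named in the abstract can be sketched with PyWavelets and NumPy: a (decimated) 2-D wavelet decomposition with hard thresholding, followed by a Legendre decomposition in angle of a radius profile r*(θ). The noise estimate, test image and profile are illustrative assumptions.

```python
# Sketch: hard-thresholded wavelet denoising + Legendre decomposition in angle.
import numpy as np
import pywt

rng = np.random.default_rng(1)
image = rng.normal(0.0, 0.1, (128, 128))
image[32:96, 32:96] += 1.0              # stand-in for a backlit shell image

# Decimated 2-D wavelet transform with hard thresholding (the paper finds
# undecimated transforms, and curvelets, perform even better).
coeffs = pywt.wavedec2(image, "db4", level=3)
sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745   # MAD noise estimate
thresh = 3.0 * sigma
denoised = [coeffs[0]] + [
    tuple(pywt.threshold(c, thresh, mode="hard") for c in detail)
    for detail in coeffs[1:]]
image_dn = pywt.waverec2(denoised, "db4")

# Legendre decomposition in angle of a hypothetical radius profile r*(theta).
theta = np.linspace(0.0, np.pi, 181)
r_star = 1.0 + 0.05 * np.cos(theta) ** 2             # stand-in profile
coefs = np.polynomial.legendre.legfit(np.cos(theta), r_star, deg=4)
print(coefs[:3])                                     # P0, P1, P2 amplitudes
```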

  10. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Science.gov (United States)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.
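
    As a hint of the surface-metrology quantities involved, here is a minimal sketch computing two conventional areal texture parameters (Sa and Sq) from a measured height map; the random height map is a stand-in for confocal laser scanning microscopy data.

```python
# Sketch: conventional areal texture parameters from a surface height map.
import numpy as np

rng = np.random.default_rng(2)
z = rng.normal(0.0, 0.5, (256, 256))   # heights in micrometres (assumed)

z = z - z.mean()                       # reference to the mean plane
sa = np.mean(np.abs(z))                # arithmetical mean height, Sa
sq = np.sqrt(np.mean(z ** 2))          # root-mean-square height, Sq
print(f"Sa = {sa:.3f} um, Sq = {sq:.3f} um")
```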

  11. [Application and prospects of quantifying DNA content by image analysis technique in determining the postmortem interval].

    Science.gov (United States)

    Wang, Cheng-yi; Liu, Liang

    2002-02-01

    Image Analysis Technique (IAT) was developed in the 1950s; it quantifies changes across an image by sampling, processing, quantifying, computing and analyzing the image information, and has since become a standard quantitative technique in biological and medical research. In the present paper, we briefly review the principle of quantifying DNA content by IAT, the degradation pattern of DNA in the nucleus, and the prospects of this method for determining the postmortem interval (PMI) in forensic pathology.

  12. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    2010-01-01

    Basic text for graduate and advanced undergraduate students deals with the search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics are devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, and more.

  13. Downscaling Statistical Model Techniques for Climate Change Analysis Applied to the Amazon Region

    Directory of Open Access Journals (Sweden)

    David Mendes

    2014-01-01

    Full Text Available The Amazon is an area covered predominantly by dense tropical rainforest with relatively small inclusions of several other types of vegetation. In recent decades, scientific research has suggested a strong link between the health of the Amazon and the integrity of the global climate: tropical forests and woodlands (e.g., savannas) exchange vast amounts of water and energy with the atmosphere and are thought to be important in controlling local and regional climates. Considering the importance of the Amazon biome for global climate change impacts, the role of protected areas in the conservation of biodiversity, and the state of the art of downscaling techniques based on Artificial Neural Networks (ANN), this work calibrates and runs an ANN-based downscaling model applied to the Amazon region in order to obtain regional and local climate predictions (e.g., precipitation). The ANN output shows good agreement with observations in the cities of Belém and Manaus, with correlations of approximately 88.9% and 91.3%, respectively, and a good representation of the spatial distribution, especially after the correction process.
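
    A minimal sketch of ANN-based statistical downscaling in the spirit of the abstract: a small multilayer perceptron maps coarse-scale predictors to station precipitation, and skill is reported as a correlation. The synthetic data and network size are assumptions.

```python
# Sketch: ANN downscaling of coarse predictors to local precipitation.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
X = rng.normal(size=(600, 5))                  # coarse-model predictors (stand-in)
y = 2.0 * X[:, 0] - X[:, 1] + 0.3 * rng.normal(size=600)  # station rainfall proxy

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                   random_state=0))
model.fit(X[:480], y[:480])                    # calibrate on the first 80%
pred = model.predict(X[480:])                  # validate on the rest
print("correlation:", np.corrcoef(pred, y[480:])[0, 1])
```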

  14. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of longitudinal data...

  15. Hyphenated GC-FTIR and GC-MS techniques applied in the analysis of bioactive compounds

    Science.gov (United States)

    Gosav, Steluta; Paduraru, Nicoleta; Praisler, Mirela

    2014-08-01

    The drugs of abuse, which affect human nature and cause numerous crimes, have become a serious problem throughout the world. There are hundreds of amphetamine analogues on the black market. They consist of various alterations of the basic amphetamine molecular structure, which are not yet included in the lists of forbidden compounds although they retain or only slightly modify the hallucinogenic effects of their parent compound. It is this great variety that makes their identification quite a challenge. A number of analytical procedures for the identification of amphetamines and their analogues have recently been reported. We present the profile of the main hallucinogenic amphetamines obtained with the hyphenated techniques that are recommended for the identification of illicit amphetamines, i.e. gas chromatography combined with mass spectrometry (GC-MS) and gas chromatography coupled with Fourier transform infrared spectrometry (GC-FTIR). The infrared spectra of the analyzed hallucinogenic amphetamines present some absorption bands (1490 cm-1, 1440 cm-1, 1245 cm-1, 1050 cm-1 and 940 cm-1) that are very stable in position and shape, while their intensity depends on the side-chain substitution. The specific ionic fragment of the studied hallucinogenic compounds is the 3,4-methylenedioxybenzyl cation (m/e = 135), which has a small relative abundance (less than 20%). The complementarity of the above-mentioned techniques for the identification of hallucinogenic compounds is discussed.

  16. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual

    International Nuclear Information System (INIS)

    A Principal Components Analysis (PCA) has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained
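
    The PCA workflow described can be sketched briefly: standardize the multivariate measurements, project onto a few principal components, and flag outliers by their distance in component space. The synthetic data stands in for the NURE aerial radiometric variables.

```python
# Sketch: PCA reduction of multivariate survey data with a simple outlier flag.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(4)
data = rng.normal(size=(1000, 6))     # e.g. K, U, Th channels and ratios (assumed)

scores = PCA(n_components=2).fit_transform(
    StandardScaler().fit_transform(data))

# Flag points far from the origin in principal-component space.
dist = np.sqrt((scores ** 2).sum(axis=1))
print("outliers:", np.where(dist > dist.mean() + 3 * dist.std())[0])
```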

  17. Digital image processing techniques applied to pressure analysis and morphological features extraction in footprints.

    Science.gov (United States)

    Buchelly, F. J.; Mayorca, D.; Ballarín, V.; Pastore, J.

    2016-04-01

    This paper shows the correlation between foot morphology and pressure distribution on the sole of the foot by means of a morphological parameter analysis and pressure calculation. Footprint images were acquired using an optical pedobarograph and then processed to obtain binary masks and grey-scale intensity images. Morphological descriptors were obtained from the binary images, and the Hernandez Corvo (HC) index was automatically calculated to determine the type of foot. Pressure distributions were obtained from the grey-scale images by establishing a correspondence between light intensity in the footprints and pressure. The pressure analysis consisted of finding the maximum pressure, the mean pressure, and the ratio between them, which determines the uniformity of the distribution. Finally, a high correlation was found between this ratio and the type of foot determined by the HC index.
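
    The pressure step admits a small sketch: map footprint greyscale intensity to pressure and form the maximum-to-mean ratio used as a uniformity measure. The linear intensity-to-pressure factor and the random image are assumptions.

```python
# Sketch: pressure statistics from a greyscale footprint image.
import numpy as np

rng = np.random.default_rng(5)
grey = rng.integers(0, 256, size=(200, 120))   # stand-in footprint image
mask = grey > 40                               # binary footprint mask

pressure = grey[mask].astype(float) * 0.01     # assumed kPa per grey level
p_max, p_mean = pressure.max(), pressure.mean()
print("max/mean pressure ratio:", p_max / p_mean)
```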

  18. Morphological analysis of the flippers in the Franciscana dolphin, Pontoporia blainvillei, applying X-ray technique.

    Science.gov (United States)

    Del Castillo, Daniela Laura; Panebianco, María Victoria; Negri, María Fernanda; Cappozzo, Humberto Luis

    2014-07-01

    Pectoral flippers of cetaceans function to provide stability and maneuverability during locomotion. Directional asymmetry (DA) is a common feature among odontocete cetaceans, as is sexual dimorphism (SD). For the first time, DA, allometry, physical maturity, and SD of the flipper skeleton--by X-ray technique--of Pontoporia blainvillei were analyzed. The number of carpals, metacarpals and phalanges, and morphometric characters of the humerus, radius, ulna, and digit two were studied in franciscana dolphins from Buenos Aires, Argentina. The number of visible epiphyses and their degree of fusion at the proximal and distal ends of the humerus, radius, and ulna were also analyzed. The flipper skeleton was symmetrical, showing a negative allometric trend, with similar growth patterns in both sexes with the exception of the width of the radius (P ≤ 0.01). SD was found in the number of phalanges of digit two (P ≤ 0.01) and in the ulna and digit two lengths. Females showed a greater relative ulna length and a shorter relative digit two length, and the opposite occurred in males (P ≤ 0.01). The epiphyseal fusion pattern proved to be a tool to determine the dolphins' age; franciscana dolphins with a mature flipper were at least four years old. This study indicates that the flippers of franciscana dolphins are symmetrical; both sexes show a negative allometric trend; SD is observed in the radius, ulna, and digit two; and the flipper skeleton allows the age class of the dolphins to be determined. PMID:24700648

  19. SOIL SEALING AND LAND USE CHANGE DETECTION APPLYING GEOGRAPHIC OBJECT BASED IMAGE ANALYSIS (GEOBIA) TECHNIQUE

    OpenAIRE

    Rabia Hammad, Ahmed Mohamed Harb

    2013-01-01

    Land use and land cover change analysis is now a mature area of study but it is still important to monitor these changes and their subsequent impacts on ecosystem functions. The rate of Land use and land cover change is much larger than ever recorded previously, with quick changes to ecosystems taking place at local to global scales. The functions of an ecosystem can be significantly impacted by changes in land use and land cover, which in turn critically affect the provision, regulation and ...

  20. A comparison of quantitative reconstruction techniques for PIXE-tomography analysis applied to biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Beasley, D.G., E-mail: dgbeasley@ctn.ist.utl.pt [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Alves, L.C. [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Barberet, Ph.; Bourret, S.; Devès, G.; Gordillo, N.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Le Trequesser, Q. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Institut de Chimie de la Matière Condensée de Bordeaux (ICMCB, UPR9048) CNRS, Université de Bordeaux, 87 avenue du Dr. A. Schweitzer, Pessac F-33608 (France); Marques, A.C. [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Silva, R.C. da [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal)

    2014-07-15

    The tomographic reconstruction of biological specimens requires robust algorithms, able to deal with low density contrast and low element concentrations. At the IST/ITN microprobe facility new GPU-accelerated reconstruction software, JPIXET, has been developed, which can significantly increase the speed of quantitative reconstruction of Proton Induced X-ray Emission Tomography (PIXE-T) data. It has a user-friendly graphical user interface for pre-processing, data analysis and reconstruction of PIXE-T and Scanning Transmission Ion Microscopy Tomography (STIM-T). The reconstruction of PIXE-T data is performed using either an algorithm based on a GPU-accelerated version of the Maximum Likelihood Expectation Maximisation (MLEM) method or a GPU-accelerated version of the Discrete Image Space Reconstruction Algorithm (DISRA) (Sakellariou (2001) [2]). The original DISRA, its accelerated version, and the MLEM algorithm, were compared for the reconstruction of a biological sample of Caenorhabditis elegans – a small worm. This sample was analysed at the microbeam line of the AIFIRA facility of CENBG, Bordeaux. A qualitative PIXE-T reconstruction was obtained using the CENBG software package TomoRebuild (Habchi et al. (2013) [6]). The effects of pre-processing and experimental conditions on the elemental concentrations are discussed.
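
    The MLEM method named above has a compact core; here is a sketch of its multiplicative update for a toy system matrix A with projections y = A x. Sizes and data are illustrative, and real PIXE-T reconstruction adds X-ray attenuation corrections.

```python
# Sketch: Maximum Likelihood Expectation Maximisation (MLEM) iteration.
import numpy as np

rng = np.random.default_rng(6)
A = rng.random((50, 30))              # toy projection (system) matrix
x_true = rng.random(30)
y = A @ x_true                        # noiseless sinogram for the demo

x = np.ones(30)                       # uniform non-negative start
norm = A.T @ np.ones(50)              # sensitivity image
for _ in range(200):
    x *= (A.T @ (y / (A @ x))) / norm # multiplicative MLEM update step
print("relative error:",
      np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```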

  1. Analysis and Design of International Emission Trading Markets Applying System Dynamics Techniques

    Science.gov (United States)

    Hu, Bo; Pickl, Stefan

    2010-11-01

    The design and analysis of international emission trading markets is an important current challenge. Time-discrete models are needed to understand and optimize these procedures. We give an introduction to this scientific area and present current modeling approaches. Furthermore, we develop a model which is embedded in a holistic problem solution. Measures for energy efficiency are characterized. The economic time-discrete "cap-and-trade" mechanism is influenced by various underlying anticipatory effects. These effects can be examined with a systematic dynamic approach. First numerical results show that fair international emissions trading can only be conducted with the use of protective export duties. Furthermore, a comparatively high price which evokes emission reduction inevitably has an inhibiting effect on economic growth according to our model. As has always been expected, it is not without difficulty to find a balance between economic growth and emission reduction. Simulation with our System Dynamics model suggests that substantial changes must take place before international emissions trading markets can contribute to global GHG emissions mitigation.
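
    Purely for illustration, a time-discrete toy in the spirit of a System Dynamics cap-and-trade model: a tightening cap creates permit scarcity, the price responds to scarcity, and a high price damps output growth. All equations and parameters are invented assumptions, not the authors' model.

```python
# Illustrative time-discrete cap-and-trade feedback loop (toy model).
emissions, output, price = 100.0, 100.0, 5.0
cap = 100.0
for year in range(10):
    cap *= 0.97                                  # cap tightens 3 %/yr
    scarcity = max(emissions - cap, 0.0) / cap   # permit shortage
    price = max(price + 20.0 * scarcity - 0.5, 0.0)  # price reacts to scarcity
    abatement = 0.002 * price                    # abatement effort
    growth = 0.03 - 0.001 * price                # high price damps growth
    output *= 1.0 + growth
    emissions = output * (1.0 - abatement)
    print(f"yr {year}: cap={cap:6.1f} emis={emissions:6.1f} "
          f"price={price:5.1f} output={output:6.1f}")
```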

  2. Analysis of Arbitrary Reflector Antennas Applying the Geometrical Theory of Diffraction Together with the Master Points Technique

    Directory of Open Access Journals (Sweden)

    María Jesús Algar

    2013-01-01

    Full Text Available An efficient approach for the analysis of arbitrarily fed, surface-conformed reflector antennas is presented. The near field at a large number of sampling points in the aperture of the reflector is obtained by applying the Geometrical Theory of Diffraction (GTD). A new technique named Master Points has been developed to reduce the complexity of the ray-tracing computations. The combination of GTD and Master Points reduces the time requirements of this kind of analysis. To validate the new approach, several reflectors, and the effects on the radiation pattern caused by shifting the feed and introducing different obstacles, have been considered for both simple and complex geometries. The results of these analyses have been compared with Method of Moments (MoM) results.

  3. Development of applied optical techniques

    International Nuclear Information System (INIS)

    The objective of this project is to improve laser application techniques in the nuclear industry. A small, light and portable laser-induced fluorometer was developed. It was designed to compensate for inner-filter and quenching effects by on-line data processing during the analysis of uranium in aqueous solution. A computer interface improves the accuracy and data processing capabilities of the instrument. Its detection limit is as low as 0.1 ppb of uranium, and it is ready for use in routine chemical analysis. Feasible applications, such as monitoring uranium levels in discards from a reconversion plant or fuel fabrication plant, were considered, requiring only minor modification of the instrument. It will also be used to study trace analysis of rare-earth elements. The IRMPD of CHF3 was carried out and the effects of buffer gases such as Ar, N2 and SF6 were investigated. The IRMPD rate increased with increasing pressure of the reactant and buffer gases, whereas the pressure effect of the reactant CHF3 below 0.1 Torr showed the opposite behaviour. It is considered that competition between the quenching effect and the rotational hole-filling effect during intermolecular collisions plays a major role in this low-pressure region. The applications of holography in nuclear fuel cycle facilities were surveyed and analyzed. Also, experimental apparatus such as an Ar ion laser, various kinds of holographic films and several optical components were prepared. (Author)

  4. Multivariate class modeling techniques applied to multielement analysis for the verification of the geographical origin of chili pepper.

    Science.gov (United States)

    Naccarato, Attilio; Furia, Emilia; Sindona, Giovanni; Tagarelli, Antonio

    2016-09-01

    Four class-modeling techniques (soft independent modeling of class analogy (SIMCA), unequal dispersed classes (UNEQ), potential functions (PF), and multivariate range modeling (MRM)) were applied to multielement distributions to build chemometric models able to authenticate chili pepper samples grown in Calabria with respect to those grown outside Calabria. The multivariate techniques were applied considering both all the variables (32 elements: Al, As, Ba, Ca, Cd, Ce, Co, Cr, Cs, Cu, Dy, Fe, Ga, La, Li, Mg, Mn, Na, Nd, Ni, Pb, Pr, Rb, Sc, Se, Sr, Tl, Tm, V, Y, Yb, Zn) and variables selected by means of stepwise linear discriminant analysis (S-LDA). In the first case, satisfactory and comparable results in terms of CV efficiency were obtained with SIMCA and MRM (82.3 and 83.2%, respectively), whereas MRM performed better than SIMCA in terms of forced model efficiency (96.5%). The selection of variables by S-LDA made it possible to build models characterized, in general, by higher efficiency. MRM again provided the best results for CV efficiency (87.7%, with an effective balance of sensitivity and specificity) as well as forced model efficiency (96.5%). PMID:27041319
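
    One of the four class-modeling ideas, SIMCA, reduces to a short sketch: fit a PCA model on the target class only and accept query samples whose residual distance to the model falls below a threshold. The percentile threshold and the synthetic data are simplifying assumptions.

```python
# Sketch: simplified SIMCA-style class modeling with a PCA residual distance.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
target = rng.normal(0.0, 1.0, (60, 32))   # target-class samples (stand-in)
query = rng.normal(0.5, 1.2, (10, 32))    # samples to verify

pca = PCA(n_components=3).fit(target)

def residual(X):
    # Distance between samples and their projection onto the class model.
    recon = pca.inverse_transform(pca.transform(X))
    return np.linalg.norm(X - recon, axis=1)

threshold = np.percentile(residual(target), 95)
print("accepted as target class:", residual(query) < threshold)
```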

  5. Development of applied optical techniques

    International Nuclear Information System (INIS)

    This report presents the status of research on the applications of lasers at KAERI. A compact portable laser fluorometer for detecting uranium dissolved in aqueous solution was built. The laser-induced fluorescence of uranium was detected with a photomultiplier tube; a delayed gate circuit and an integrating circuit were used to process the electrical signal, and a small nitrogen laser was used to excite the uranium. The detection limit is about 0.1 ppb. The effect of various acidic solutions was investigated, and the standard addition technique was incorporated to improve the measuring accuracy. This instrument can be used for the safety inspection of workers in nuclear fuel cycle facilities. (Author)
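
    The standard addition technique mentioned here is easy to sketch: spike the sample with known uranium additions, fit signal versus added amount, and recover the original concentration from the x-intercept. The numbers are illustrative.

```python
# Sketch: standard addition quantification of uranium fluorescence.
import numpy as np

added = np.array([0.0, 0.2, 0.4, 0.6])        # ppb uranium added
signal = np.array([1.05, 1.52, 2.01, 2.49])   # fluorescence (arb. units)

slope, intercept = np.polyfit(added, signal, 1)
c0 = intercept / slope            # |x-intercept| = original concentration
print(f"sample concentration ~ {c0:.2f} ppb")
```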

  6. Handbook of Applied Analysis

    CERN Document Server

    Papageorgiou, Nikolaos S

    2009-01-01

    Offers an examination of important theoretical methods and procedures in applied analysis. This book details the important theoretical trends in nonlinear analysis and applications to different fields. It is suitable for those working on nonlinear analysis.

  7. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the...

  8. Mass retrieval and a posteriori error analysis using non-linear inverse modelling techniques applied to atmospheric tracers

    International Nuclear Information System (INIS)

    Full text: An account of recent progress in the inverse modelling of pollutants with first-order chemistry (tracers, radionuclides, etc.) is given. The methods, which are variational and whose general principles have been presented at earlier EGU assemblies, are fundamentally nonlinear. They are meant to perform efficiently in the reconstruction of sources of accidental type (typically ETEX, Chernobyl). In this report, the emphasis is put on two recent developments. First, the problem of finding the total released mass through nonlinear methods is studied. Because the positivity of sources is taken into account, a non-zero prior mass scale appears in the background term. The cost function has a non-quadratic dependence on this parameter. This rules out many known parameter estimation techniques when one has to choose this parameter prior to the inversion. Second, an a posteriori analysis of the retrieved source and retrieved errors is conducted. It generalizes well-known data assimilation results that make use of the Hessian of the cost function. Closely related is the second-order sensitivity analysis of the source and retrieved errors, of special significance for inverse modelling. (author)
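
    A hedged sketch of variational source retrieval with positivity, in the spirit of the methods described: minimise a least-squares misfit plus a background term under non-negativity bounds. The source-receptor matrix H, the noise level and the prior mass scale are assumptions for illustration.

```python
# Sketch: variational retrieval of a non-negative source vector.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(8)
H = rng.random((40, 12))                 # source-receptor (Jacobian) matrix
sigma_true = np.zeros(12)
sigma_true[4] = 3.0                      # single accidental release
y = H @ sigma_true + 0.05 * rng.standard_normal(40)

m_prior = 1.0                            # prior mass scale (background term)

def cost(s):
    # Observation misfit + simple mass-penalising background term.
    return 0.5 * np.sum((H @ s - y) ** 2) / 0.05 ** 2 + np.sum(s) / m_prior

res = minimize(cost, np.full(12, 0.1), bounds=[(0.0, None)] * 12,
               method="L-BFGS-B")
print(np.round(res.x, 2))                # mass concentrated at source 4
```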

  9. New technique of in situ soil moisture sampling for environmental isotope analysis applied at 'Pilat-dune' near Bordeaux

    International Nuclear Information System (INIS)

    A new soil-air suction method with soil water vapour adsorption on a 4 Å molecular sieve provides soil moisture samples from various depths for environmental isotope analysis and yields soil temperature profiles. A field tritium tracer experiment shows that this in situ sampling method has an isotope profile resolution of only about 5-10 cm. Application of this method in the Pilat sand dune (Bordeaux, France) yielded deuterium and tritium profiles down to 25 meters depth. Bomb tritium measurements of monthly lysimeter percolate samples available since 1961 show that the tritium response has a mean delay of 5 months in the case of a sand lysimeter and of 2.5 years for a loess loam lysimeter. A simple HETP model simulates the layered downward movement of soil water and the longitudinal dispersion in the lysimeters. With field capacity and evapotranspiration taken as open parameters, the model yields tritium concentration values of the lysimeters' percolate which are in close agreement with the experimental results. Based on local meteorological data, the HETP model applied to tritium tracer experiments in the unsaturated zone further yields an individual prediction of the momentary tracer position and of the soil moisture distribution. This prediction can be checked experimentally at selected intervals by coring. (orig.)

  10. Functional reasoning, explanation and analysis: Part 1: a survey on theories, techniques and applied systems. Part 2: qualitative function formation technique

    International Nuclear Information System (INIS)

    Functional Reasoning (FR) enables people to derive the purpose of objects and explain their functions. JAERI's Human Acts Simulation Program (HASP), started in 1987, has the goal of developing the underlying technologies for intelligent robots by imitating the intelligent behavior of humans. FR is considered a useful reasoning method in HASP and is applied to understanding the function of tools and objects in the Toolbox Project. In this report, first, the results of the diverse FR researches within a variety of disciplines are reviewed and the common core and basic problems are identified. Then the qualitative function formation (QFF) technique is introduced. Some novel points are: extending common qualitative models to include interactions and the timing of events by defining temporal and dependency constraints, and binding this with conventional qualitative simulation. Function concepts are defined as interpretations of either a persistence or an order in the sequence of states, using the trace of the qualitative state vector derived by qualitative simulation on the extended qualitative model. This offers solutions to some of the FR problems and leads to a method for generalization and comparison of the functions of different objects. (author) 85 refs

  11. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  12. Computational optimization techniques applied to microgrids planning

    DEFF Research Database (Denmark)

    Gamarra, Carlos; Guerrero, Josep M.

    2015-01-01

    ... their planning process must be oriented towards economic feasibility, as a long-term stability guarantee. Planning a microgrid is a complex process due to existing alternatives, goals, constraints and uncertainties. Usually planning goals conflict with each other and, as a consequence, different optimization problems ... appear along the planning process. In this context, the technical literature on optimization techniques applied to microgrid planning has been reviewed, and guidelines for innovative planning methodologies focused on economic feasibility are defined. Finally, some trending techniques and new ... microgrid planning approaches are pointed out. ...

  13. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...

  14. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis, namely the Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  15. Statistical techniques applied to aerial radiometric surveys (STAARS): principal components analysis user's manual [NURE program]

    Energy Technology Data Exchange (ETDEWEB)

    Koch, C.D.; Pirkle, F.L.; Schmidt, J.S.

    1981-01-01

    A Principal Components Analysis (PCA) has been written to aid in the interpretation of multivariate aerial radiometric data collected by the US Department of Energy (DOE) under the National Uranium Resource Evaluation (NURE) program. The variations exhibited by these data have been reduced and classified into a number of linear combinations by using the PCA program. The PCA program then generates histograms and outlier maps of the individual variates. Black and white plots can be made on a Calcomp plotter by the application of follow-up programs. All programs referred to in this guide were written for a DEC-10. From this analysis a geologist may begin to interpret the data structure. Insight into geological processes underlying the data may be obtained.

  16. Gas chromatography/ion trap mass spectrometry applied for the analysis of triazine herbicides in environmental waters by an isotope dilution technique

    International Nuclear Information System (INIS)

    A gas chromatography/ion trap mass spectrometry method was developed for the analysis of simazine, atrazine and cyanazine, as well as degradation products of atrazine such as deethylatrazine and deisopropylatrazine, in environmental water samples. An isotope dilution technique was applied for the quantitative analysis of atrazine in water at low ng/l levels. One litre of water sample spiked with the stable-isotope internal standard atrazine-d5 was extracted with a C18 solid-phase extraction cartridge. The analysis was performed on an ion trap mass spectrometer operated in MS/MS mode. The extraction recoveries were in the range of 83-94% for the triazine herbicides in water at concentrations of 24, 200, and 1000 ng/l, while poor recoveries were obtained for the degradation products of atrazine. The relative standard deviations (R.S.D.) were within the range of 3.2-16.1%. The detection limits of the method were between 0.75 and 12 ng/l when 1 l of water was analyzed. The method was successfully applied to environmental water samples collected from a reservoir and a river in Hong Kong, in which atrazine was detected at concentrations between 3.4 and 26 ng/l.
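
    Isotope dilution quantification reduces to a ratio calculation, sketched below: the analyte amount follows from the analyte-to-internal-standard peak-area ratio and the known spike of the labelled standard. The unit response factor and all numbers are assumptions for illustration.

```python
# Sketch: isotope dilution quantification with a labelled internal standard.
spiked_is_ng = 50.0          # ng of atrazine-d5 added to 1 L of water
area_analyte = 1.8e5         # atrazine peak area (MS/MS transition)
area_is = 4.5e5              # atrazine-d5 peak area
response_factor = 1.0        # assumed equal response of analyte and IS

amount_ng = spiked_is_ng * (area_analyte / area_is) / response_factor
print(f"atrazine: {amount_ng:.1f} ng/L")   # 1 L sample -> ng per litre
```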

  17. Applying DEA Technique to Library Evaluation in Academic Research Libraries.

    Science.gov (United States)

    Shim, Wonsik

    2003-01-01

    This study applied an analytical technique called Data Envelopment Analysis (DEA) to calculate the relative technical efficiency of 95 academic research libraries, all members of the Association of Research Libraries. DEA, with the proper model of library inputs and outputs, can reveal best practices in the peer groups, as well as the technical…
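
    The DEA calculation can be sketched as the input-oriented CCR envelopment model, one linear program per library (DMU), solved here with scipy. The random inputs and outputs stand in for library statistics, and this is a generic CCR model, not necessarily the exact model of the study.

```python
# Sketch: input-oriented CCR Data Envelopment Analysis via linear programming.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(9)
X = rng.uniform(1, 10, (2, 8))   # 2 inputs  x 8 libraries (e.g. staff, budget)
Y = rng.uniform(1, 10, (3, 8))   # 3 outputs x 8 libraries (e.g. loans, degrees)

def ccr_efficiency(o):
    # Variables: [theta, lambda_1..lambda_n]; minimise theta.
    n = X.shape[1]
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.c_[-X[:, o], X]
    # Outputs: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.c_[np.zeros(Y.shape[0]), -Y]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[0]), -Y[:, o]],
                  bounds=[(0, None)] * (n + 1))
    return res.fun                     # theta = 1 means technically efficient

print([round(ccr_efficiency(o), 3) for o in range(8)])
```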

  18. Nuclear analytical techniques applied to forensic chemistry

    International Nuclear Information System (INIS)

    Gunshot residues produced by firing guns are mainly composed of visible particles. The individual characterization of these particles allows distinguishing those containing heavy metals, originating from gunshot residues, from those having a different origin or history. In this work, the results obtained from the study of gunshot residue particles collected from hands are presented. The aim of the analysis is to establish whether a person has fired a gun or has been in contact with one after a shot was produced. As reference samples, particles collected from the hands of persons engaged in different activities were studied for comparison. The complete study was based on the application of nuclear analytical techniques such as Scanning Electron Microscopy, Energy Dispersive X-Ray Electron Probe Microanalysis and Graphite Furnace Atomic Absorption Spectrometry. The assays can be completed within times compatible with forensic requirements. (author)

  19. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  1. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  2. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    Science.gov (United States)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fit to the observed samples. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation of the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the Conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modeling bivariate extreme values which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour intensity extreme precipitation event (or vice versa) can be twice as great as what would occur when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
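
    The peaks-over-threshold step described can be sketched with scipy: extract exceedances over a high threshold, fit a Generalized Pareto distribution, and read off a return level. The synthetic surge series and the threshold choice are assumptions.

```python
# Sketch: Partial Duration Series (peaks-over-threshold) with a GPD fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
surge = stats.genpareto.rvs(0.1, loc=0.0, scale=0.2,
                            size=5000, random_state=rng)  # stand-in data

threshold = np.quantile(surge, 0.95)
excess = surge[surge > threshold] - threshold

shape, loc, scale = stats.genpareto.fit(excess, floc=0.0)
# Level exceeded by 1 in 100 exceedances, above the threshold:
ret = threshold + stats.genpareto.ppf(1 - 1.0 / 100, shape, 0.0, scale)
print(f"shape={shape:.2f} scale={scale:.3f} return level={ret:.2f}")
```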

  3. Applying Mixed Methods Techniques in Strategic Planning

    Science.gov (United States)

    Voorhees, Richard A.

    2008-01-01

    In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…

  4. Eastside, Westside... An Exercise in Applying Document Analysis Techniques in Educational Evaluation. Research on Evaluation Program Paper and Report Series. No. 78. Interim Draft.

    Science.gov (United States)

    Garman, Keats

    This booklet is about document analysis and its utility as a method in education evaluation, and is intended for evaluators in local school districts, regional education agencies, and state departments of education. Document analysis is described as a technique that relies heavily upon a variety of written materials for data, insights, and…

  5. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they can be applied.

  6. Digital Speckle Technique Applied to Flow Visualization

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Digital speckle technique uses a laser, a CCD camera, and digital processing to generate interference fringes at the television framing rate. Its most obvious advantage is that neither darkroom facilities nor photographic wet chemical processing is required. In addition, it can be used in harsh engineering environments. This paper discusses the strengths and weaknesses of three digital speckle methodologies. (1) Digital speckle pattern interferometry (DSPI) uses an optical polarization phase shifter for visualization and measurement of the density field in a flow field. (2) Digital shearing speckle interferometry (DSSI) utilizes speckle-shearing interferometry in addition to optical polarization phase shifting. (3) Digital speckle photography (DSP) with computer reconstruction. The discussion describes the concepts, the principles and the experimental arrangements with some experimental results. The investigation shows that these three digital speckle techniques provide an excellent method for visualizing flow fields and for measuring density distributions in fluid mechanics and thermal flows.

  7. Applying Cooperative Techniques in Teaching Problem Solving

    Directory of Open Access Journals (Sweden)

    Krisztina Barczi

    2013-12-01

    Full Text Available Teaching how to solve problems – from solving simple equations to solving difficult competition tasks – has been one of the greatest challenges for mathematics education for many years. Trying to find an effective method is an important educational task. Among others, the question arises as to whether a method in which students help each other might be useful. The present article describes part of an experiment that was designed to determine the effects of cooperative teaching techniques on the development of problem-solving skills.

  8. Basic principles of applied nuclear techniques

    International Nuclear Information System (INIS)

    The technological applications of radioactive isotopes and radiation in South Africa have grown steadily since the first consignment of man-made radioisotopes reached this country in 1948. By the end of 1975 there were 412 authorised non-medical organisations (327 industries) using hundreds of sealed sources as well as their fair share of the thousands of radioisotope consignments, annually either imported or produced locally (mainly for medical purposes). Consequently, it is necessary for South African technologists to understand the principles of radioactivity in order to appreciate the industrial applications of nuclear techniques

  9. Technique applied in electrical power distribution for Satellite Launch Vehicle

    Directory of Open Access Journals (Sweden)

    João Maurício Rosário

    2010-09-01

    Full Text Available The Satellite Launch Vehicle electrical network, which is currently being developed in Brazil, is subdivided for analysis into the following parts: Service Electrical Network, Controlling Electrical Network, Safety Electrical Network and Telemetry Electrical Network. During the pre-launching and launching phases, these electrical networks are associated electrically and mechanically with the structure of the vehicle. In order to succeed in the integration of these electrical networks, it is necessary to employ techniques of electrical power distribution which are proper to Launch Vehicle systems. This work presents the most important techniques to be considered in the characterization of the electrical power supply applied to Launch Vehicle systems. Such techniques are primarily designed to allow the electrical networks, when subjected to a single-phase fault to ground, to keep supplying power to the loads.

  10. Applied techniques for mining natural proteasome inhibitors.

    Science.gov (United States)

    Stein, Martin L; Groll, Michael

    2014-01-01

    In eukaryotic cells, the ubiquitin-proteasome system (UPS) is responsible for the non-lysosomal degradation of proteins and plays a pivotal role in such vital processes as protein homeostasis, antigen processing and cell proliferation. Therefore, it is an attractive drug target with various applications in cancer and immunosuppressive therapies. The UPS being an evolutionarily well-conserved pathway, many pathogenic bacteria have developed small molecules which modulate the activity of their hosts' UPS components. Such natural products are, due to their stepwise optimization over the millennia, highly potent in terms of their binding mechanisms, their bioavailability and their selectivity. Generally, this makes bioactive natural products an ideal starting point for the development of novel drugs. Since four out of the ten best-selling drugs are natural product derivatives, research in this field is still of inestimable value for the pharmaceutical industry. Currently, the most prominent example of the successful exploitation of a natural compound in the UPS field is carfilzomib (Kyprolis®), which represents the second FDA-approved drug targeting the proteasome after the approval of the blockbuster bortezomib (Velcade®) in 2003. At the other end of the spectrum, ONX 0914, which is derived from the same natural product as carfilzomib, has been shown to selectively inhibit the immune-response-related branch of the pathway. To date, there exists a huge potential for UPS inhibitors with regard to many diseases. Both approved drugs against the proteasome show severe side effects, adaptive resistances and limited applicability; thus, the development of novel compounds with enhanced properties is a main objective of active research. In this review, we describe the techniques that can be utilized for the discovery of novel natural inhibitors which, in particular, block the 20S proteasomal activity. In addition, we will illustrate the successful implementation of a recently...

  11. Conversation Analysis in Applied Linguistics

    DEFF Research Database (Denmark)

    Kasper, Gabriele; Wagner, Johannes

    2014-01-01

    For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned...... been driven by applied work. After laying out CA's standard practices of data treatment and analysis, this article takes up the role of comparison as a fundamental analytical strategy and reviews recent developments into cross-linguistic and cross-cultural directions. The remaining article focuses...... on learning and development. In conclusion, we address some emerging themes in the relationship of CA and applied linguistics, including the role of multilingualism, standard social science methods as research objects, CA's potential for direct social intervention, and increasing efforts to complement CA...

  12. Applied analysis and differential equations

    CERN Document Server

    Cârjă, Ovidiu

    2007-01-01

    This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics of recent interest, including the most exciting developments: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.

  13. Analytical techniques applied to study cultural heritage objects

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N., E-mail: rizzutto@if.usp.br [Universidade de Sao Paulo (USP), SP (Brazil). Instituto de Fisica

    2015-07-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, more recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded the studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process and in the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. Expanding the possibilities of analysis further, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy that can be used for analysis in situ at the museums. The results of these analyses are providing valuable information on the manufacturing process and have provided new information on objects of different University of Sao Paulo museums. To improve the arsenal of cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external beam setup.

  14. Hyperspectral-imaging-based techniques applied to wheat kernels characterization

    Science.gov (United States)

    Serranti, Silvia; Cesare, Daniela; Bonifazi, Giuseppe

    2012-05-01

    Single kernels of durum wheat have been analyzed by hyperspectral imaging (HSI). Such an approach is based on the utilization of an integrated hardware and software architecture able to digitally capture and handle spectra as an image sequence, as they result along a pre-defined alignment on a suitably illuminated sample surface. The study was addressed to investigate the possibility of applying HSI techniques for classification of different types of wheat kernels: vitreous, yellow berry and fusarium-damaged. Reflectance spectra of selected wheat kernels of the three typologies were acquired by a laboratory device equipped with an HSI system working in the near-infrared range (1000-1700 nm). The hypercubes were analyzed applying principal component analysis (PCA) to reduce the high dimensionality of the data and to select some effective wavelengths. Partial least squares discriminant analysis (PLS-DA) was applied for classification of the three wheat typologies. The study demonstrated that good classification results were obtained not only considering the entire investigated wavelength range, but also selecting only four optimal wavelengths (1104, 1384, 1454 and 1650 nm) out of 121. The developed procedures based on HSI can be utilized for quality control purposes or for the definition of innovative sorting logics for wheat.
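
    A hedged sketch of the PCA-plus-PLS-DA pipeline described above, using synthetic three-class "spectra" in place of the real 121-band reflectance hypercube (all data, dimensions and parameter choices below are invented for illustration):

```python
# Illustrative PCA + PLS-DA classification of synthetic NIR "spectra".
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
wavelengths = np.linspace(1000, 1700, 121)        # nm, 121 bands as in the study
n = 50                                            # kernels per class (synthetic)

# Each class gets a broad absorption feature at a different wavelength.
X = np.vstack([np.exp(-((wavelengths - c) / 120.0) ** 2)
               + 0.05 * rng.standard_normal((n, 121))
               for c in (1104, 1384, 1650)])
y = np.repeat(np.arange(3), n)                    # vitreous / yellow berry / fusarium

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

pca = PCA(n_components=10).fit(X_tr)              # reduce hypercube dimensionality

# PLS-DA = PLS regression against one-hot class indicators, argmax to classify.
plsda = PLSRegression(n_components=4).fit(pca.transform(X_tr), np.eye(3)[y_tr])
y_hat = plsda.predict(pca.transform(X_te)).argmax(axis=1)
print("classification accuracy:", (y_hat == y_te).mean())
```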

  15. Tensometry technique for X-ray diffraction in applied analysis of welding; Tensometria por tecnica de difracao de raios X aplicada na analise de soldagens

    Energy Technology Data Exchange (ETDEWEB)

    Turibus, S.N.; Caldas, F.C.M.; Miranda, D.M.; Monine, V.I.; Assis, J.T., E-mail: snturibus@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (IPRJ/UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico

    2010-07-01

    This paper presents the analysis of residual stress introduced by the welding process. As stress in a material can induce damage, a method is needed to identify the residual stress state. For this purpose the non-destructive X-ray diffraction technique was used to analyze two plates of A36 steel joined by metal inert gas (MIG) welding. The stress measurements were made by the sin²ψ method in the weld region of the steel plates, including analysis of longitudinal and transverse residual stresses in the fusion zone, heat affected zone (HAZ) and base metal. To determine the stress distribution along the depth of the welded material, surface layers were successively removed by electropolishing. (author)
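
    The sin²ψ evaluation itself reduces to a linear fit of lattice strain against sin²ψ, with the stress proportional to the slope. A minimal sketch, assuming generic elastic constants for A36 steel and invented d-spacing readings:

```python
# Residual stress from the slope of lattice strain vs. sin^2(psi) (illustrative).
import numpy as np

E, nu = 210e9, 0.30                      # assumed elastic constants of A36 steel (Pa)
d0 = 1.1702                              # assumed unstrained d-spacing (angstrom)
psi = np.radians([0, 15, 25, 35, 45])    # tilt angles
d_psi = np.array([1.17025, 1.17032, 1.17041, 1.17052, 1.17064])  # invented readings

strain = (d_psi - d0) / d0
slope, _ = np.polyfit(np.sin(psi) ** 2, strain, 1)

sigma = E / (1.0 + nu) * slope           # biaxial-stress approximation
print(f"residual stress: {sigma / 1e6:.0f} MPa (positive = tensile)")
```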

  16. Radiation measurement and inverse analysis techniques applied on the determination of the apparent mass diffusion coefficient for diverse contaminants and soil samples

    International Nuclear Information System (INIS)

    Full text of publication follows: The study of the dispersion of radioactive materials in soils and in engineering barriers plays an important role in the safety analysis of nuclear waste repositories. In order to proceed with such a study, the physical properties involved must be determined with precision, including the apparent mass diffusion coefficient, which is defined as the ratio between the effective mass diffusion coefficient and the retardation factor. Many different experimental and estimation techniques are available in the literature for the identification of the diffusion coefficient, and this work describes the implementation of the one developed by Pereira et al [1]. This technique is based on non-intrusive radiation measurements, and the experimental setup consists of a cylindrical column filled with compacted media saturated with water. A radioactive contaminant is mixed with a portion of the media and then placed at the bottom of the column. The contaminant therefore diffuses through the uncontaminated media due to the concentration gradient. A radiation detector is used to measure the number of counts, which is associated with the contaminant concentration, at several positions along the column during the experiment. Such measurements are then used to estimate the apparent diffusion coefficient of the contaminant in the porous media by inverse analysis. The inverse problem of parameter estimation is solved with the Levenberg-Marquardt method of minimization of the least-squares norm. The experiment was optimized with respect to the number of measurement locations, frequency of measurements and duration of the experiment through the analysis of the sensitivity coefficients and by using a D-optimum approach. This setup is suitable for studying a great number of combinations of diverse contaminants and porous media varying in composition and compaction, with considerable ease and reliable results, and it was chosen because that is the…
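
    The inverse step can be sketched as a Levenberg-Marquardt fit of a diffusion model to the detector counts; the plane-source solution, the fixed measurement time and all numbers below are illustrative assumptions, not the authors' exact formulation:

```python
# Estimate an apparent diffusion coefficient D by Levenberg-Marquardt fitting.
import numpy as np
from scipy.optimize import curve_fit

def counts_model(x, D, A):
    """Counts ~ concentration of an instantaneous plane source at x = 0."""
    t = 30 * 86400.0                     # assumed elapsed time: 30 days [s]
    return A / np.sqrt(4 * np.pi * D * t) * np.exp(-x**2 / (4 * D * t))

x = np.linspace(0.0, 0.10, 11)           # detector positions along the column [m]
rng = np.random.default_rng(1)
y = counts_model(x, 2e-10, 5e4) * (1 + 0.03 * rng.standard_normal(x.size))

(D_hat, A_hat), _ = curve_fit(counts_model, x, y, p0=[1e-10, 1e4], method="lm")
print(f"apparent diffusion coefficient: {D_hat:.2e} m^2/s")
```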

  17. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme The situation analysis, as a separate component of the strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand and also on the organization’s resources and capabilities on the other. Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review The marketing environment consists of two distinct components, the internal environment that is made from specific variables within the organization and the external environment that is made from variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results The special literature emphasizes that the differences in performance from one organization to another is primarily dependant not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of the organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications Basically such

  18. Conversation Analysis and Applied Linguistics.

    Science.gov (United States)

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers bibliographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  19. Sneak analysis applied to process systems

    Science.gov (United States)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  20. Sensor Data Qualification Technique Applied to Gas Turbine Engines

    Science.gov (United States)

    Csank, Jeffrey T.; Simon, Donald L.

    2013-01-01

    This paper applies a previously developed sensor data qualification technique to a commercial aircraft engine simulation known as the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k). The sensor data qualification technique is designed to detect, isolate, and accommodate faulty sensor measurements. It features sensor networks, which group various sensors together and relies on an empirically derived analytical model to relate the sensor measurements. Relationships between all member sensors of the network are analyzed to detect and isolate any faulty sensor within the network.
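
    A toy version of such a sensor network check, with an empirically fitted linear model per member sensor and synthetic data standing in for C-MAPSS40k outputs, might look like:

```python
# Illustrative sketch: group redundant sensors, fit an empirical model relating
# each member to the rest, and attribute a fault to the sensor with the largest
# out-of-limit residual. Not the paper's implementation; data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
truth = rng.uniform(0.5, 1.0, size=500)               # latent engine quantity
gains = np.array([1.0, 0.8, 1.2, 0.5])                # assumed member sensor gains
train = truth[:, None] * gains + 0.01 * rng.standard_normal((500, 4))

# Empirical analytical model per sensor: linear fit vs. mean of the others.
models, limits = [], []
for i in range(4):
    ref = np.delete(train, i, axis=1).mean(axis=1)
    coef = np.polyfit(ref, train[:, i], 1)
    resid = train[:, i] - np.polyval(coef, ref)
    models.append(coef)
    limits.append(4.0 * resid.std())                  # 4-sigma qualification limit

def qualify(sample):
    """Detect and isolate: index of the worst out-of-limit sensor, else None."""
    scores = [abs(sample[i] - np.polyval(models[i], np.delete(sample, i).mean()))
              / limits[i] for i in range(4)]
    worst = int(np.argmax(scores))
    return worst if scores[worst] > 1.0 else None

sample = 0.8 * gains
sample[2] += 0.3                                      # inject a sensor fault
print("isolated faulty sensor:", qualify(sample))     # expected: 2
```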

  1. Manifold learning techniques and model reduction applied to dissipative PDEs

    OpenAIRE

    Sonday, Benjamin E.; Singer, Amit; Gear, C. William; Kevrekidis, Ioannis G.

    2010-01-01

    We link nonlinear manifold learning techniques for data analysis/compression with model reduction techniques for evolution equations with time scale separation. In particular, we demonstrate a "nonlinear extension" of the POD-Galerkin approach to obtaining reduced dynamic models of dissipative evolution equations. The approach is illustrated through a reaction-diffusion PDE, and the performance of different simulators on the full and the reduced models is compared. We also discuss the relati...
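
    The linear POD baseline that this "nonlinear extension" builds on can be sketched by collecting snapshots of a toy 1-D reaction-diffusion problem and extracting a reduced basis with the SVD (all discretization choices and tolerances below are illustrative):

```python
# Snapshot POD for a toy dissipative PDE: u_t = u_xx + u(1 - u).
import numpy as np

nx, nt, dx, dt = 100, 2000, 0.01, 2e-5       # explicit Euler, dt/dx^2 = 0.2 (stable)
x = np.linspace(0, 1, nx)
u = np.exp(-100 * (x - 0.5) ** 2)            # initial bump
snapshots = []
for k in range(nt):
    lap = np.roll(u, 1) - 2 * u + np.roll(u, -1)      # periodic Laplacian
    u = u + dt * (lap / dx**2 + u * (1 - u))
    if k % 50 == 0:
        snapshots.append(u.copy())

S = np.array(snapshots).T                    # (nx, n_snapshots) snapshot matrix
U, sv, _ = np.linalg.svd(S - S.mean(axis=1, keepdims=True), full_matrices=False)
energy = np.cumsum(sv**2) / np.sum(sv**2)
r = int(np.searchsorted(energy, 0.999)) + 1  # modes capturing 99.9% of energy
print(f"POD modes needed: {r}")              # time-scale separation => small r
```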

  2. Evaluation via multivariate techniques of scale factor variability in the rietveld method applied to quantitative phase analysis with X ray powder diffraction

    Directory of Open Access Journals (Sweden)

    Terezinha Ferreira de Oliveira

    2006-12-01

    Full Text Available The present work uses multivariate statistical analysis as a means of establishing the main sources of error in Quantitative Phase Analysis (QPA) using the Rietveld method. The quantitative determination of crystalline phases using X-ray powder diffraction is a complex measurement process whose results are influenced by several factors. Ternary mixtures of Al2O3, MgO and NiO were prepared under controlled conditions and the diffraction patterns were obtained using the Bragg-Brentano geometric arrangement. It was possible to establish four sources of critical variation: the experimental absorption and the scale factor of NiO, which is the phase with the greatest linear absorption coefficient of the ternary mixture; the instrumental characteristics, represented by mechanical errors of the goniometer and sample displacement; the other two phases (Al2O3 and MgO); and the temperature and relative humidity of the air in the laboratory. These error sources severely impair QPA with the Rietveld method, so it becomes necessary to control them during the measurement procedure.

  3. Volcanic Monitoring Techniques Applied to Controlled Fragmentation Experiments

    Science.gov (United States)

    Kueppers, U.; Alatorre-Ibarguengoitia, M. A.; Hort, M. K.; Kremers, S.; Meier, K.; Scharff, L.; Scheu, B.; Taddeucci, J.; Dingwell, D. B.

    2010-12-01

    …ejection, and that the evaluated results were mostly in good agreement. We will discuss the technical difficulties encountered, e.g. the temporal synchronisation of the different techniques. Furthermore, the internal data management of the DR currently prevents continuous recording, and only a limited number of snapshots is stored. Nonetheless, in at least three experiments the onset of particle ejection was measured by all the different techniques, which gave coherent results of up to 100 m/s. This is a very encouraging result and of paramount importance, as it proves the applicability of these independent methods to volcano monitoring. Each method by itself may enhance our understanding of the pressurisation state of a volcano, an essential factor in ballistic hazard evaluation and eruption energy estimation. Technical adaptations of the DR will overcome the encountered problems and allow a more refined data analysis during the next campaign.

  4. Applying Supervised Opinion Mining Techniques on Online User Reviews

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2012-01-01

    Full Text Available In recent years, the spectacular development of web technologies led to an enormous quantity of user-generated information in online systems. This large amount of information on web platforms makes them viable for use as data sources in applications based on opinion mining and sentiment analysis. The paper proposes an algorithm for detecting sentiments in movie user reviews, based on a naive Bayes classifier. We analyze the opinion mining domain, the techniques used in sentiment analysis and its applicability. We implemented the proposed algorithm, tested its performance, and suggested directions of development.
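
    A minimal naive Bayes sentiment classifier in the spirit of the proposed algorithm, sketched here with scikit-learn on a toy review set rather than the authors' own implementation and corpus:

```python
# Toy naive Bayes sentiment classification of movie reviews.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["a wonderful, moving film", "brilliant acting and great script",
           "dull plot and terrible pacing", "a boring waste of two hours"]
labels = ["pos", "pos", "neg", "neg"]

# Bag-of-words features feeding a multinomial naive Bayes classifier.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(reviews, labels)
print(model.predict(["great film, wonderful script", "terrible and boring"]))
# expected: ['pos' 'neg']
```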

  5. DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Food scientists use standards and calibrations to relate the concentration of a compound of interest to the instrumental response. The techniques used include classical, single point, and inverse calibrations, as well as standard addition and internal standards. Several fundamental criteria -- sel...
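
    For instance, a classical calibration amounts to fitting the instrument response against standard concentrations and inverting the fitted line for an unknown sample; the numbers below are invented for illustration:

```python
# Classical calibration: fit response vs. concentration, invert for an unknown.
import numpy as np

conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])      # standards (mg/L)
resp = np.array([0.02, 0.21, 0.39, 0.98, 1.95])  # instrument response (AU)

slope, intercept = np.polyfit(conc, resp, 1)     # ordinary least squares line

unknown_resp = 0.55                              # reading for the unknown sample
unknown_conc = (unknown_resp - intercept) / slope
print(f"estimated concentration: {unknown_conc:.2f} mg/L")   # ~2.8 mg/L
```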

  6. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    Full Text Available What is the usefulness of critical appraisal of literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  7. Three-dimensional integrated CAE system applying computer graphic technique

    International Nuclear Information System (INIS)

    A three-dimensional CAE system for nuclear power plant design is presented. This system utilizes high-speed computer graphic techniques for the plant design review, and an integrated engineering database for handling the large amount of nuclear power plant engineering data in a unified data format. Applying this system makes it possible to construct a nuclear power plant using only computer data from the basic design phase to the manufacturing phase, and it increases the productivity and reliability of the nuclear power plants. (author)

  8. Essentials of applied dynamic analysis

    CERN Document Server

    Jia, Junbo

    2014-01-01

    This book presents up-to-date knowledge of dynamic analysis in the engineering world. To facilitate the understanding of the topics by readers with various backgrounds, general principles are linked to their applications from different angles. Topics of special interest, such as statistics of motions and loading, damping modeling and measurement, nonlinear dynamics, fatigue assessment, vibration and buckling under axial loading, structural health monitoring, human body vibrations, and vehicle-structure interactions, are also presented. The target readers include industry professionals in civil, marine and mechanical engineering, as well as researchers and students in this area.

  9. CONSUMER BEHAVIOR ANALYSIS BY GRAPH MINING TECHNIQUE

    OpenAIRE

    KATSUTOSHI YADA; HIROSHI MOTODA; TAKASHI WASHIO; ASUKA MIYAWAKI

    2006-01-01

    In this paper, we discuss how a graph mining system is applied to sales transaction data so as to understand consumer behavior. First, existing research on consumer behavior analysis for sequential purchase patterns is reviewed. Then we propose to represent the complicated customer purchase behavior by a directed graph retaining temporal information in a purchase sequence and apply a graph mining technique to analyze the frequently occurring patterns. In this paper, we demonstrate through the case...

  10. Data Mining E-protokol - Applying data mining techniques on student absence

    OpenAIRE

    Shrestha, Amardip; Bro Lilleås, Lauge; Hansen, Asbjørn

    2014-01-01

    The scope of this project is to explore the possibilities of applying data mining techniques for discovering new knowledge about student absenteeism in primary school. The research consists of analyzing a large dataset collected through the digital protocol system E-protokol. The data mining techniques used for the analysis involve clustering, classification and association rule mining, utilized through the machine learning toolset WEKA. The findings include a number of suggestions ...

  11. Neutrongraphy technique applied to the narcotics and terrorism enforcement

    International Nuclear Information System (INIS)

    Among the several methods of non-destructive assay that may be used for the detection of both drugs and explosives, those that utilize nuclear techniques have demonstrated the essential qualities for an efficient detection system. These techniques allow the inspection of a large quantity of samples quickly, sensitively, specifically and with automatic decision, since they utilize radiation with great penetrating power. This work aims to show the potential of neutron radiography and computed tomography for the detection of drugs and explosives even when they are concealed by heavy materials. In the radiographic assays with thermal neutrons, samples of powdered cocaine and explosives were inspected, some concealed by several materials. The samples were irradiated for 30 minutes in the J-9 channel of the Argonauta research reactor of the IEN/CNEN in a neutron flux of 2.5 × 10⁵ n/cm²·s. Two sheets of gadolinium converter, each 25 μm thick, and a Kodak Industrex A5 photographic plate were used. A comparative analysis among experimental and simulated tomographic images obtained with X-rays, fast neutrons and thermal neutrons is presented. Thermal neutron tomography demonstrated the best performance. (author)

  12. Free Radical Imaging Techniques Applied to Hydrocarbon Flames Diagnosis

    Institute of Scientific and Technical Information of China (English)

    A. Caldeira-Pires

    2001-01-01

    This paper evaluates the utilization of free radical chemiluminescence imaging and tomographic reconstruction techniques to obtain advanced information on reacting flows. Two different laboratory flow configurations were analyzed, including unconfined non-premixed jet flame measurements to evaluate flame fuel/air mixing patterns at the burner port of a typical glass-furnace burner. The second case characterized the reaction zone of premixed flames within gas turbine combustion chambers, based on a laboratory-scale model of a lean prevaporized premixed (LPP) combustion chamber. The analysis shows that advanced imaging diagnosis can provide new information for the characterization of flame mixing and reacting phenomena. The utilization of local C2 and CH chemiluminescence can provide useful information on the quality of the combustion process, which can be used to improve the design of practical combustors.

  13. Técnicas de mineração visual de dados aplicadas aos dados de instrumentação da barragem de Itaipu Visual data mining techniques applied for the analysis of data collected at Itaipu power plant

    Directory of Open Access Journals (Sweden)

    Marco Aurélio Silva Neto

    2010-12-01

    Information may be more easily extracted when different techniques of Visualization of Information, together with techniques of Data Mining, are applied for data analysis. The visual analysis of the data has proved efficient in detecting patterns of anomalies, and thus it can be considered a valuable tool to support decision making.

  14. ESR dating technique applied to Pleistocene Corals (Barbados Island)

    International Nuclear Information System (INIS)

    In this work we applied the ESR (Electron Spin Resonance) dating technique to a coral from Barbados island. After a preliminary purification treatment, coral samples were milled and separated into different granulometry groups. Powder samples with granulometry between 125 μm-250 μm and 250 μm-500 μm were irradiated at the Calliope ⁶⁰Co radioisotope source (R.C. ENEA-Casaccia) at doses between 10-3300 Gy and their radiation-induced ESR signals were measured by a Bruker EMS104 spectrometer. The signal/noise ratio turned out to be highest for the granulometry between 250 μm-500 μm, and consequently the paleo-curve was constructed using the ESR signals for this granulometry. The paleo-curve was fitted with the exponential growth function y = a − b·e^(−cx), which describes the behaviour of the curve well, including in the saturation region. Extrapolating the paleo-dose and knowing the annual dose (999±79 μGy/y), we calculated a coral age of 156±12 ky, which is in good agreement with results obtained on corals from the same region by other authors.
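
    The dating arithmetic can be sketched as fitting the growth function to additive-dose data, extrapolating to zero signal for the paleo-dose, and dividing by the annual dose; the intensities below are invented, chosen only to land in the reported age range:

```python
# ESR dating sketch: fit y = a - b*exp(-c*x), extrapolate, divide by annual dose.
import numpy as np
from scipy.optimize import curve_fit

def growth(x, a, b, c):
    return a - b * np.exp(-c * x)

dose = np.array([0, 10, 50, 150, 400, 900, 1800, 3300])        # added dose (Gy)
signal = np.array([0.14, 0.15, 0.18, 0.26, 0.44, 0.67, 0.88, 0.97])  # invented

(a, b, c), _ = curve_fit(growth, dose, signal, p0=[1.0, 0.9, 1e-3])

paleo_dose = np.log(a / b) / c            # dose axis intercept where y = 0
annual_dose = 999e-6                      # Gy per year, from the study
print(f"paleo-dose ~{paleo_dose:.0f} Gy, age ~{paleo_dose / annual_dose / 1e3:.0f} ky")
```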

  15. Innovative Visualization Techniques applied to a Flood Scenario

    Science.gov (United States)

    Falcão, António; Ho, Quan; Lopes, Pedro; Malamud, Bruce D.; Ribeiro, Rita; Jern, Mikael

    2013-04-01

    The large and ever-increasing amounts of multi-dimensional, time-varying and geospatial digital information from multiple sources represent a major challenge for today's analysts. We present a set of visualization techniques that can be used for the interactive analysis of geo-referenced and time-sampled data sets, providing an integrated mechanism that aids the user to collaboratively explore, present and communicate visually complex and dynamic data. Here we present these concepts in the context of a 4-hour flood scenario from Lisbon in 2010, with data that include measures of water column (flood height) every 10 minutes at a 4.5 m x 4.5 m resolution, topography, building damage, building information, and online base maps. Techniques we use include web-based linked views, multiple charts, map layers and storytelling. We explain in more detail two of these that are not currently in common use for visualization of data: storytelling and web-based linked views. Visual storytelling is a method for providing a guided but interactive process of visualizing data, allowing more engaging data exploration through interactive web-enabled visualizations. Within storytelling, a snapshot mechanism helps the author of a story to highlight data views of particular interest and subsequently share or guide others within the data analysis process. This allows a particular person to select relevant attributes for a snapshot, such as highlighted regions for comparisons, time step, class values for colour legend, etc., and provide a snapshot of the current application state, which can then be provided as a hyperlink and recreated by someone else. Since data can be embedded within this snapshot, it is possible to interactively visualize and manipulate it. The second technique, web-based linked views, includes multiple windows which interactively respond to the user selections, so that when selecting an object and changing it in one window, it will automatically update in all the other…

  16. Concept analysis of culture applied to nursing.

    Science.gov (United States)

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  17. Digital Fourier analysis advanced techniques

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to advanced digital Fourier analysis for advanced undergraduate and graduate students. Assuming knowledge of the Fast Fourier Transform, this book covers advanced topics including the Hilbert transform, cepstrum analysis, and the two-dimensional Fourier transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Advanced Techniques" includes practice problems and thorough Appendices. As a central feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. The applet source code in Visual Basic is provided online, enabling advanced students to tweak and change the programs for more sophisticated results. A complete, intuitive guide, "Digital Fourier Analysis - Advanced Techniques" is an essential reference for students in science and engineering.

  18. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  19. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    Science.gov (United States)

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

    Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets, with software such as ESRI's ArcPad, to produce maps and figures for a final analysis and report. Hand-written field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aiding experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this we created the GeoField ontology service using the Web Ontology Language (OWL) and Protégé software. Advanced queries on the dataset can be made using reasoning capabilities that go beyond a standard database service. These improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at the University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologists can check their data against the site in real-time and improve collaboration with another person, as both parties have the same interactive view of the data.

  20. Remote sensing techniques applied to seismic vulnerability assessment

    Science.gov (United States)

    Juan Arranz, Jose; Torres, Yolanda; Hahgi, Azade; Gaspar-Escribano, Jorge

    2016-04-01

    Advances in remote sensing and photogrammetry techniques have increased the degree of accuracy and resolution in the record of the earth's surface. This has expanded the range of possible applications of these data. In this research, we have used these data to document the construction characteristics of the urban environment of Lorca, Spain. An exposure database has been created with the gathered information to be used in seismic vulnerability assessment. To this end, we have used data from photogrammetric flights at different periods, using both orthorectified images in the visible and infrared spectrum. Furthermore, the analysis is completed using LiDAR data. From the combination of these data, it has been possible to delineate the building footprints and characterize the constructions with attributes such as the approximate date of construction, area, type of roof and even building materials. To carry out the calculation, we have developed different algorithms to compare images from different times, segment images, classify LiDAR data, and use the infrared data in order to remove vegetation or to compute roof surfaces with height value, tilt and spectral fingerprint. In addition, the accuracy of our results has been validated with ground truth data. Keywords: LiDAR, remote sensing, seismic vulnerability, Lorca

  1. Digital prototyping technique applied for redesigning plastic products

    Science.gov (United States)

    Pop, A.; Andrei, A.

    2015-11-01

    After products are on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated products. Redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology, like Digital Prototyping, in industry is an effective way to produce new products. This paper tries to demonstrate and highlight the effectiveness of the concept of Digital Prototyping, both to reduce the design time of a new product and to reduce the costs required for implementing this step. The results of this paper show that using Digital Prototyping techniques to design a new product, starting from the mould of an existing one available on the market, offers a significant reduction in manufacturing time and cost. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (designing of the 3D model, simulation of the structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be carried out by one skilled engineer quickly and effectively.

  2. Multicriterial Evaluation of Applying Japanese Management Concepts, Methods and Techniques

    OpenAIRE

    Podobiński, Mateusz

    2014-01-01

    Japanese management concepts, methods and techniques refer to work organization and improvements to companies’ functioning. They appear in numerous Polish companies, especially in the manufacturing ones. Cultural differences are a major impediment in their implementation. Nevertheless, the advantages of using Japanese management concepts, methods and techniques motivate the management to implement them in the company. The author shows research results, which refer to advanta...

  3. Photoacoustic technique applied to the study of skin and leather

    Science.gov (United States)

    Vargas, M.; Varela, J.; Hernández, L.; González, A.

    1998-08-01

    In this paper the photoacoustic technique is used in bull skin for the determination of thermal and optical properties as a function of the tanning process steps. Our results show that the photoacoustic technique is sensitive to the study of physical changes in this kind of material due to the tanning process.

  4. Difficulties applying recent blind source separation techniques to EEG and MEG

    CERN Document Server

    Knuth, Kevin H

    2015-01-01

    High temporal resolution measurements of human brain activity can be performed by recording the electric potentials on the scalp surface (electroencephalography, EEG), or by recording the magnetic fields near the surface of the head (magnetoencephalography, MEG). The analysis of the data is problematic due to the fact that multiple neural generators may be simultaneously active and the potentials and magnetic fields from these sources are superimposed on the detectors. It is highly desirable to un-mix the data into signals representing the behaviors of the original individual generators. This general problem is called blind source separation and several recent techniques utilizing maximum entropy, minimum mutual information, and maximum likelihood estimation have been applied. These techniques have had much success in separating signals such as natural sounds or speech, but appear to be ineffective when applied to EEG or MEG signals. Many of these techniques implicitly assume that the source distributions hav...
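
    A generic blind source separation setup of the kind discussed can be sketched with FastICA on synthetic mixtures; real EEG/MEG recordings would replace the toy mixtures below and, as the authors note, are much harder to separate:

```python
# Toy blind source separation: un-mix synthetic signals with FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(7)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(2 * np.pi * 1.0 * t),           # rhythmic generator
                np.sign(np.sin(2 * np.pi * 0.3 * t)),   # slow switching source
                rng.laplace(size=t.size)]               # noise-like source
A = rng.uniform(0.5, 1.5, (3, 3))                       # unknown mixing to "sensors"
X = sources @ A.T                                       # superimposed recordings

unmixed = FastICA(n_components=3, random_state=0).fit_transform(X)

# Compare recovered components to true sources (up to order, sign and scale).
corr = np.corrcoef(unmixed.T, sources.T)[:3, 3:]
print(np.round(np.abs(corr).max(axis=1), 2))            # each near 1.0 => recovered
```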

  5. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone...... and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques....

  6. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    Science.gov (United States)

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  7. Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  8. Caldwell University's Department of Applied Behavior Analysis.

    Science.gov (United States)

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis. PMID:27606194

  9. Metamodeling Techniques Applied to the Design of Reconfigurable Control Applications

    Directory of Open Access Journals (Sweden)

    Fogliazza Giuseppe

    2008-01-01

    Full Text Available In order to realize autonomous manufacturing systems in environments characterized by high dynamics and high complexity of task, it is necessary to improve the control system modelling and performance. This requires the use of better and reusable abstractions. In this paper, we explore the metamodel techniques as a foundation to the solution of this problem. The increasing popularity of model-driven approaches and a new generation of tools to support metamodel techniques are changing software engineering landscape, boosting the adoption of new methodologies for control application development.

  10. Metamodeling Techniques Applied to the Design of Reconfigurable Control Applications

    Directory of Open Access Journals (Sweden)

    Luca Ferrarini

    2008-02-01

    Full Text Available In order to realize autonomous manufacturing systems in environments characterized by high dynamics and high complexity of task, it is necessary to improve the control system modelling and performance. This requires the use of better and reusable abstractions. In this paper, we explore the metamodel techniques as a foundation to the solution of this problem. The increasing popularity of model-driven approaches and a new generation of tools to support metamodel techniques are changing software engineering landscape, boosting the adoption of new methodologies for control application development.

  11. Diagnostic techniques applied in geostatistics for agricultural data analysis Técnicas de diagnóstico utilizadas em geoestatística para análise de dados agrícolas

    Directory of Open Access Journals (Sweden)

    Joelmir André Borssoi

    2009-12-01

    Full Text Available The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool to determine the parameters that define this structure, which are applied in the interpolation of values at unsampled points by kriging techniques. However, parameter estimation can be greatly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques in Gaussian spatial linear models in geostatistics to evaluate the sensitivity of maximum likelihood and restricted maximum likelihood estimators to small perturbations in these data. For this purpose, studies with simulated and experimental data were conducted. Results with simulated data showed that the diagnostic techniques were efficient in identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may have a strong influence on thematic maps, thus changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information from thematic maps.

  12. Flash radiographic technique applied to fuel injector sprays

    International Nuclear Information System (INIS)

    A flash radiographic technique, using 50 ns exposure times, was used to study the pattern and density distribution of a fuel injector spray. The experimental apparatus and method are described. An 85 kVp flash x-ray generator, designed and fabricated at the Lawrence Livermore Laboratory, is utilized. Radiographic images, recorded on standard x-ray films, are digitized and computer processed

  13. Magnetic Solid Phase Extraction Applied to Food Analysis

    Directory of Open Access Journals (Sweden)

    Israel S. Ibarra

    2015-01-01

    Full Text Available Magnetic solid phase extraction has been used as a pretreatment technique for the analysis of several compounds because of its advantages compared with classical methods. This methodology is based on the use of magnetic solids as adsorbents for the preconcentration of different analytes from complex matrices. Magnetic solid phase extraction minimizes the use of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique as applied in food analysis.

  14. Applying Website Usability Testing Techniques to Promote E-services

    OpenAIRE

    Abdel Nasser H. Zaied; Hassan, Mohamed M.; Islam S. Mohamed

    2015-01-01

    In this competitive world, websites are considered to be a key aspect of any organization’s competitiveness. In addition to visual esthetics, usability of a website is a strong determinant for user’s satisfaction and pleasure. However, lack of appropriate techniques and attributes for measuring usability may constrain the usefulness of a website. To address this issue, we conduct a statistical study to evaluate the usability levels of e-learning and e-training websites based on human (user) p...

  15. Signal Processing Techniques Applied in RFI Mitigation of Radio Astronomy

    OpenAIRE

    Sixiu Wang; Zhengwen Sun; Weixia Wang; Liangquan Jia

    2012-01-01

    Radio broadcast and telecommunications are present at different power levels everywhere on Earth. Radio Frequency Interference (RFI) substantially limits the sensitivity of existing radio telescopes in several frequency bands and may prove to be an even greater obstacle for the next generation of telescopes (or arrays) to overcome. A variety of RFI detection and mitigation techniques have been developed in recent years. This study describes various signal processing methods of RFI mitigation in radi...

  16. Applying a Splitting Technique to Estimate Electrical Grid Reliability

    OpenAIRE

    Wadman, Wander; Crommelin, Daan; Frank, Jason; Pasupathy, R.; Kim, S.-H.; Tolk, A.; Hill, R; Kuhl, M.E.

    2013-01-01

    As intermittent renewable energy penetrates electrical power grids more and more, assessing grid reliability is of increasing concern for grid operators. Monte Carlo simulation is a robust and popular technique to estimate indices for grid reliability, but the involved computational intensity may be too high for typical reliability analyses. We show that various reliability indices can be expressed as expectations depending on the rare event probability of a so-called power curtailment, and e...

  17. Technology Assessment of Dust Suppression Techniques Applied During Structural Demolition

    Energy Technology Data Exchange (ETDEWEB)

    Boudreaux, J.F.; Ebadian, M.A.; Williams, P.T.; Dua, S.K.

    1998-10-20

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure properly and, at the same time, minimize the amount of dust generated from a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology given site-specific conditions. Thus, the purpose of this research, which was carried out at the Hemispheric Center for Environmental Technology (HCET) at Florida International University, was to conduct an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear D and D. This experimental study targeted the problem of dust suppression during the demolition of nuclear facilities. The resulting data were employed to assist in the development of mathematical correlations that can be applied to predict dust generation during structural demolition.

  18. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for analysis results needs to be decreased to have an impact on overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques, which come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  19. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics. PMID:26352604

  20. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  1. Applied Data Analysis in Energy Monitoring System

    Directory of Open Access Journals (Sweden)

    Kychkin А.V.

    2016-08-01

    Full Text Available A software and hardware system organization is presented as an example for building energy monitoring of multi-sectional lighting and climate control / conditioning needs. The system's key feature is applied analysis of office energy data, which allows the localized work mode of each type of hardware to be recognized. It is based on the general energy consumption profile, followed by evaluation of energy consumption and workload. The applied data analysis includes a primary data processing block, a smoothing filter, a time stamp identification block, clusterization and classification blocks, a state change detection block, and a statistical data calculation block. The energy consumed in a time slot and the slot time stamp are taken as the main parameters for work mode classification. Experimental results of applied energy data analysis using HIL and the OpenJEVis visualization system are provided for a chosen time period. Energy consumption and workload calculation, with identification of eight different states, were carried out for two lighting sections and one climate control / conditioning emulating system from the integral energy consumption profile. The research was supported by university internal grant №2016/PI-2 «Methodology development of monitoring and heat flow utilization as low potential company energy sources».
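
    The clusterization step of such work-mode recognition can be sketched with k-means on slot-level consumption; the profile below is synthetic and the three modes are assumed for illustration:

```python
# Cluster 15-minute time slots by consumed energy and locate state changes.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
# Synthetic integral consumption profile: off / standby / full-load slots (kWh).
slots = np.concatenate([rng.normal(0.02, 0.005, 40),
                        rng.normal(0.30, 0.030, 30),
                        rng.normal(1.10, 0.080, 26)])

modes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(slots.reshape(-1, 1))
for m in range(3):
    print(f"mode {m}: {np.sum(modes == m):2d} slots, mean {slots[modes == m].mean():.2f} kWh")

# State-change detection: slot indices where the recognized mode switches.
print("state changes at slots:", np.flatnonzero(np.diff(modes) != 0) + 1)
```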

  2. Unconventional Coding Technique Applied to Multi-Level Polarization Modulation

    Science.gov (United States)

    Rutigliano, G. G.; Betti, S.; Perrone, P.

    2016-05-01

    A new technique is proposed to improve information confidentiality in optical-fiber communications without bandwidth consumption. A pseudorandom vectorial sequence was generated by a dynamic system algorithm and used to codify a multi-level polarization modulation based on the Stokes vector. Optical-fiber birefringence, usually considered as a disturbance, was exploited to obfuscate the signal transmission. At the receiver end, the same pseudorandom sequence was generated and used to decode the multi-level polarization modulated signal. The proposed scheme, working at the physical layer, provides strong information security without introducing complex processing and thus latency.

  3. Surgical treatment of scoliosis: a review of techniques currently applied

    Directory of Open Access Journals (Sweden)

    Maruyama Toru

    2008-04-01

    Full Text Available In this review, basic knowledge and recent innovations in the surgical treatment of scoliosis are described. Surgical treatment for scoliosis is indicated, in general, for curves exceeding 45 or 50 degrees by the Cobb method, on the grounds that: (1) curves larger than 50 degrees progress even after skeletal maturity; (2) curves of greater magnitude cause loss of pulmonary function, and much larger curves cause respiratory failure; (3) the larger a curve becomes, the more difficult it is to treat with surgery. Posterior fusion with instrumentation has been the standard surgical treatment for scoliosis. In modern instrumentation systems, more anchors are used to connect the rod and the spine, resulting in better correction and less frequent implant failures. Segmental pedicle screw constructs or hybrid constructs using pedicle screws, hooks, and wires are the trend today. Anterior instrumentation surgery had been a treatment of choice for thoracolumbar and lumbar scoliosis because better correction can be obtained with shorter fusion levels. Recently, the superiority of anterior surgery for thoracolumbar and lumbar scoliosis has been lost. Initial enthusiasm for anterior instrumentation for the thoracic curve using the video-assisted thoracoscopic surgery technique has faded. Various attempts are being made with the use of fusionless surgery. To control growth, epiphysiodesis on the convex side of the deformity, with or without instrumentation, is a technique to provide gradual progressive correction and to arrest the deterioration of the curves. To avoid fusion in skeletally immature children with spinal cord injury or myelodysplasia, vertebral wedge osteotomies are performed for the treatment of progressive paralytic scoliosis. For right thoracic curves with idiopathic scoliosis, multiple vertebral wedge osteotomies without fusion are performed. To provide correction and maintain it during the growing years while allowing spinal growth for…

  4. Signal Processing Techniques Applied in RFI Mitigation of Radio Astronomy

    Directory of Open Access Journals (Sweden)

    Sixiu Wang

    2012-08-01

    Full Text Available Radio broadcast and telecommunications are present at different power levels everywhere on Earth. Radio Frequency Interference (RFI) substantially limits the sensitivity of existing radio telescopes in several frequency bands and may prove to be an even greater obstacle for the next generation of telescopes (or arrays) to overcome. A variety of RFI detection and mitigation techniques have been developed in recent years. This study describes various signal processing methods for RFI mitigation in radio astronomy and chooses time-frequency domain cancellation to eliminate certain interference and effectively improve the signal-to-noise ratio in pulsar observations. Finally, RFI mitigation research and implementations in Chinese radio astronomy are also presented.
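
    A minimal sketch of time-frequency domain cancellation, under our own assumptions: a synthetic narrow-band interferer and a robust median-based threshold. The actual detector and parameters used in the study are not specified in the abstract.

      import numpy as np
      from scipy.signal import spectrogram

      # Synthetic telescope voltage stream: noise-like signal of interest
      # plus a strong narrow-band interferer (all parameters invented).
      fs = 4096.0
      t = np.arange(0, 4, 1 / fs)
      rng = np.random.default_rng(2)
      x = rng.normal(0, 1, t.size) + 5 * np.sin(2 * np.pi * 700 * t)

      f, tt, S = spectrogram(x, fs=fs, nperseg=256)

      # Flag time-frequency cells above a robust (median + k*MAD) threshold
      # and replace them with the background level (cancellation step).
      med = np.median(S)
      mad = np.median(np.abs(S - med))
      mask = S > med + 10 * mad
      S_clean = np.where(mask, med, S)
      print(f"flagged {mask.mean():.1%} of time-frequency cells")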

  5. Discrete filtering techniques applied to sequential GPS range measurements

    Science.gov (United States)

    Vangraas, Frank

    1987-01-01

    The basic navigation solution is described for position and velocity based on range and delta range (Doppler) measurements from NAVSTAR Global Positioning System satellites. The application of discrete filtering techniques is examined to reduce the white noise distortions on the sequential range measurements. A second order (position and velocity states) Kalman filter is implemented to obtain smoothed estimates of range by filtering the dynamics of the signal from each satellite separately. Test results using a simulated GPS receiver show a steady-state noise reduction (the input noise variance divided by the output noise variance) of a factor of four. Recommendations for further noise reduction, based on higher order Kalman filters or additional delta range measurements, are included.
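
    A minimal sketch of a second-order (position and velocity states) Kalman filter of the kind described, applied to simulated noisy range measurements from one satellite. The noise levels, measurement interval and range-rate are illustrative assumptions, not values from the paper.

      import numpy as np

      dt = 1.0                                  # measurement interval (s)
      F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-velocity dynamics
      H = np.array([[1.0, 0.0]])                # range is observed directly
      Q = 1e-3 * np.array([[dt**3 / 3, dt**2 / 2],
                           [dt**2 / 2, dt]])    # assumed process noise
      R = np.array([[25.0]])                    # 5 m measurement sigma (assumed)

      x = np.array([[0.0], [10.0]])             # state: [range; range rate]
      P = np.eye(2) * 100.0

      rng = np.random.default_rng(3)
      truth = 10.0 * np.arange(50)              # range opening at 10 m/s
      meas = truth + rng.normal(0, 5, 50)

      for z in meas:
          x, P = F @ x, F @ P @ F.T + Q         # predict
          innov = z - (H @ x)[0, 0]             # innovation
          S = (H @ P @ H.T + R)[0, 0]
          K = P @ H.T / S                       # Kalman gain
          x = x + K * innov                     # update
          P = (np.eye(2) - K @ H) @ P

      print(f"final range {x[0, 0]:.1f} m, range rate {x[1, 0]:.2f} m/s")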

  6. Applying Business Process Modeling Techniques: Case Study

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2010-12-01

    Full Text Available The selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on the proper understanding of the functionality of the information systems that support the activity of the organization. A number of business process modeling notations were implemented in practice in recent decades. The most significant of these notations include ARIS, Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, we assess whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the research methodology is discussed. The following section presents selected case study results. The paper is concluded with a summary.

  7. Quantitative Portfolio Optimization Techniques Applied to the Brazilian Stock Market

    Directory of Open Access Journals (Sweden)

    André Alves Portela Santos

    2012-09-01

    Full Text Available In this paper we assess the out-of-sample performance of two alternative quantitative portfolio optimization techniques - mean-variance and minimum variance optimization – and compare their performance with respect to a naive 1/N (or equally-weighted) portfolio and also to the market portfolio given by the Ibovespa. We focus on short selling-constrained portfolios and consider alternative estimators for the covariance matrices: the sample covariance matrix, RiskMetrics, and three covariance estimators proposed by Ledoit and Wolf (2003), Ledoit and Wolf (2004a) and Ledoit and Wolf (2004b). Taking into account alternative portfolio re-balancing frequencies, we compute out-of-sample performance statistics which indicate that the quantitative approaches delivered improved results in terms of lower portfolio volatility and better risk-adjusted returns. Moreover, the use of more sophisticated estimators for the covariance matrix generated optimal portfolios with lower turnover over time.
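
    A hedged sketch of short-selling-constrained minimum-variance optimization with the sample covariance estimator. The return series is simulated, and the SLSQP solver is our implementation choice, not necessarily the authors'.

      import numpy as np
      from scipy.optimize import minimize

      # Simulated daily returns for 5 assets (stand-in for Ibovespa stocks).
      rng = np.random.default_rng(4)
      R = rng.normal(0.0005, 0.02, size=(500, 5))
      Sigma = np.cov(R, rowvar=False)           # sample covariance estimator

      n = Sigma.shape[0]
      res = minimize(lambda w: w @ Sigma @ w,   # portfolio variance
                     x0=np.full(n, 1.0 / n),
                     method="SLSQP",
                     bounds=[(0.0, 1.0)] * n,   # short selling constrained
                     constraints={"type": "eq",
                                  "fun": lambda w: w.sum() - 1.0})
      w_mv, w_eq = res.x, np.full(n, 1.0 / n)
      print("min-variance weights:", np.round(w_mv, 3))
      print("1/N volatility      :", np.sqrt(w_eq @ Sigma @ w_eq))
      print("min-var volatility  :", np.sqrt(w_mv @ Sigma @ w_mv))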

  8. A Robust Text Processing Technique Applied to Lexical Error Recovery

    CERN Document Server

    Ingels, P

    1999-01-01

    This thesis addresses automatic lexical error recovery and tokenization of corrupt text input. We propose a technique that can automatically correct misspellings, segmentation errors and real-word errors in a unified framework that uses both a model of language production and a model of the typing behavior, and which makes tokenization part of the recovery process. The typing process is modeled as a noisy channel where Hidden Markov Models are used to model the channel characteristics. Weak statistical language models are used to predict what sentences are likely to be transmitted through the channel. These components are held together in the Token Passing framework which provides the desired tight coupling between orthographic pattern matching and linguistic expectation. The system, CTR (Connected Text Recognition), has been tested on two corpora derived from two different applications, a natural language dialogue system and a transcription typing scenario. Experiments show that CTR can automatically correct...

  9. Applying Data Privacy Techniques on Tabular Data in Uganda

    CERN Document Server

    Mivule, Kato

    2011-01-01

    The growth of Information Technology (IT) in Africa has led to an increase in the utilization of communication networks for data transaction across the continent. A growing number of entities in the private sector, academia, and government have deployed the Internet as a medium to transact in data, routinely posting statistical and non-statistical data online and thereby making many in Africa increasingly dependent on the Internet for data transactions. In the country of Uganda, exponential growth in data transaction has presented a new challenge: what is the most efficient way to implement data privacy? This article discusses the data privacy challenges faced by the country of Uganda and the implementation of data privacy techniques for published tabular data. We make the case for data privacy, survey concepts of data privacy, and discuss implementations that could be employed to provide data privacy in Uganda.
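
    As a small illustration of tabular privacy techniques of the kind surveyed (generalization of quasi-identifiers and suppression of rare cells), the sketch below is ours; the field names and the threshold k are hypothetical.

      from collections import Counter

      # Illustrative micro-data; field names and values are invented.
      records = [
          {"age": 34, "district": "Kampala", "income": 1200},
          {"age": 37, "district": "Kampala", "income": 1500},
          {"age": 36, "district": "Gulu", "income": 900},
      ]

      def generalize(rec):
          """Coarsen the quasi-identifier 'age' into a ten-year band."""
          lo = (rec["age"] // 10) * 10
          return {"age": f"{lo}-{lo + 9}", "district": rec["district"],
                  "income": rec["income"]}

      anon = [generalize(r) for r in records]

      # Suppress records whose quasi-identifier combination is too rare.
      k = 2
      counts = Counter((r["age"], r["district"]) for r in anon)
      published = [r for r in anon
                   if counts[(r["age"], r["district"])] >= k]
      print(published)   # the lone Gulu record is suppressed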

  10. Object Detection Techniques Applied on Mobile Robot Semantic Navigation

    Directory of Open Access Journals (Sweden)

    Carlos Astua

    2014-04-01

    Full Text Available The future of robotics predicts that robots will integrate themselves more every day with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a big need for algorithms that provide robots with these sorts of skills, from locating the objects needed to accomplish a task to treating those objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation. It aims to use current trends in robotics in a way that can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and the two are combined to overcome their respective limitations. Finally, the code is tested on a real robot to prove its accuracy and efficiency.
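
    A rough sketch of the two proposed detection routes, contour detection and descriptor-based matching, on a synthetic frame. The OpenCV calls and all parameters are our assumptions; the paper's actual pipeline and thresholds are not given in the abstract.

      import cv2
      import numpy as np

      # Synthetic textured frame stands in for the robot's camera image.
      rng = np.random.default_rng(5)
      img = rng.integers(0, 60, size=(240, 320), dtype=np.uint8)  # background
      cv2.rectangle(img, (60, 60), (140, 160), 220, -1)           # bright "object"

      # Route 1: contour detection isolates candidate object silhouettes.
      edges = cv2.Canny(img, 80, 160)
      contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                     cv2.CHAIN_APPROX_SIMPLE)
      print(f"{len(contours)} candidate contour(s)")

      # Route 2: descriptor-based recognition (ORB) matches the scene
      # against a stored template of the object.
      template = img[50:170, 50:150].copy()
      orb = cv2.ORB_create()
      kp_t, des_t = orb.detectAndCompute(template, None)
      kp_s, des_s = orb.detectAndCompute(img, None)
      if des_t is not None and des_s is not None:
          matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
          matches = matcher.match(des_t, des_s)
          print(f"{len(matches)} descriptor matches")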

  11. Object detection techniques applied on mobile robot semantic navigation.

    Science.gov (United States)

    Astua, Carlos; Barber, Ramon; Crespo, Jonathan; Jardon, Alberto

    2014-04-11

    The future of robotics predicts that robots will integrate themselves more every day with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a big need for algorithms that provide robots with these sorts of skills, from locating the objects needed to accomplish a task to treating those objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation. It aims to use current trends in robotics in a way that can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and the two are combined to overcome their respective limitations. Finally, the code is tested on a real robot to prove its accuracy and efficiency.

  12. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    Science.gov (United States)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course was introduced at the University of Puget Sound entitled 'Mineral Resources and the Environment'. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects around the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think-pair-share and take-home point summaries. Summative assessment included discussion leadership, exams, homework, group projects, in-class exercises, field trips, and pre-discussion reading exercises.

  13. Technology Assessment of Dust Suppression Techniques applied During Structural Demolition

    Energy Technology Data Exchange (ETDEWEB)

    Boudreaux, J.F.; Ebadian, M.A.; Dua, S.K.

    1997-08-06

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure and, at the same time, minimize the amount of dust generated by a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology. Thus, the purpose of this research, which was conducted by the Hemispheric Center for Environmental Technology (HCET) at Florida International University (FIU), was to perform an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear decontamination and decommissioning (D&D). This experimental study specifically targeted the problem of dust suppression during demolition. The resulting data were used in the development of mathematical correlations that can be applied to structural demolition. In Fiscal Year 1996 (FY96), the effectiveness of different dust-suppressing agents was investigated for different types of concrete blocks. Initial tests were conducted in a broad particle size range. In Fiscal Year 1997 (FY97), additional tests were performed in the size range in which most of the particles were detected. Since particle distribution is an important parameter for predicting deposition in various compartments of the human respiratory tract, various tests were aimed at determining the particle size distribution of the airborne dust particles. The effectiveness of dust-suppressing agents for particles of various sizes was studied. Instead of conducting experiments on various types of blocks, it was thought prudent to carry out additional tests on blocks of the same type. Several refinements were also incorporated into the test procedures and data acquisition system used in FY96.

  14. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  15. Dust tracking techniques applied to the STARDUST facility: First results

    Energy Technology Data Exchange (ETDEWEB)

    Malizia, A., E-mail: malizia@ing.uniroma2.it [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Camplani, M. [Grupo de Tratamiento de Imágenes, E.T.S.I de Telecomunicación, Universidad Politécnica de Madrid (Spain); Gelfusa, M. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Lupelli, I. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); EURATOM/CCFE Association, Culham Science Centre, Abingdon (United Kingdom); Richetta, M.; Antonelli, L.; Conetta, F.; Scarpellini, D.; Carestia, M.; Peluso, E.; Bellecci, C. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Salgado, L. [Grupo de Tratamiento de Imágenes, E.T.S.I de Telecomunicación, Universidad Politécnica de Madrid (Spain); Video Processing and Understanding Laboratory, Universidad Autónoma de Madrid (Spain); Gaudio, P. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy)

    2014-10-15

    Highlights: •Use of an experimental facility, STARDUST, to analyze the dust resuspension problem inside the tokamak in case of loss of vacuum accident. •PIV technique implementation to track the dust during a LOVA reproduction inside STARDUST. •Data imaging techniques to analyze dust velocity field: first results and data discussion. -- Abstract: An important issue related to future nuclear fusion reactors fueled with deuterium and tritium is the creation of large amounts of dust due to several mechanisms (disruptions, ELMs and VDEs). The dust size expected in nuclear fusion experiments (such as ITER) is in the order of microns (between 0.1 and 1000 μm). Almost the total amount of this dust remains in the vacuum vessel (VV). This radiological dust can re-suspend in case of LOVA (loss of vacuum accident), and these phenomena can cause explosions and serious damage to the health of the operators and to the integrity of the device. The authors have developed a facility, STARDUST, in order to reproduce thermo-fluid-dynamic conditions comparable to those expected inside the VV of the next generation of experiments, such as ITER, in case of LOVA. The dust used inside the STARDUST facility presents particle sizes and physical characteristics comparable with those created inside the VV of nuclear fusion experiments. In this facility an experimental campaign has been conducted with the purpose of tracking the dust re-suspended at low pressurization rates (comparable to those expected in case of LOVA in ITER and suggested by the General Safety and Security Report ITER-GSSR) using a fast camera with a frame rate from 1000 to 10,000 images per second. The velocity fields of the mobilized dust are derived from the imaging of a two-dimensional slice of the flow illuminated by an optically adapted laser beam. The aim of this work is to demonstrate the possibility of dust tracking by means of image processing, with the objective of determining the velocity field values.
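
    A minimal sketch of the core PIV step, estimating the displacement between two interrogation windows from the peak of their FFT cross-correlation. The frames below are synthetic stand-ins, not STARDUST data, and the window size is an assumption.

      import numpy as np

      def piv_displacement(win_a, win_b):
          """Displacement of win_b relative to win_a from the peak of the
          FFT cross-correlation, the core operation of PIV processing."""
          a = win_a - win_a.mean()
          b = win_b - win_b.mean()
          corr = np.fft.ifft2(np.fft.fft2(b) * np.conj(np.fft.fft2(a))).real
          peak = np.unravel_index(np.argmax(corr), corr.shape)
          # Unwrap peaks past the half-window into negative shifts.
          return tuple(p if p <= s // 2 else p - s
                       for p, s in zip(peak, corr.shape))

      rng = np.random.default_rng(6)
      frame_a = rng.random((64, 64))                          # seeded "dust" pattern
      frame_b = np.roll(frame_a, shift=(3, -2), axis=(0, 1))  # dust moved 3, -2 px
      print(piv_displacement(frame_a, frame_b))               # -> (3, -2)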

  16. Optical Trapping Techniques Applied to the Study of Cell Membranes

    Science.gov (United States)

    Morss, Andrew J.

    Optical tweezers allow for manipulating micron-sized objects using pN level optical forces. In this work, we use an optical trapping setup to aid in three separate experiments, all related to the physics of the cellular membrane. In the first experiment, in conjunction with Brian Henslee, we use optical tweezers to allow for precise positioning and control of cells in suspension to evaluate the cell size dependence of electroporation. Theory predicts that all cells porate at a transmembrane potential V_TM of roughly 1 V. The Schwann equation predicts that the transmembrane potential depends linearly on the cell radius r, thus predicting that cells should porate at threshold electric fields that go as 1/r. The threshold field required to induce poration is determined by applying a low voltage pulse to the cell and then applying additional pulses of greater and greater magnitude, checking for poration at each step using propidium iodide dye. We find that, contrary to expectations, cells do not porate at a constant value of the transmembrane potential but at a constant value of the electric field which we find to be 692 V/cm for K562 cells. Delivering precise dosages of nanoparticles into cells is of importance for assessing toxicity of nanoparticles or for genetic research. In the second experiment, we conduct nano-electroporation—a novel method of applying precise doses of transfection agents to cells—by using optical tweezers in conjunction with a confocal microscope to manipulate cells into contact with 100 nm wide nanochannels. This work was done in collaboration with Pouyan Boukany of Dr. Lee's group. The small cross sectional area of these nanochannels means that the electric field within them is extremely large, 60 MV/m, which allows them to electrophoretically drive transfection agents into the cell. We find that nano-electroporation results in excellent dose control (to within 10% in our experiments) compared to bulk electroporation. We also find that
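
    A short worked example of the Schwann-equation prediction the experiment tests: at the pole of a spherical cell in a uniform field, V_TM = 1.5 r E, so a fixed poration threshold of about 1 V implies a threshold field scaling as 1/r. The radii below are illustrative, not the cells measured in the study.

      import numpy as np

      # Schwann (steady-state) estimate: V_TM = 1.5 * r * E at the cell pole.
      r = np.array([5e-6, 8e-6, 12e-6])        # illustrative cell radii (m)
      V_threshold = 1.0                        # classical ~1 V poration assumption
      E_threshold = V_threshold / (1.5 * r)    # threshold field in V/m, goes as 1/r
      for ri, Ei in zip(r, E_threshold):
          print(f"r = {ri * 1e6:4.0f} um -> E_th = {Ei / 100:.0f} V/cm")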

  17. Computer Science Techniques Applied to Parallel Atomistic Simulation

    Science.gov (United States)

    Nakano, Aiichiro

    1998-03-01

    Recent developments in parallel processing technology and multiresolution numerical algorithms have established large-scale molecular dynamics (MD) simulations as a new research mode for studying materials phenomena such as fracture. However, this requires large system sizes and long simulated times. We have developed: i) Space-time multiresolution schemes; ii) fuzzy-clustering approach to hierarchical dynamics; iii) wavelet-based adaptive curvilinear-coordinate load balancing; iv) multilevel preconditioned conjugate gradient method; and v) spacefilling-curve-based data compression for parallel I/O. Using these techniques, million-atom parallel MD simulations are performed for the oxidation dynamics of nanocrystalline Al. The simulations take into account the effect of dynamic charge transfer between Al and O using the electronegativity equalization scheme. The resulting long-range Coulomb interaction is calculated efficiently with the fast multipole method. Results for temperature and charge distributions, residual stresses, bond lengths and bond angles, and diffusivities of Al and O will be presented. The oxidation of nanocrystalline Al is elucidated through immersive visualization in virtual environments. A unique dual-degree education program at Louisiana State University will also be discussed in which students can obtain a Ph.D. in Physics & Astronomy and a M.S. from the Department of Computer Science in five years. This program fosters interdisciplinary research activities for interfacing High Performance Computing and Communications with large-scale atomistic simulations of advanced materials. This work was supported by NSF (CAREER Program), ARO, PRF, and Louisiana LEQSF.

  18. Applying Website Usability Testing Techniques to Promote E-services

    Directory of Open Access Journals (Sweden)

    Abdel Nasser H. Zaied

    2015-09-01

    Full Text Available In this competitive world, websites are considered to be a key aspect of any organization's competitiveness. In addition to visual esthetics, the usability of a website is a strong determinant of users' satisfaction and pleasure. However, a lack of appropriate techniques and attributes for measuring usability may constrain the usefulness of a website. To address this issue, we conducted a statistical study to evaluate the usability levels of e-learning and e-training websites based on human (user) perception. The questionnaire is implemented as a user-based tool; visitors of a website can use it to evaluate the website's usability. The results showed that, from the students' point of view, personalization is the most important criterion for the use of e-learning websites, while from the experts' point of view, accessibility is the most important criterion. The results also indicated that experienced respondents were satisfied with the usability attributes of the e-learning websites they accessed for their learning purposes, while inexperienced students expressed their perception of the importance of the usability attributes for accessing e-learning websites. When both findings are combined and compared, it is evident that all the attributes yielded satisfaction and were felt to be important.

  19. Comparison between different techniques applied to quartz CPO determination in granitoid mylonites

    Science.gov (United States)

    Fazio, Eugenio; Punturo, Rosalda; Cirrincione, Rosolino; Kern, Hartmut; Wenk, Hans-Rudolph; Pezzino, Antonino; Goswami, Shalini; Mamtani, Manish

    2016-04-01

    Since the second half of the last century, several techniques have been adopted to resolve the crystallographic preferred orientation (CPO) of the major minerals constituting crustal and mantle rocks. To this aim, many efforts have been made to increase the accuracy of the analytical devices as well as to progressively reduce the time needed to perform microstructural analysis. Notably, many of these microstructural studies deal with quartz CPO because of the wide occurrence of this mineral phase in crustal rocks as well as its quite simple chemical composition. In the present work, four different techniques were applied to determine the CPOs of dynamically recrystallized quartz domains from naturally deformed rocks collected from a ductile crustal-scale shear zone, in order to compare their advantages and limitations. The selected Alpine shear zone is located in the Aspromonte Massif (Calabrian Peloritani Orogen, southern Italy) and comprises granitoid lithotypes. The adopted methods range from the "classical" universal stage (US) to the CIP image analysis technique, electron backscattered diffraction (EBSD) and time-of-flight neutron diffraction (TOF). When compared, the bulk texture pole figures obtained by means of these different techniques show a good correlation. Advances in the analytical techniques used for microstructural investigations are outlined by discussing the quartz CPO results presented in this study.

  20. Applying traditional signal processing techniques to social media exploitation for situational understanding

    Science.gov (United States)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
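
    As a toy illustration of mitigating the collective distortion introduced by conflicted groups, the sketch below iteratively re-weights sources by their agreement with the current consensus. This is a generic truth-discovery heuristic of our own, not the Apollo implementation.

      import numpy as np

      # Sources report whether each of 6 physical events happened (1) or not (0).
      # The report matrix and the source mix are invented for illustration.
      reports = np.array([
          [1, 1, 0, 1, 0, 1],   # reliable source
          [1, 1, 0, 1, 0, 0],   # mostly reliable source
          [0, 0, 1, 0, 1, 0],   # conflicted group distorting the signal
      ])
      w = np.ones(reports.shape[0]) / reports.shape[0]

      for _ in range(10):
          belief = w @ reports                   # weighted vote per event
          truth = (belief > 0.5).astype(float)   # provisional event estimate
          agree = (reports == truth).mean(axis=1)
          w = agree / agree.sum()                # agreeing sources gain weight

      print("estimated events:", truth)
      print("source weights  :", np.round(w, 2))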

  1. Sterile insect technique applied to Queensland fruit fly

    International Nuclear Information System (INIS)

    The Sterile Insect Technique (SIT) aims to suppress or eradicate pest populations by flooding wild populations with sterile males. To control fruit fly, millions of flies of both sexes are mass-reared at the Gosford Post-Harvest laboratory near Sydney, mixed with sawdust and fluorescent dye at the pupal stage and transported to ANSTO, where they are exposed to a low dose of 70-75 Gy of gamma radiation from a Cobalt-60 source. Following irradiation, the pupae are transported to the release site in plastic sleeves and then transferred to large plastic garbage bins for hatching. These bins are held at 30 deg. C to synchronise hatching, and flies are released 48-72 hours after hatching begins. In most cases these bins are placed among fruit trees in the form of an 800 metre grid. This maximises survival of the emerging flies, which are released on an almost daily basis. Progress of the SIT program is monitored by collecting flies from traps dotted all over the infested site. The ratio of sterile to wild flies can be determined because the sterile flies are coated with the fluorescent dust, which can be seen under ultraviolet light. If the SIT program is successful, entomologists will trap a high proportion of sterile flies relative to wild flies, and this should result in a clear reduction in maggot infestations. Surveillance, quarantine and trapping activities continue for 8 or 9 months to check for any surviving pockets of infestation. If any are found, the SIT program is reactivated. These programs demonstrated that SIT is an efficient and environmentally friendly non-chemical control method for eradicating outbreaks or suppressing fruit fly populations in important fruit growing areas. ills

  2. Applying perceptual and adaptive learning techniques for teaching introductory histopathology

    Directory of Open Access Journals (Sweden)

    Sally Krasne

    2013-01-01

    Full Text Available Background: Medical students are expected to master the ability to interpret histopathologic images, a difficult and time-consuming process. A major problem is the issue of transferring information learned from one example of a particular pathology to a new example. Recent advances in cognitive science have identified new approaches to address this problem. Methods: We adapted a new approach for enhancing pattern recognition of basic pathologic processes in skin histopathology images that utilizes perceptual learning techniques, allowing learners to see relevant structure in novel cases, along with adaptive learning algorithms that space and sequence the different categories (e.g., diagnoses) that appear during a learning session based on each learner's accuracy and response time (RT). We developed a perceptual and adaptive learning module (PALM) that utilized 261 unique images of cell injury, inflammation, neoplasia or normal histology at low and high magnification. Accuracy and RT were tracked and integrated into a "Score" that reflected students' rapid recognition of the pathologies, and pre- and post-tests were given to assess the effectiveness. Results: Accuracy, RT and Scores significantly improved from the pre- to post-test, with Scores showing much greater improvement than accuracy alone. Delayed post-tests with previously unseen cases, given after 6-7 weeks, showed a decline in accuracy relative to the post-test for 1st-year students, but not significantly so for 2nd-year students. However, the delayed post-test scores maintained a significant and large improvement relative to those of the pre-test for both 1st- and 2nd-year students, suggesting good retention of pattern recognition. Student evaluations were very favorable. Conclusion: A web-based learning module based on the principles of cognitive science showed evidence of improved recognition of histopathology patterns by medical students.

  3. Wavelet analysis applied to the IRAS cirrus

    Science.gov (United States)

    Langer, William D.; Wilson, Robert W.; Anderson, Charles H.

    1994-01-01

    The structure of infrared cirrus clouds is analyzed with Laplacian pyramid transforms, a form of non-orthogonal wavelets. Pyramid and wavelet transforms provide a means to decompose images into their spatial frequency components such that all spatial scales are treated in an equivalent manner. The multiscale transform analysis is applied to IRAS 100 micrometer maps of cirrus emission in the north Galactic pole region to extract features on different scales. In the maps we identify filaments, fragments and clumps by separating all connected regions. These structures are analyzed with respect to their Hausdorff dimension for evidence of the scaling relationships in the cirrus clouds.
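
    A minimal Laplacian-pyramid decomposition in the spirit described above, where each level keeps the band-pass detail between two successive Gaussian blurs. The filter width, level count and the random stand-in image are all assumptions.

      import numpy as np
      from scipy.ndimage import gaussian_filter

      def laplacian_pyramid(img, levels=3):
          """Non-orthogonal wavelet-like decomposition into band-pass levels."""
          pyramid, current = [], img.astype(float)
          for _ in range(levels):
              low = gaussian_filter(current, sigma=1.0)
              pyramid.append(current - low)      # band-pass detail at this scale
              current = low[::2, ::2]            # decimate for the next octave
          pyramid.append(current)                # residual low-pass image
          return pyramid

      # Random field stands in for an IRAS 100 micrometer cirrus map.
      cirrus = np.random.default_rng(7).random((128, 128))
      for i, band in enumerate(laplacian_pyramid(cirrus)):
          print(f"level {i}: shape {band.shape}, rms {band.std():.3f}")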

  4. Thermal analysis applied to irradiated propolis

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; Mastro, N.L. del E-mail: nelida@usp.br

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, of great importance for technological applications. Ground propolis samples were 60Co gamma irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600 deg. C. Similarly, differential scanning calorimetry showed coinciding melting points for irradiated and unirradiated samples. The results suggest that the irradiation process does not affect the thermal properties of propolis irradiated up to 10 kGy.

  5. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors, with their high fluxes of neutrons from the fission of 235U, give the most intense irradiation and hence the highest available sensitivities for NAA. In this paper, applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.
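
    For readers unfamiliar with the quantitative basis of NAA, the induced activity of a single activation product follows the standard activation equation A = N σ φ (1 - e^(-λt)). The sketch below evaluates it for illustrative, roughly manganese-like parameters that are not taken from this paper.

      import numpy as np

      # Standard activation equation; all numerical values are illustrative
      # (order of magnitude of Mn-55 -> Mn-56), not data from the paper.
      N_A = 6.022e23
      mass_g, molar_mass = 1e-3, 54.94         # 1 mg of manganese
      sigma = 13.3e-24                         # thermal capture cross-section (cm^2)
      phi = 1e13                               # reactor flux (n cm^-2 s^-1)
      lam = np.log(2) / (2.58 * 3600)          # decay constant of the product

      N = mass_g / molar_mass * N_A            # number of target atoms
      t_irr = 3600.0                           # 1 h irradiation
      A = N * sigma * phi * (1 - np.exp(-lam * t_irr))
      print(f"induced activity ~ {A:.2e} Bq")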

  6. Improvement of dosimetry techniques applied in thermoluminescence dating

    International Nuclear Information System (INIS)

    This work deals with the determination of the natural dose rates needed in the thermoluminescence (TL) dating method. These rates are low and arise from several types of radiation: alpha, beta, gamma and cosmic rays generated by natural sources. For an accurate estimation of the dose, we first calibrate the reference sources used. A detailed study of alpha counting by scintillation is then carried out; this method has been improved and calibrated with samples of well-known composition. The sources of error are detailed: overcounting, radon loss, etc. The beta dose absorption coefficients (used in the quartz inclusion method) calculated by Mejdhal have been recalculated, with an attempt at experimental evaluation. To improve the measurement of the gamma dose we use analytical and direct methods: TL dosimetry, in situ gamma spectrometry and physico-chemical analysis. This implied the use of a new thermoluminescence dosimeter, carbon-doped alumina, and an accurate estimate of the gammameter calibration corrections. A critical comparison is carried out on well-known reference sites. Results from independent methods allow the annual dose to be estimated with an uncertainty of about 2%. The enclosure method, which consists of mixing the powdered sample with TL dosimeters insensitive to external alpha particles, has been tested; this simulation allowed the annual dose to be measured without any correction factors. A satisfactory agreement has been noted between quite independent techniques. (orig.)

  7. Radioanalytical techniques applied to environmental chemistry: A two case study

    International Nuclear Information System (INIS)

    Collection of atmospheric aerosol began at Mauna Loa Observatory in February of 1979. A sector controller was installed in May of 1980. Weekly samples were collected. One half of the particle filters were analyzed by instrumental neutron activation analysis (INAA) for elemental quantitation and the other half was analyzed by ion chromatography (IC) for sulfate. Sampling was later modified to include four base-treated filters sequentially. One half of the particle filter was still analyzed by INAA and one quarter was analyzed by IC for anions of chloride, nitrate and sulfate. One half of the four base-treated filters was also analyzed by IC. This yields a twelve-year data record of weekly elemental aerosol concentrations. The most dominant source of trace metals to the 3400 m MLO site is crustal material. The large dust storms in the Chinese deserts loft immense amounts of crustal material into the troposphere every year. Other sources of aerosol particles in the free troposphere are marine and anthropogenic. Anthropogenic sources tend to contribute less than marine sources overall. One pollution episode in 1989 perturbed the vanadium budget by three orders of magnitude. Acidic gaseous nitrogen species and particulate nitrate both follow a clear seasonal trend, which is not highly correlated with the Asian dust episodes. Acidic gaseous chloride and particulate chloride together yield a marine enrichment factor near one, implying that the source of the gaseous chloride is sea salt, which has reacted with the H2SO4 in the atmosphere to release HCl gas. Most of the sulfate found exists as particles rather than in acidic gaseous sulfur species. First, prompt gamma-ray spectroscopy is discussed. A preliminary study of sewage sludge was done with the Los Alamos National Lab.'s PGRS system. The results are presented with projected detection limits of various elements relative to hydrogen

  8. Radioanalytical Techniques Applied to Environmental Chemistry: a Two Case Study.

    Science.gov (United States)

    Holmes, Jennifer L.

    Case I. Collection of atmospheric aerosol began at Mauna Loa Observatory in February of 1979. A sector controller was installed in May of 1980. Weekly samples were collected. One half of the particle filters were analyzed by instrumental neutron activation analysis (INAA) for elemental quantitation and the other half was analyzed by ion chromatography (IC) for sulfate. Sampling was later modified to include four base-treated filters sequentially. One half of the particle filter was still analyzed by INAA and one quarter was analyzed by IC for anions of chloride, nitrate and sulfate. One half of the four base-treated filters was also analyzed by IC. This yields a twelve-year data record of weekly elemental aerosol concentrations. The most dominant source of trace metals to the 3400 m MLO site is crustal material. The large dust storms in the Chinese deserts loft immense amounts of crustal material into the troposphere every year. A typical elemental signature for Asian dust at MLO is determined. Other sources for aerosol particles in the free troposphere are marine and anthropogenic. Anthropogenic sources tend to contribute less than marine sources overall. One pollution episode in 1989 perturbed the vanadium budget by nearly three orders of magnitude. Also crustal excess values for several elements are modeled poorly with a local Hawaiian Basalt signature. Acidic gaseous nitrogen species and particulate nitrate both follow a clear seasonal trend, which is not highly correlated to the Asian dust episodes. Acidic gaseous chloride and particulate chloride together yield a marine enrichment factor near one, implying that the source for the gaseous chloride is sea salt, which has reacted with the H2SO4 in atmosphere to release HCl gas. Most of the sulfate found exists as a particle rather than in an acidic gaseous sulfur species. Case II. First, prompt gamma-ray spectroscopy is discussed. A preliminary study of sewage sludge was done at the Los Alamos National

  9. Guidelines for depth data collection in rivers when applying interpolation techniques (kriging) for river restoration

    Directory of Open Access Journals (Sweden)

    M. Rivas-Casado

    2007-05-01

    Full Text Available River restoration appraisal requires the implementation of monitoring programmes that assess the river site before and after the restoration project. However, little work has yet been done to design effective and efficient sampling strategies. Three main variables need to be considered when designing monitoring programmes: space, time and scale. The aim of this paper is to describe the methodology applied to analyse the variation of depth in space, scale and time so that more comprehensive monitoring programmes can be developed. Geostatistical techniques were applied to study the spatial dimension (sampling strategy and density), spectral analysis was used to study the scale at which depth shows cyclic patterns, whilst descriptive statistics were used to assess the temporal variation. A brief set of guidelines has been summarised in the conclusion.
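
    A minimal sketch of the geostatistical building block implied here, the experimental semivariogram of depth samples, which underlies kriging. The coordinates, depths, lags and tolerance below are synthetic assumptions.

      import numpy as np

      def semivariogram(coords, values, lags, tol):
          """Experimental semivariogram: half the mean squared difference of
          depths for point pairs separated by (roughly) each lag distance."""
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          dv2 = (values[:, None] - values[None, :]) ** 2
          gamma = []
          for h in lags:
              sel = (np.abs(d - h) < tol) & (d > 0)
              gamma.append(0.5 * dv2[sel].mean() if sel.any() else np.nan)
          return np.array(gamma)

      rng = np.random.default_rng(8)
      xy = rng.random((200, 2)) * 100.0         # survey points in a 100 m reach
      depth = np.sin(xy[:, 0] / 15.0) + rng.normal(0, 0.1, 200)
      lags = np.arange(5.0, 50.0, 5.0)
      print(np.round(semivariogram(xy, depth, lags, tol=2.5), 3))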

  10. Algorithms Design Techniques and Analysis

    CERN Document Server

    Alsuwaiyel, M H

    1999-01-01

    Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on its own using ad hoc techniques or follow those techniques that have produced efficient solutions to similar problems. This requires the understanding of various algorithm design techniques, how and when to use them to formulate solutions and the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm desi

  11. Grid-based Moment Tensor Inversion Technique Applied to Earthquakes Offshore of Northeast Taiwan

    Science.gov (United States)

    Cheng, H.; Lee, S.; Ma, K.

    2010-12-01

    We use a grid-based moment tensor inversion technique and broadband continuous recordings to monitor, in real time, earthquakes offshore northeast Taiwan. The moment tensor inversion technique and a grid search scheme are applied to obtain the source parameters, including the hypocenter, moment magnitude and focal mechanism. In Taiwan, routine moment tensor solutions are reported by the CWB (Central Weather Bureau) and BATS (Broadband Array in Taiwan for Seismology), both of which require some lag time for the information on event time and location before performing CMT (Centroid Moment Tensor) analysis. Using the grid-based moment tensor inversion technique, the event location and focal mechanism can be obtained simultaneously within about two minutes of the occurrence of the earthquake. The inversion procedure is based on a 1-D Green's function database calculated by the frequency-wavenumber (f-k) method. The offshore area northeast of Taiwan was taken as our first test region; it covers 121.5E to 123E, 23.5N to 25N, down to a depth of 136 km. A 3D grid system was set up in this study area with an average grid size of 10 x 10 x 10 km3. We compare our results with earthquakes from 2008 to 2010 that had been analyzed by BATS CMT. We also compare the event times detected by GridMT with the CWB earthquake reports. The results indicate that the grid-based moment tensor inversion system is efficient and practical for real-time monitoring of local seismic activity. Our long-term goal is to use the GridMT technique with fully 3-D Green's functions for the whole of Taiwan in the future.

  12. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different possible network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.
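
    As a toy example of the kind of network centrality metric the book covers, the sketch below computes degree centrality from a hypothetical passing matrix; the numbers are invented for illustration.

      import numpy as np

      # Adjacency matrix of a toy passing network: entry [i, j] counts passes
      # from player i to player j during a match (illustrative numbers).
      passes = np.array([
          [0, 12,  3,  0],
          [8,  0, 10,  2],
          [4,  9,  0,  7],
          [1,  2,  6,  0],
      ])
      out_degree = passes.sum(axis=1)          # passes made
      in_degree = passes.sum(axis=0)           # passes received
      centrality = (out_degree + in_degree) / passes.sum()

      for p, c in enumerate(centrality):
          print(f"player {p}: degree centrality {c:.2f}")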

  13. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analysis methodologies, but none as comprehensive as the current work.
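
    To make the comparison concrete, the sketch below ranks the two parameters of a toy dose model with two families of techniques typically compared in such studies: local one-at-a-time derivatives and sampling-based rank correlation. The model and its coefficients are invented for illustration, not taken from the report.

      import numpy as np

      def model(a, b):
          # Toy dose model: dose = 2*x1 + 0.5*x2**2 (coefficients invented).
          return 2.0 * a + 0.5 * b ** 2

      rng = np.random.default_rng(9)
      x1, x2 = rng.uniform(0.5, 1.5, (2, 1000))
      dose = model(x1, x2)

      # Technique 1: local one-at-a-time derivatives about the nominal point.
      eps, nom = 1e-4, 1.0
      oat = [(model(nom + eps, nom) - model(nom, nom)) / eps,
             (model(nom, nom + eps) - model(nom, nom)) / eps]

      # Technique 2: sampling-based rank correlation of input vs. output.
      def rank_corr(u, v):
          ru = np.argsort(np.argsort(u))
          rv = np.argsort(np.argsort(v))
          return np.corrcoef(ru, rv)[0, 1]

      src = [rank_corr(x1, dose), rank_corr(x2, dose)]
      print("one-at-a-time sensitivities:", np.round(oat, 3))
      print("rank correlations          :", np.round(src, 3))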

  14. Nuclear and conventional techniques applied to the analysis of prehispanic metals of the Templo Mayor of Tenochtitlan; Tecnicas nucleares y convencionales aplicadas al analisis de metales prehispanicos del Templo Mayor de Tenochtitlan

    Energy Technology Data Exchange (ETDEWEB)

    Mendez M, U

    2003-07-01

    The use of experimental techniques such as PIXE, RBS, metallography and SEM, applied to the characterization of prehispanic copper and gold metals coming from 9 offerings of the Templo Mayor of Tenochtitlan, makes it possible to obtain results and well-founded information on aspects such as technological development, cultural and commercial exchange and a relative chronology, as well as on conservation, authenticity, symbolic association and the social meaning of the offerings. More precisely, the objectives outlined for this study are the following. To make interpretations about manufacturing techniques, stylistic designs and cultural and commercial exchanges from aspects such as microstructure, elementary composition, type of alloys, presence of welding, surface gilding and state of conservation. To determine the technological advance represented by the processing of the metallic materials and to know their location in the archaeological context, as a means for interpreting the social significance of the offering. To establish the possible symbolic-religious association of the metallic objects offered to the deities, starting from significant characteristics such as colour, form and function. To establish whether the devices found in the offerings are of the same temporality as the offering itself or, at least, to locate the devices within the two stages of the development of metallurgy, known as the period of native copper and the period of alloys; this helps to determine a relative chronology of when the objects were manufactured. To confirm the authenticity of the devices. To determine, precisely, the state of conservation of the pieces. To corroborate some of the manufacturing processes; this is achieved by means of the reproduction of objects in the laboratory, to establish comparisons and differences among pre

  15. Applied research on air pollution using nuclear-related analytical techniques

    International Nuclear Information System (INIS)

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which will run from 1992-1996, and will build upon the experience gained by the Agency from the laboratory support that it has been providing for several years to BAPMoN - the Background Air Pollution Monitoring Network programme organized under the auspices of the World Meteorological Organization. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF and PIXE for the analysis of toxic and other trace elements in suspended particulate matter (including air filter samples), rainwater and fog-water samples, and in biological indicators of air pollution (e.g. lichens and mosses). The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for practically-oriented research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries, with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural areas). This document reports the discussions held during the first Research Co-ordination Meeting (RCM) for the CRP, which took place at the IAEA Headquarters in Vienna. Refs, figs and tabs

  16. Operations research techniques applied to service center logistics in power distribution users

    Directory of Open Access Journals (Sweden)

    Maria Teresinha Arns Steiner

    2006-12-01

    Full Text Available This paper deals with the optimization of the logistics for services demanded by users of power distribution lines served by the Portão office, located in Curitiba, PR, Brazil, and operated by COPEL (Paranaense Power Company). Through the use of Operations Research techniques, an Integer Programming mathematical model and the Floyd algorithm, a method was defined to determine, in an optimized way, the number of teams needed by the selected office, as well as the optimized assignment of the teams to the sites in need, in order to offer efficient services to the users, with immediate execution of emergency services and, for the other services, execution according to parameters set by the National Power Agency together with COPEL. The methodology presented here is generic, so that it can be applied to any power network (or any of its lines), and it has produced very satisfactory results for the case under analysis.
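
    The Floyd algorithm mentioned above computes all-pairs shortest travel times over the service network, which is what feeds the team-assignment model. A compact sketch follows; the travel-time matrix is invented for illustration.

      import math

      def floyd_warshall(w):
          """All-pairs shortest path distances; w[i][j] is the direct travel
          time between network nodes (math.inf where no direct link exists)."""
          n = len(w)
          d = [row[:] for row in w]
          for k in range(n):
              for i in range(n):
                  for j in range(n):
                      if d[i][k] + d[k][j] < d[i][j]:
                          d[i][j] = d[i][k] + d[k][j]
          return d

      INF = math.inf
      travel = [[0, 4, INF, 11],      # illustrative travel times between sites
                [4, 0, 2, 7],
                [INF, 2, 0, 3],
                [11, 7, 3, 0]]
      print(floyd_warshall(travel))   # e.g. the optimal 0 -> 3 route costs 9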

  17. Colilert® applied to food analysis

    Directory of Open Access Journals (Sweden)

    Maria José Rodrigues

    2014-06-01

    Full Text Available Colilert® (IDEXX) was originally developed for the simultaneous enumeration of coliforms and E. coli in water samples and has been used for routine quality control of drinking, swimming-pool, fresh, coastal and waste waters (Grossi et al., 2013). The Colilert® culture medium contains the indicator nutrient 4-Methylumbelliferyl-β-D-Glucuronide (MUG). MUG acts as a substrate for the E. coli enzyme β-glucuronidase, from which a fluorescent compound is produced. A positive MUG result produces fluorescence when viewed under an ultraviolet lamp. If the test fluorescence is equal to or greater than that of the control, the presence of E. coli has been confirmed (Lopez-Roldan et al., 2013). The present work aimed to apply Colilert® to the enumeration of E. coli in different foods, by comparing results against the reference method for E. coli food analysis (ISO 16649-2, 2001). The study was divided into two stages. During the first stage, ten different types of food were analyzed with Colilert®; these included pastry, raw meat, ready-to-eat meals, yogurt, raw seabream and salmon, and cooked shrimp. Of these, the following were approved: pastry with custard; raw minced pork; "caldo-verde" soup; raw vegetable salad (lettuce and carrots); and solid yogurt. The approved foods presented a better insertion in the tray, the colour of the wells was lighter and the UV reading was easier. In the second stage, the foods were artificially contaminated with 2 log/g of E. coli (ATCC 25922) and analyzed. Colilert® proved to be an accurate method, and the counts were similar to the ones obtained with the reference method. In the present study, the Colilert® method revealed neither false-positive nor false-negative results; however, the results were sometimes difficult to read due to the presence of green fluorescence in some wells. Overall, Colilert® was an easy and rapid method, but less objective and more expensive than the reference method.

  18. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  19. Applying Analytical Derivative and Sparse Matrix Techniques to Large-Scale Process Optimization Problems

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The performance of analytical derivative and sparse matrix techniques applied to a traditional dense sequential quadratic programming (SQP) method is studied, and the strategy utilizing those techniques is also presented. Computational results on two typical chemical optimization problems demonstrate significant enhancement in efficiency, which shows this strategy is promising and suitable for large-scale process optimization problems.

  20. Applying Analytical Derivative and Sparse Matrix Techniques to Large-Scale Process Optimization Problems

    Institute of Scientific and Technical Information of China (English)

    钟卫涛; 邵之江; 张余岳; 钱积新

    2000-01-01

    The performance of analytical derivative and sparse matrix techniques applied to a traditional dense sequential quadratic programming (SQP) method is studied, and the strategy utilizing those techniques is also presented. Computational results on two typical chemical optimization problems demonstrate significant enhancement in efficiency, which shows this strategy is promising and suitable for large-scale process optimization problems.

  1. Applying BI Techniques To Improve Decision Making And Provide Knowledge Based Management

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2015-07-01

    Full Text Available The paper focuses on BI techniques, and especially on data mining algorithms, that can support and improve the decision-making process, with applications within the financial sector. We consider data mining techniques to be more efficient, and thus we applied several techniques, both supervised and unsupervised learning algorithms. The case study in which these algorithms have been implemented regards the activity of a banking institution, with focus on the management of lending activities.
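
    A small sketch of a supervised technique of the kind applied, scoring lending decisions with logistic regression on synthetic loan records. The features, the labeling rule and the model choice are our illustrative assumptions, not the paper's implementation.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      # Illustrative loan records: [income (k), debt ratio, years employed].
      rng = np.random.default_rng(10)
      X = rng.random((400, 3)) * [100, 1.0, 20]
      # Synthetic rule: high debt ratio plus low income raises default risk.
      y = ((X[:, 1] > 0.6) & (X[:, 0] < 50)).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
      model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
      print("hold-out accuracy:", model.score(X_te, y_te))
      print("default probability for a new applicant:",
            model.predict_proba([[45.0, 0.7, 3.0]])[0, 1])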

  2. Liver Ultrasound Image Analysis using Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Smriti Sahu, Maheedhar Dubey, Mohammad Imroze Khan

    2012-12-01

    Full Text Available Liver cancer is the sixth most common malignant tumour and the third most common cause of cancer-related deaths worldwide. Chronic liver damage affects up to 20% of our population. It has many causes - viral infections (Hepatitis B and C), toxins, genetic, metabolic and autoimmune diseases. The rate of liver cancer in Australia has increased four-fold in the past 20 years. For detection and qualitative diagnosis of liver diseases, ultrasound (US) imaging is an easy-to-use and minimally invasive imaging modality. Medical images are often deteriorated by noise due to various sources of interference and other phenomena known as speckle noise. It is therefore necessary to apply digital image processing techniques for smoothing or suppression of speckle noise in ultrasound images. This paper undertakes the study of three image enhancement techniques: the shock filter, Contrast Limited Adaptive Histogram Equalization (CLAHE) and spatial filtering. These smoothing techniques are compared using the performance metrics Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE). It has been observed that the spatial high-pass filter gives better performance than the others for liver ultrasound image analysis.
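
    A brief sketch of the CLAHE step together with the paper's two comparison metrics, MSE and PSNR. The image is simulated rather than a real ultrasound scan, and the clip limit and tile size are assumptions.

      import cv2
      import numpy as np

      # Simulated noisy, low-contrast image stands in for an ultrasound frame;
      # the noise model here is a crude placeholder for speckle.
      rng = np.random.default_rng(11)
      img = rng.normal(90, 15, (256, 256)).clip(0, 255).astype(np.uint8)

      clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
      enhanced = clahe.apply(img)

      # The comparison metrics used in the paper: MSE and PSNR.
      mse = np.mean((img.astype(float) - enhanced.astype(float)) ** 2)
      psnr = 10 * np.log10(255.0 ** 2 / mse) if mse > 0 else float("inf")
      print(f"MSE = {mse:.1f}, PSNR = {psnr:.1f} dB")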

  3. Window technique for climate trend analysis

    Science.gov (United States)

    Szentimrey, Tamás; Faragó, Tibor; Szalai, Sándor

    1992-01-01

    Climatic characteristics are affected by various systematic and occasional impacts: besides the changes in the observing system (locations of the stations of the meteorological network, instruments, observing procedures), the possible local-scale and global natural and anthropogenic impacts on climatic conditions should be taken into account. Apart from the predictability problems, the phenomenological analysis of the climatic variability and the determination of past persistent climatic anomalies are significant problems, among other aspects, as evidence of the possible anomalous behavior of climate or for climate impact studies. In this paper, a special technique for the identification of such “shifts” in the observational series is presented. The existence of these significant shorter or longer term changes in the mean characteristics for properly selected adjoining periods of time is the necessary condition for the formation of any more or less unidirectional climatic trends. Actually, the window technique is based on a complete set of orthogonal functions. The sensitivity of the proposed model to its main parameters is also investigated. This method is applied to hemispheric and Hungarian data series of the mean annual surface temperature.
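
    The test of adjoining periods lends itself to a compact illustration. The sketch below checks, with Welch's t-test over sliding windows, whether the means of two adjacent segments of a temperature series differ significantly; this shows only the underlying idea, not the authors' orthogonal-function formulation, and the series, window width and threshold are all invented:

```python
# Hedged sketch: detect mean "shifts" by comparing adjoining windows with
# Welch's t-test. Not the paper's orthogonal-function method.
import numpy as np
from scipy import stats

def window_shifts(series, width=15, alpha=0.01):
    """Return indices where adjoining windows have significantly different means."""
    shifts = []
    for i in range(width, len(series) - width):
        left, right = series[i - width:i], series[i:i + width]
        _, p = stats.ttest_ind(left, right, equal_var=False)
        if p < alpha:
            shifts.append(i)
    return shifts

rng = np.random.default_rng(0)
# Synthetic annual-mean temperatures with a 0.8-degree shift at index 60.
temp = np.concatenate([rng.normal(9.8, 0.5, 60), rng.normal(10.6, 0.5, 60)])
print(window_shifts(temp))  # indices cluster around the true shift at 60
```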

  4. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses about the analysis of different Mexican and Mesoamerican archaeological sites are also referred to. (Author)

  5. Functional analysis in modern applied mathematics

    CERN Document Server

    Curtain, Ruth F

    1977-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat

  6. Micro analysis of dissolved gases by the gas chromatography technique

    International Nuclear Information System (INIS)

    A technique which allows the quantitative analysis of small concentrations of dissolved gases such as CO2 and H2, in the order of 10^-6 to 10^-3 M, is discussed. For the extraction, separation and quantification, a Toepler pump was used, coupled in tandem with a gas chromatograph. This method can also be applied to the analysis of other gases like CO, CH4, CH3-CH3, etc. This technique may be applied in fields such as radiation chemistry, oceanography and environmental studies. (author)

  7. Applied modal analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Pedersen, H.B.; Kristensen, O.J.D.

    2003-01-01

    are investigated and the most suitable are chosen. Different excitation techniques are tried during experimental campaigns. After a discussion, the pendulum hammer was chosen, and a new improved hammer was manufactured. Some measurement errors are investigated. The ability to repeat the measured results...

  8. Introduction: Conversation Analysis in Applied Linguistics

    Science.gov (United States)

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  9. Research on key techniques of virtual reality applied in mining industry

    Institute of Scientific and Technical Information of China (English)

    LIAO Jun; LU Guo-bin

    2009-01-01

    Based on the applications of virtual reality technology in many fields, this paper introduces the basic concepts, structure types and related technical developments of virtual reality, summarizes the applications of virtual reality techniques in the present mining industry, and inquires into the core software and hardware techniques, especially the optimization of the setup of various 3D modelling techniques, carrying out a real-time virtual scene walk-through by means of stereoscopic display techniques. It then puts forward a software and hardware solution of virtual reality for the mining industry that can satisfy demands at different aspects and levels. Finally, it shows a fine prospect for virtual reality techniques applied in the mining industry.

  10. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  11. Comparison of Commonly Used Accident Analysis Techniques for Manufacturing Industries

    Directory of Open Access Journals (Sweden)

    IRAJ MOHAMMADFAM

    2015-10-01

    Full Text Available The adverse consequences of major accident events have led to the development of accident analysis techniques to investigate accidents thoroughly. However, each technique has its own advantages and shortcomings, which makes it very difficult to find a single technique capable of analyzing all types of accidents. Therefore, comparing accident analysis techniques helps to find out their capabilities in different circumstances and to choose the most suitable one. In this research, the techniques CBA and AABF were compared with Tripod β in order to determine the superior technique for the analysis of major accidents in manufacturing industries. In the first step, the comparison criteria were developed using the Delphi method. Afterwards, the relative importance of each criterion was qualitatively determined, and the qualitative values were then converted to quantitative values by applying fuzzy triangular numbers. Finally, TOPSIS was used to prioritize the techniques in terms of the preset criteria. The results of the study showed that Tripod β is superior to CBA and AABF. It is highly recommended to compare all available accident analysis techniques based on proper criteria in order to select the best one, since an improper choice of accident analysis technique may lead to misguided results.
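
    Since the record combines fuzzy-derived weights with TOPSIS, a compact TOPSIS sketch may help; the decision matrix, weights and criterion directions below are fabricated for illustration and are not the study's data:

```python
# Minimal TOPSIS sketch for ranking techniques against weighted criteria.
import numpy as np

def topsis(matrix, weights, benefit):
    m = matrix / np.linalg.norm(matrix, axis=0)    # vector-normalize each criterion
    v = m * weights                                # weighted normalized matrix
    ideal = np.where(benefit, v.max(0), v.min(0))  # positive-ideal solution
    nadir = np.where(benefit, v.min(0), v.max(0))  # negative-ideal solution
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - nadir, axis=1)
    return d_neg / (d_pos + d_neg)                 # closeness coefficient in [0, 1]

techniques = ["Tripod beta", "CBA", "AABF"]
matrix = np.array([[7.0, 8.0, 6.5],                # rows: techniques, cols: criteria
                   [5.5, 6.0, 7.0],
                   [5.0, 5.5, 6.0]])
weights = np.array([0.5, 0.3, 0.2])                # e.g. defuzzified criterion weights
scores = topsis(matrix, weights, benefit=np.array([True, True, True]))
for name, score in sorted(zip(techniques, scores), key=lambda p: -p[1]):
    print(f"{name}: {score:.3f}")
```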

  12. Spatial analysis methodology applied to rural electrification

    Energy Technology Data Exchange (ETDEWEB)

    Amador, J. [Department of Electric Engineering, EUTI, UPM, Ronda de Valencia, E-28012 Madrid (Spain); Dominguez, J. [Renewable Energies Division, CIEMAT, Av. Complutense 22, E-28040 Madrid (Spain)

    2006-08-15

    The use of geographical information systems (GISs) in studies of regional integration of renewable energies provides advantages such as speed, amount of information and analysis capacity, among others. However, these characteristics make it difficult to link the results to the initial variables, and therefore to validate the GIS. This makes it hard to ascertain the reliability of both the results and their subsequent analysis. To solve these problems, a GIS-based method with renewable energies for rural electrification is proposed, structured in three stages, with the aim of finding out the influence of the initial variables on the result. In the first stage, a classic sensitivity analysis of the equivalent electrification cost (LEC) is performed; the second stage involves a spatial sensitivity analysis and the third determines the stability of the results. This methodology has been verified in the application of a GIS in Lorca (Spain). (author)

  13. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    In this thesis we explore the use of functional data analysis as a method to analyse chemometric data, more specifically spectral data in metabolomics. Functional data analysis is a vibrant field in statistics. It has been rapidly expanding in both methodology and applications since it was made well known by Ramsay & Silverman's monograph in 1997. In functional data analysis, the data are curves instead of data points. Each curve is measured at discrete points along a continuum, for example, time or frequency. It is assumed that the underlying process generating the curves is smooth. The thesis aims to bridge the worlds of statistics and chemometrics. We want to provide a glimpse of the essential and complex data pre-processing that is well known to chemometricians, but is generally unknown to statisticians. Pre-processing can potentially have a strong influence on the results of subsequent data analysis.

  14. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  15. Applied bioinformatics: Genome annotation and transcriptome analysis

    DEFF Research Database (Denmark)

    Gupta, Vikas

    Next generation sequencing (NGS) has revolutionized the field of genomics, and its wide range of applications has resulted in the genome-wide analysis of hundreds of species and the development of thousands of computational tools. This thesis represents my work on NGS analysis of four species: Lotus japonicus (Lotus), Vaccinium corymbosum (blueberry), Stegodyphus mimosarum (spider) and Trifolium occidentale (clover). From a bioinformatics data analysis perspective, my work can be divided into three parts: genome annotation, small RNA analysis, and gene expression analysis. Lotus is a legume of significant agricultural and biological importance. Its capacity to form symbiotic relationships with rhizobia and mycorrhizal fungi has fascinated researchers for years. Lotus has a small genome of approximately 470 Mb and a short life cycle of 2 to 3 months, which has made Lotus a model legume plant for many molecular...

  16. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    Energy Technology Data Exchange (ETDEWEB)

    Palit, Mousumi [Department of Electronics and Telecommunication Engineering, Central Calcutta Polytechnic, Kolkata 700014 (India); Tudu, Bipan, E-mail: bt@iee.jusl.ac.in [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Bhattacharyya, Nabarun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Dutta, Ankur; Dutta, Pallab Kumar [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Jana, Arun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Bandyopadhyay, Rajib [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Chatterjee, Anutosh [Department of Electronics and Communication Engineering, Heritage Institute of Technology, Kolkata 700107 (India)

    2010-08-18

    In an electronic tongue, preprocessing on raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; and then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.
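
    The comparison logic (preprocess, project with PCA, measure separability) can be sketched briefly; the data below are synthetic stand-ins for voltammetric signals, and the crude separability score is an invented placeholder for the paper's quantitative measure:

```python
# Hedged sketch: compare preprocessing schemes by class separability in PCA space.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(g, 1.0, (20, 50)) for g in (0.0, 0.7, 1.4)])  # 3 tea grades
y = np.repeat([0, 1, 2], 20)

for name, Xp in [("raw", X), ("autoscaled", StandardScaler().fit_transform(X))]:
    scores = PCA(n_components=2).fit_transform(Xp)
    centroids = np.array([scores[y == g].mean(axis=0) for g in range(3)])
    spread = np.mean([scores[y == g].std() for g in range(3)])
    sep = np.linalg.norm(centroids[None] - centroids[:, None], axis=-1).max() / spread
    print(f"{name}: separability ~ {sep:.2f}")   # larger means better-separated grades
```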

  17. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea.

    Science.gov (United States)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-08-18

    In an electronic tongue, preprocessing on raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; and then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  18. Nuclear and conventional techniques applied to the analysis of Purhepecha metals of the Pareyon collection; Tecnicas nucleares y convencionales aplicadas al analisis de metales Purhepecha de la coleccion Pareyon

    Energy Technology Data Exchange (ETDEWEB)

    Mendez, U.; Tenorio C, D. [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico); Ruvalcaba, J.L. [IFUNAM, 04510 Mexico D.F. (Mexico); Lopez, J.A. [Instituto Nacional de Antropologia e Historia, 11000 Mexico D.F. (Mexico)

    2005-07-01

    The main objective of this investigation was to determine, by means of the nuclear techniques PIXE and RBS as well as conventional techniques, the composition and microstructure of 13 metallic devices elaborated from copper and gold, which formed part of the offering of a Tarascan personage located in the 'Matamoros' porch in Uruapan, Michoacan, Mexico. (Author)

  19. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analyses, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  20. Proof Analysis: A Technique for Concept Formation

    OpenAIRE

    Bundy, Alan

    1985-01-01

    We report the discovery of an unexpected connection between the invention of the concept of uniform convergence and the occurs check in the unification algorithm. This discovery suggests the invention of further interesting concepts in analysis and a technique for automated concept formation. Part of this technique has been implemented. The discovery arose as part of an attempt to understand the role of proof analysis in mathematical reasoning, so as to incorporate it into a computer program. ...

  1. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  2. Applying centrality measures to impact analysis: A coauthorship network analysis

    CERN Document Server

    Yan, Erjia

    2010-01-01

    Many studies on coauthorship networks focus on network topology and network statistical mechanics. This article takes a different approach by studying micro-level network properties, with the aim of applying centrality measures to impact analysis. Using coauthorship data from 16 journals in the field of library and information science (LIS) with a time span of twenty years (1988-2007), we construct an evolving coauthorship network and calculate four centrality measures (closeness, betweenness, degree and PageRank) for authors in this network. We find that the four centrality measures are significantly correlated with citation counts. We also discuss the usability of centrality measures in author ranking, and suggest that centrality measures can be useful indicators for impact analysis.
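
    The workflow (build the coauthorship graph, compute the four centralities, correlate with citations) is easy to sketch; the toy graph and citation counts below are fabricated and are not the LIS data set:

```python
# Hedged sketch of centrality-vs-citation analysis on a tiny invented graph.
import networkx as nx
from scipy.stats import spearmanr

G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E"), ("E", "A")])
citations = {"A": 120, "B": 40, "C": 95, "D": 30, "E": 22}   # invented counts

measures = {
    "degree": nx.degree_centrality(G),
    "closeness": nx.closeness_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "pagerank": nx.pagerank(G),
}
authors = sorted(G)
for name, cent in measures.items():
    rho, p = spearmanr([cent[a] for a in authors], [citations[a] for a in authors])
    print(f"{name}: Spearman rho = {rho:.2f} (p = {p:.2f})")
```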

  3. Development of Promising Insulating Oil and Applied Techniques of EHD, ER·MR

    Science.gov (United States)

    Hanaoka, Ryoichi

    The development of an environment-friendly insulating liquid has attracted attention for the new design of oil-filled power apparatus, such as transformers, from the viewpoint of environmental protection. Dielectric liquids can also be widely applied to various fields concerned with electromagnetic fields. This article introduces the recent trend in promising new vegetable-based oils as electrical insulation, and EHD pumping, ER fluids and MR fluids as applied techniques of dielectric liquids.

  4. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun

    2000-05-01

    This report presents the results of research and development as follows: improvement of neutron irradiation facilities, counting systems and the development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health and its standardization. For identification and standardization of the analytical method, environmental, biological and polymer samples are analyzed and the uncertainty of measurement is estimated. Also, data intercomparison and proficiency tests were performed. Using airborne particulate matter chosen as an environmental indicator, trace elemental concentrations of samples collected at urban and rural sites are determined, and then statistical calculations and factor analysis are carried out to investigate emission sources. An international cooperation research project was carried out for the utilization of nuclear techniques.

  5. Nuclear techniques (PIXE and RBS) applied to analysis of pre hispanic metals of the Templo Mayor at Tenochtitlan; Tecnicas nucleares (PIXE y RBS) aplicadas al analisis de metales prehispanicos del Templo Mayor de Tenochtitlan

    Energy Technology Data Exchange (ETDEWEB)

    Mendez U, I.; Tenorio, D.; Galvan, J.L. [Instituto Nacional de Investigaciones Nucleares, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2000-07-01

    This work has the objective of determining, by means of the nuclear techniques PIXE and RBS, the composition and the alloy type of diverse Aztec ornaments corresponding to the Postclassic period, such as bells, beads and disks, manufactured principally with copper and gold, all belonging to 9 oblations of the Templo Mayor of Tenochtitlan. The historical and archaeological antecedents of the devices are briefly presented, as well as the analytical methods, concluding with the results obtained. (Author)

  6. Strategies and techniques of communication and public relations applied to non-profit sector

    Directory of Open Access Journals (Sweden)

    Ioana-Julieta Josan

    2010-05-01

    Full Text Available The aim of this paper is to summarize the strategies and techniques of communication and public relations applied to the non-profit sector. The approach of the paper is to identify the most appropriate strategies and techniques that the non-profit sector can use to accomplish its objectives, to highlight specific differences between the strategies and techniques of the profit and non-profit sectors, and to identify potential communication and public relations actions in order to increase visibility among the target audience, create brand awareness and turn the target audience's perception of the non-profit sector into positive brand sentiment.

  7. Improvement of vibration and noise by applying analysis technology. Development of active control technique of engine noise in a car cabin. Kaiseki gijutsu wo oyoshita shindo-soon no kaizen. Shashitsunai engine soon akutibu seigyo gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    Uchida, H.; Nakao, N.; Butsuen, T. (Matsuda Motor Corp., Hiroshima (Japan). Technology Research Inst.)

    1994-06-01

    It is difficult to reduce engine noise, the principal noise in a car cabin, without producing an adverse effect on low-cost production. An active noise control (ANC) technique has been developed to reduce engine noise in a way compatible with low-cost production. This paper discusses its control algorithm and system configuration and presents experimental results. The filtered-x least mean square method is a well-known ANC algorithm; however, it often requires an amount of calculation exceeding the present capacity of a digital signal processor. An effective ANC algorithm is developed by making use of the repetitiveness of the engine noise. This paper describes the basic theory of the control algorithm, its extension to a multiple-input multiple-output system, the system configuration and experimental results. A noise control system with three microphones is designed with consideration of the spatial distribution of the noise, and it reduces noise in the whole cabin by 8 dB(A) in the largest case. The active noise control technique is applicable to many areas and can be used for the reduction of noise and vibration other than engine noise. 5 refs., 7 figs., 1 tab.
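
    For orientation, a generic filtered-x LMS loop is sketched below on a synthetic engine-order sinusoid. It is not the paper's repetitiveness-exploiting algorithm; the secondary-path model, filter length and step size are all assumptions made for illustration:

```python
# Hedged FxLMS sketch: adapt FIR weights so anti-noise cancels a tone at the
# error microphone. The secondary path s is assumed known (2 invented taps).
import numpy as np

fs, n = 2000, 4000
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 80 * t)               # reference tied to engine order
d = 0.8 * np.sin(2 * np.pi * 80 * t + 0.6)   # primary noise at the error mic
s = np.array([0.7, 0.3])                     # assumed secondary-path response

L, mu = 16, 0.02                             # filter length, step size
w, xf = np.zeros(L), np.zeros(n)
y_buf, err = np.zeros(len(s)), np.zeros(n)
for k in range(n):
    seg = x[max(0, k - L + 1):k + 1][::-1]
    xk = np.pad(seg, (0, L - len(seg)))      # zero-pad the earliest samples
    y = w @ xk                               # controller output (anti-noise source)
    y_buf = np.roll(y_buf, 1); y_buf[0] = y
    err[k] = d[k] + s @ y_buf                # residual after the secondary path
    xf[k] = s[0] * x[k] + (s[1] * x[k - 1] if k > 0 else 0.0)  # filtered-x sample
    seg_f = xf[max(0, k - L + 1):k + 1][::-1]
    w -= mu * err[k] * np.pad(seg_f, (0, L - len(seg_f)))      # LMS weight update
print("error RMS first/last 500 samples:",
      round(float(np.sqrt(np.mean(err[:500] ** 2))), 3),
      round(float(np.sqrt(np.mean(err[-500:] ** 2))), 3))
```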

  8. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this prior knowledge, the reader will coast through the practical introduction and move on to signal analysis techniques, commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical

  9. Zero order and signal processing spectrophotometric techniques applied for resolving interference of metronidazole with ciprofloxacin in their pharmaceutical dosage form.

    Science.gov (United States)

    Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed

    2016-02-01

    Four rapid, simple, accurate and precise spectrophotometric methods were used for the determination of ciprofloxacin in the presence of metronidazole as an interference. The methods under study are area under the curve and simultaneous equation, in addition to the smart signal processing techniques of manipulating ratio spectra, namely Savitzky-Golay filters and continuous wavelet transform. All the methods were validated according to the ICH guidelines, where accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of ciprofloxacin in quality-control laboratories.
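
    One of the named steps, a Savitzky-Golay derivative applied to a ratio spectrum, can be sketched as follows; the Gaussian "spectra" and every wavelength and parameter here are synthetic, not the drugs' real absorbance data:

```python
# Hedged sketch: ratio spectrum followed by a Savitzky-Golay first derivative.
import numpy as np
from scipy.signal import savgol_filter

wl = np.linspace(220, 320, 500)                      # wavelength grid, nm
gauss = lambda c, w: np.exp(-((wl - c) / w) ** 2)
cipro, metro = gauss(275, 12), gauss(310, 15)        # invented pure-component spectra
mixture = 0.8 * cipro + 0.5 * metro                  # invented mixture spectrum

ratio = mixture / (metro + 1e-9)                     # divide by interferent spectrum
deriv = savgol_filter(ratio, window_length=21, polyorder=3, deriv=1)
print("derivative ratio spectrum peaks near",
      round(float(wl[np.argmax(np.abs(deriv))]), 1), "nm")
```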

  11. Improving operating room efficiency by applying bin-packing and portfolio techniques to surgical case scheduling

    NARCIS (Netherlands)

    Houdenhoven, van M.; Oostrum, van J.M.; Hans, E.W.; Wullink, G.; Kazemier, G.

    2013-01-01

    BACKGROUND: An operating room (OR) department has adopted an efficient business model and subsequently investigated how efficiency could be further improved. The aim of this study is to show the efficiency improvement obtained by lowering organizational barriers and applying advanced mathematical techniques.

  12. Applying Modern Techniques and Carrying Out English Extracurricular Activities: On the Model United Nations Activity

    Institute of Scientific and Technical Information of China (English)

    Xu Xiaoyu; Wang Jian

    2004-01-01

    This paper is an introduction to the extracurricular activity of the Model United Nations in Northwestern Polytechnical University (NPU), and it focuses on the application of modern techniques in the activity and the pedagogical theories applied in it. An interview and questionnaire research will reveal the influence of the Model United Nations.

  13. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    Science.gov (United States)

    Motley, Albert E., III

    2000-01-01

    One of the key elements to successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  14. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  15. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, the subjective probability distributions assigned to the input parameters, and the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
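
    The recommended Kruskal-Wallis screening reduces to binning each sampled parameter and testing whether dose differs across the bins; the toy model and parameter names below are invented, and SYVAC itself is not used:

```python
# Hedged sketch of Kruskal-Wallis screening for parameter sensitivity.
import numpy as np
from scipy.stats import kruskal

rng = np.random.default_rng(2)
n = 500
params = {"leach_rate": rng.lognormal(0.0, 1.0, n),
          "path_length": rng.uniform(1.0, 100.0, n)}
dose = 1e-3 * params["leach_rate"] / params["path_length"] * rng.lognormal(0.0, 0.1, n)

for name, x in params.items():
    edges = np.quantile(x, [0.25, 0.5, 0.75])           # quartile bin edges
    groups = [dose[np.digitize(x, edges) == g] for g in range(4)]
    h, p = kruskal(*groups)
    print(f"{name}: H = {h:.1f}, p = {p:.3g}")          # small p -> dose is sensitive
```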

  16. Analysis of Breakthrough Techniques Applied by the Chinese Men's Basketball Team in Three World Championship Games

    Institute of Scientific and Technical Information of China (English)

    常妙; 佘涛

    2012-01-01

    Using literature review, observation and mathematical statistics, and comparing the Chinese national men's basketball team with three top-ranked teams (Senegal, Slovenia and Puerto Rico), the use of individual breakthrough techniques is analyzed in detail. Specifically, the paper analyzes how players complete breakthroughs such as variable-speed dribbling, body screens, pick-and-rolls and fake actions, in relation to court observation, purpose, breakthrough speed, action consistency, physical strength, timing, space, and dribbling distance. It is concluded that individual breakthrough technique plays a very important role in a game. The analysis of the national team's individual breakthrough ability in these games reveals several shortcomings, with the aim of improvement and of providing a training basis for the national men's basketball team.

  17. How to Apply Student-centered Teaching Techniques in a Large Class

    Institute of Scientific and Technical Information of China (English)

    李焱

    2008-01-01

    It is very common to have a class of 50 or more students in Chinese schools, and teaching a foreign language effectively to a large class is really hard work. In order to change the teacher-centered teaching model into a student-centered one, teachers should keep students' needs, interests, and learning styles in mind, apply several kinds of teaching techniques, organize different classroom activities, and encourage, praise and appreciate both students' success and the learning process all the time. If teachers place more responsibility in the hands of students, serving as a "presenter or facilitator of knowledge" instead of the "source of all knowledge", they can greatly motivate students to learn the language in a very active, cooperative and effective way. After all, people learn by doing, not only by watching and listening.

  18. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  19. Hands on applied finite element analysis application with ANSYS

    CERN Document Server

    Arslan, Mehmet Ali

    2015-01-01

    Hands on Applied Finite Element Analysis Application with ANSYS is truly an extraordinary book that offers practical ways of tackling FEA problems in machine design and analysis. In this book, a good selection of 35 example problems is presented, offering students the opportunity to apply their knowledge to real engineering FEA problem solutions by guiding them with real-life hands-on experience.

  20. Applied research of environmental monitoring using instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju

    1997-08-01

    This technical report is written as a guide book for applied research in environmental monitoring using Instrumental Neutron Activation Analysis. The contents are as follows: sampling and sample preparation of airborne particulate matter, analytical methodologies, data evaluation and interpretation, and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.

  1. Analysis and comparation of animation techniques

    OpenAIRE

    Joštová, Barbora

    2015-01-01

    This thesis is focused on the analysis and comparison of animation techniques. In the theoretical part of the thesis I define key terms, the historical development and the basic principles of animation techniques. In the practical part I describe the comparison between classic and digital types of animation. Based on this research I chose the most suitable animations, which are further used to verify my hypothesis. The proposed hypothesis is an ordering of the techniques based on how demanding each is in terms of...

  2. Applied Statistics: From Bivariate through Multivariate Techniques [with CD-ROM

    Science.gov (United States)

    Warner, Rebecca M.

    2007-01-01

    This book provides a clear introduction to widely used topics in bivariate and multivariate statistics, including multiple regression, discriminant analysis, MANOVA, factor analysis, and binary logistic regression. The approach is applied and does not require formal mathematics; equations are accompanied by verbal explanations. Students are asked…

  3. Hybrid chemical and nondestructive analysis technique

    International Nuclear Information System (INIS)

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  4. Hybrid chemical and nondestructive-analysis technique

    Energy Technology Data Exchange (ETDEWEB)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities.

  5. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the recent decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry, coulometry, etc., have made significant contributions to the analysis of minerals such as clay, sulfide, oxide, and oxysalt. The discussion is organized according to both the type of EC technique used and the kind of mineral analyzed. Furthermore, minerals as electrode modification materials for EC analysis have also been summarized. Accordingly, research vacancies and future development trends in these areas are discussed.

  6. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

    Directory of Open Access Journals (Sweden)

    L Potgieter

    2012-12-01

    Full Text Available A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of and interaction between normal and sterile E. saccharina moths in a temporally variable, but spatially homogeneous environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified and according to which E. saccharina infestation levels and the associated sugarcane damage may be measured. Although many models have been formulated in the past describing the sterile insect technique, few of these models describe the technique for Lepidopteran species with more than one life stage and where F1-sterility is relevant. In addition, none of these models consider the technique when fully sterile females and partially sterile males are being released. The model formulated is also the first to describe the technique applied specifically to E. saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
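
    The flavour of such difference equations can be conveyed in a few lines; the sketch below is a generic sterile-release model with invented parameters, far simpler than the paper's age-structured, partially sterile formulation:

```python
# A minimal deterministic sketch in the spirit of the paper's difference
# equations: fertile recruitment is damped by the fraction of matings that
# involve a fertile male. All parameter values here are invented.
R0, K = 3.0, 10_000.0        # per-generation growth rate, carrying capacity
S = 15_000.0                 # sterile males released each generation
N = 2_000.0                  # initial fertile moth population
for gen in range(12):
    fertile_mating = N / (N + S)                      # chance a mating is fertile
    N = R0 * N * fertile_mating * max(0.0, 1.0 - N / K)
    print(f"gen {gen + 1:2d}: fertile moths ~ {N:8.1f}")
```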

  7. Renormalization techniques applied to the study of density of states in disordered systems

    International Nuclear Information System (INIS)

    A general scheme for real space renormalization of formal scattering theory is presented and applied to the calculation of density of states (DOS) in some finite width systems. This technique is extended in a self-consistent way, to the treatment of disordered and partially ordered chains. Numerical results of moments and DOS are presented in comparison with previous calculations. In addition, a self-consistent theory for the magnetic order problem in a Hubbard chain is derived and a parametric transition is observed. Properties of localization of the electronic states in disordered chains are studied through various decimation averaging techniques and using numerical simulations. (author)

  8. Phase-shifting technique applied to circular harmonic-based joint transform correlator

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The phase-shifting technique is applied to the circular harmonic expansion-based joint transform correlator. Computer simulation has shown that the light efficiency and the discrimination capability are greatly enhanced, and the full rotation invariance is preserved after the phase-shifting technique has been used. A rotation-invariant optical pattern recognition with high discrimination capability and high light efficiency is obtained. The influence of the additive noise on the performance of the correlator is also investigated. However, the anti-noise capability of this kind of correlator still needs improving.

  9. Analysis of Different Techniques Applied in the Dental Caries Treatment of Deciduous Teeth

    Institute of Scientific and Technical Information of China (English)

    胡静; 陈增力; 刘继延; 王丽英; 赵文峰

    2015-01-01

    Objective: To compare the effects of conventional mechanical caries removal, atraumatic restorative treatment (ART) and Carisolv chemical caries removal applied in the treatment of dental caries in primary teeth. Methods: 96 pediatric patients diagnosed with mild/moderate dental caries in deciduous teeth from January 2011 to June 2012 were randomly divided into a conventional mechanical caries removal group, an atraumatic restorative treatment (ART) group and a Carisolv chemical caries removal group, with 60 teeth each. Operative reaction, operation time and long-term efficacy were recorded and analyzed. Results: The efficacy of caries removal in the conventional mechanical group and the Carisolv group was significantly better than in the ART group (P<0.05). There was no significant difference in the operation time among the treatment groups (P>0.05). The number of cases of intraoperative pain in the ART group and the Carisolv group was lower than in the conventional mechanical group (P<0.05). At one year post-treatment, the incidence of secondary caries in the conventional mechanical group and the Carisolv group was significantly lower than in the ART group (P<0.05). However, there was no significant difference in the incidence of filling fracture/loss among the treatment groups (P>0.05). Conclusions: Carisolv chemical caries removal can effectively relieve intraoperative pain and reduce the recurrence rate after operation, and it is worthy of promotion and application in the clinical treatment of deciduous teeth. The caries removal effect and long-term efficacy of ART may limit its widespread application.

  10. Multivariate Statistical Analysis Applied in Wine Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Jieling Zou

    2015-08-01

    Full Text Available This study applies multivariate statistical approaches to wine quality evaluation. With 27 red wine samples, four factors were identified out of 12 parameters by principal component analysis, explaining 89.06% of the total variance of the data. As iterative weights calculated by the BP neural network revealed little difference from weights determined by the information entropy method, the latter was chosen to measure the importance of indicators. Weighted cluster analysis performs well in classifying the sample group further into two sub-clusters. The second cluster of red wine samples, compared with the first, was lighter in color, tasted thinner and had a fainter bouquet. The weighted TOPSIS method was used to evaluate the quality of wine in each sub-cluster. With the scores obtained, each sub-cluster was divided into three grades. On the whole, the quality of the lighter red wine was slightly better than that of the darker category. This study shows the necessity and usefulness of multivariate statistical techniques in both wine quality evaluation and parameter selection.

  11. Phase-ratio technique as applied to the assessment of lunar surface roughness

    Science.gov (United States)

    Kaydash, Vadym; Videen, Gorden; Shkuratov, Yuriy

    Regoliths of atmosphereless celestial bodies demonstrate prominent light backscattering that is common for particulate surfaces. This occurs over a wide range of phase angles and can be seen in the phase function [1]. The slope of the function may characterize the complexity of planetary surface structure. Imagery of such a parameter suggests that information can be obtained about the surface, like variations of unresolved surface roughness and microtopography [2]. Phase-ratio imagery allows one to characterize the phase function slope. This imagery requires the ratio of two co-registered images acquired at different phase angles. One important advantage of the procedure is that the inherent albedo variations of the surface are suppressed, and, therefore, the resulting image is sensitive to the surface structure variation [2,3]. The phase-ratio image characterizes surface roughness variation at spatial scales on the order of the incident wavelengths to that of the image resolution. Applying the phase-ratio technique to ground-based telescope data has allowed us to find new lunar surface formations in the southern part of Oceanus Procellarum. These are suggested to be weak swirls [4]. We also combined the phase-ratio technique with the space-derived photometry data acquired from the NASA Lunar Reconnaissance Orbiter with high spatial resolution. Thus we exploited the method to analyze the sites of Apollo landings and Soviet sample-return missions. Phase-ratio imagery has revealed anomalies of the phase-curve slope indicating a smoothing of the surface microstructure at the sites caused by dust uplifted by the engine jets of the descent and ascent modules [5,6]. Analysis of phase-ratios helps to understand how the regolith properties have been affected by robotic and human activity on the Moon [7,8]. We have demonstrated the use of the method to search for fresh natural disturbances of surface structure, e.g., to detect areas of fresh slumps, accumulated material on
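
    The core operation, dividing two co-registered images so that albedo cancels, is simple to sketch; the arrays below are synthetic stand-ins for calibrated frames at two phase angles:

```python
# Hedged sketch of a phase-ratio image on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
albedo = rng.uniform(0.05, 0.25, (128, 128))    # intrinsic reflectance map
slope = np.full((128, 128), 0.95)               # phase-curve slope factor
slope[40:60, 40:60] = 0.85                      # a "smoothed" patch, e.g. a landing site

img_low = albedo                                # image at the smaller phase angle
img_high = albedo * slope                       # image at the larger phase angle
ratio = img_high / img_low                      # albedo divides out, slope remains
print("ratio inside patch:", ratio[50, 50], "outside:", ratio[0, 0])
```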

  12. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical tec

  13. Applying static code analysis to firewall policies for the purpose of anomaly detection

    OpenAIRE

    Zaliva, Vadim

    2011-01-01

    Treating modern firewall policy languages as imperative, special purpose programming languages, in this article we will try to apply static code analysis techniques for the purpose of anomaly detection. We will first abstract a policy in common firewall policy language into an intermediate language, and then we will try to apply anomaly detection algorithms to it. The contributions made by this work are: 1. An analysis of various control flow instructions in popular firewall policy languages ...

  14. Gold analysis by the gamma absorption technique.

    Science.gov (United States)

    Kurtoglu, Arzu; Tugrul, A Beril

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement. PMID:12485656
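
    The measurement principle is the exponential attenuation law, so a worked example fits in a few lines; the counts, density and thickness below are illustrative numbers, not the paper's data:

```python
# Hedged sketch: recover a mass attenuation coefficient from transmission,
# using I = I0 * exp(-mu_m * rho * x).
import numpy as np

I0, I = 10_000.0, 2_500.0     # incident and transmitted gamma counts (invented)
rho, x = 15.0, 0.1            # alloy density (g/cm^3) and thickness (cm), invented
mu_m = np.log(I0 / I) / (rho * x)
print(f"mass attenuation coefficient ~ {mu_m:.3f} cm^2/g")
```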

  15. Prediction of Quality Features in Iberian Ham by Applying Data Mining on Data From MRI and Computer Vision Techniques

    Directory of Open Access Journals (Sweden)

    Daniel Caballero

    2014-03-01

    Full Text Available This paper aims to predict quality features of Iberian hams by using non-destructive methods of analysis and data mining. Iberian hams were analyzed by Magnetic Resonance Imaging (MRI) and Computer Vision Techniques (CVT) throughout their ripening process, and physico-chemical parameters were also measured. The obtained data were used to create an initial database. Deductive techniques of data mining (multiple linear regression) were used to estimate new data, allowing the insertion of new records in the database. Predictive techniques of data mining (multiple linear regression) were applied to the MRI-CVT data, achieving prediction equations for weight, moisture and lipid content. Finally, data from the prediction equations were compared to data determined by physico-chemical analysis, obtaining high correlation coefficients in most cases. Therefore, data mining, MRI and CVT are suitable tools to estimate quality traits of Iberian hams. This would improve the control of ham processing in a non-destructive way.

  16. Complementary testing techniques applied to obtain the freeze-thaw resistance of concrete

    Directory of Open Access Journals (Sweden)

    Romero, H. L.

    2015-03-01

    Full Text Available Most of the standards that evaluate the resistance of concrete against freeze-thaw cycles (FTC are based on the loss of weight due to scaling. Such procedures are useful but do not provide information about the microstructural deterioration of the concrete. The test procedure needs to be stopped after several FTCs for weighing the loss of material by scaling. This paper proposes the use of mercury-intrusion-porosimetry and thermogravimetric analysis for assessing the microstructural damage of concrete during FTCs. Continuous strain measurement can be performed without stopping the FTCs. The combination of the above techniques with the freeze-thaw resistance standards provides better and more precise information about concrete damage. The proposed procedure is applied to an ordinary concrete, a concrete with silica fume addition and one with an air-entraining agent. The test results showed that the three techniques used are suitable and useful to be employed as complementary to the standards.Las normas para evaluar la resistencia del hormigón a los ciclos hielo-deshielo (CHD se basan habitualmente en la pérdida de peso por descascarillamiento. Son útiles, pero no proporcionan información sobre el deterioro microestructural del hormigón. Además, exigen detener el ensayo para pesar el material desprendido. Se propone el uso complementario de la porosimetría por intrusión de mercurio y el análisis termogravimétrico para evaluar el daño microestructural del hormigón durante los CHDs. La medida continua de las deformaciones puede hacerse sin detener los CHDs. La combinación de las técnicas enumeradas con las normas de ensayo proporciona información más completa sobre el daño del hormigón. El procedimiento propuesto se aplica a un hormigón convencional, a un hormigón con adición de humo de sílice y a otro con aireante. Los resultados de los ensayos mostraron que las tres técnicas usadas son útiles y adecuadas como complemento a

  17. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is applied. The DIS calculation is used for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about the possibility of region segmentation for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
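
    Two of the pipeline's stages, an edge-strength map followed by marker-based watershed, can be sketched on a synthetic image; the MRF relaxation stage is omitted here, and all thresholds are invented:

```python
# Hedged sketch: Sobel edge strength plus marker-based watershed segmentation.
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0                                  # one bright square region
img += np.random.default_rng(4).normal(0.0, 0.05, img.shape)

gradient = sobel(img)                                    # analogue of the DIS map
markers = np.zeros_like(img, dtype=int)
markers[img < 0.3] = 1                                   # background seeds
markers[img > 0.7] = 2                                   # region seeds
labels = watershed(gradient, markers)                    # flood from the seeds
print("segment sizes:", np.bincount(labels.ravel())[1:])
```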

  18. Animal Research in the "Journal of Applied Behavior Analysis"

    Science.gov (United States)

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  19. Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.

    Science.gov (United States)

    Iwata, Brian A.

    1987-01-01

    The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement…

  20. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface are of great importance for convenience and safety of driving. Investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are therefore widely carried out for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis can vary greatly, from tenths of a millimetre to hundreds of metres and more. So a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measuring of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysing the characteristics of road texture and monitoring the pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  1. Dimensional Analysis with space discrimination applied to Fickian diffusion phenomena

    International Nuclear Information System (INIS)

    Dimensional Analysis with space discrimination is applied to Fickian diffusion phenomena in order to transform its partial differential equations into ordinary ones, and also to obtain Fick's second law in a dimensionless form. (Author)
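
    For orientation, the standard one-dimensional case shows the kind of reduction the abstract refers to (the paper's discriminated dimensional analysis itself is not reproduced here):

      % Fick's second law and its similarity reduction to an ODE
      \[
      \frac{\partial c}{\partial t} = D\,\frac{\partial^{2} c}{\partial x^{2}},
      \qquad
      \eta = \frac{x}{2\sqrt{Dt}}
      \;\Longrightarrow\;
      \frac{d^{2} c}{d\eta^{2}} + 2\eta\,\frac{dc}{d\eta} = 0,
      \]
      % whose solution for a semi-infinite medium is c/c0 = erfc(eta),
      % a dimensionless form of the law.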

  2. Applying Discourse Analysis in ELT: a Five Cs Model

    Institute of Scientific and Technical Information of China (English)

    肖巧慧

    2009-01-01

    Based on a discussion of definitions of discourse analysis, discourse is regarded as consisting of five layered elements--cohesion, coherence, culture, critique and context. Moreover, we focus on applying DA in ELT.

  3. Towards Applying Text Mining Techniques on Software Quality Standards and Models

    OpenAIRE

    Kelemen, Zádor Dániel; Kusters, Rob; Trienekens, Jos; Balla, Katalin

    2013-01-01

    Many quality approaches are described in hundreds of textual pages, and manual processing of this information consumes plenty of resources. In this report we present a text mining approach applied to CMMI, a well known and widely used quality approach. The text mining analysis can provide a quick overview of the scope of a quality approach. The result of the analysis could accelerate the understanding and the selection of quality approaches.

  4. Wavelet Techniques Applied to Modeling Transitional/Turbulent Flows in Turbomachinery

    Science.gov (United States)

    1996-01-01

    Computer simulation is an essential part of the design and development of jet engines for the aeropropulsion industry. Engineers concerned with calculating the flow in jet engine components, such as compressors and turbines, need simple engineering models that accurately describe the complex flow of air and gases and that allow them to quickly estimate loads, losses, temperatures, and other design parameters. In this ongoing collaborative project, advanced wavelet analysis techniques are being used to gain insight into the complex flow phenomena. These insights, which cannot be achieved by commonly used methods, are being used to develop innovative new flow models and to improve existing ones. Wavelet techniques are very suitable for analyzing the complex turbulent and transitional flows pervasive in jet engines. These flows are characterized by intermittency and a multitude of scales. Wavelet analysis results in information about these scales and their locations. The distribution of scales is equivalent to the frequency spectrum provided by commonly used Fourier analysis techniques; however, no localization information is provided by Fourier analysis. In addition, wavelet techniques allow conditional sampling analyses of the individual scales, which is not possible by Fourier methods.
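
    A minimal sketch of the localisation property described above, assuming the PyWavelets package: a continuous wavelet transform pins an intermittent high-frequency burst to both a scale and a position, which a global Fourier spectrum cannot do. The signal and all parameters are synthetic.

      # CWT of an intermittent signal: scale *and* location information.
      import numpy as np
      import pywt

      fs = 1000.0
      t = np.arange(0, 1, 1 / fs)
      signal = np.sin(2 * np.pi * 25 * t)                       # background oscillation
      signal[400:480] += np.sin(2 * np.pi * 120 * t[400:480])   # intermittent burst

      scales = np.arange(1, 64)
      coef, freqs = pywt.cwt(signal, scales, 'morl', sampling_period=1 / fs)

      # The burst shows up only around samples 400-480 at the high-frequency scales.
      row = np.argmax(np.abs(coef).max(axis=1) * (freqs > 80))
      print("burst localised near sample", np.argmax(np.abs(coef[row])))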

  5. Treatment integrity in applied behavior analysis with children.

    OpenAIRE

    F. M. Gresham; Gansle, K A; Noell, G H

    1993-01-01

    Functional analysis of behavior depends upon accurate measurement of both independent and dependent variables. Quantifiable and controllable operations that demonstrate these functional relationships are necessary for a science of human behavior. Failure to implement independent variables with integrity threatens the internal and external validity of experiments. A review of all applied behavior analysis studies with children as subjects that have been published in the Journal of Applied Beha...

  6. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.
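
    As an illustrative-only sketch of the linear programming idea (the variables and numbers below are hypothetical, not the actual TDRSS model), consider a toy relay-time allocation problem solved with SciPy:

      # Toy relay-satellite scheduling LP, solved with scipy.optimize.linprog.
      from scipy.optimize import linprog

      # x1, x2 = hours of relay time allocated to two user spacecraft.
      # Maximise returned data 3*x1 + 2*x2  ->  minimise -(3*x1 + 2*x2).
      c = [-3.0, -2.0]
      A_ub = [[1.0, 1.0],   # total antenna hours available
              [2.0, 1.0]]   # ground-processing hours consumed
      b_ub = [10.0, 14.0]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
      print("allocation:", res.x, "data returned:", -res.fun)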

  7. Wire-mesh and ultrasound techniques applied for the characterization of gas-liquid slug flow

    Energy Technology Data Exchange (ETDEWEB)

    Ofuchi, Cesar Y.; Sieczkowski, Wytila Chagas; Neves Junior, Flavio; Arruda, Lucia V.R.; Morales, Rigoberto E.M.; Amaral, Carlos E.F.; Silva, Marco J. da [Federal University of Technology of Parana, Curitiba, PR (Brazil)], e-mails: ofuchi@utfpr.edu.br, wytila@utfpr.edu.br, neves@utfpr.edu.br, lvrarruda@utfpr.edu.br, rmorales@utfpr.edu.br, camaral@utfpr.edu.br, mdasilva@utfpr.edu.br

    2010-07-01

    Gas-liquid two-phase flows are found in a broad range of industrial applications, such as the chemical, petrochemical and nuclear industries, and quite often determine the efficiency and safety of processes and plants. Several experimental techniques have been proposed and applied to measure and quantify two-phase flows so far. In this experimental study the wire-mesh sensor and an ultrasound technique are used and comparatively evaluated to study two-phase slug flows in horizontal pipes. The wire-mesh is an imaging technique and thus appropriate for scientific studies, while the ultrasound-based technique is robust and non-intrusive and hence well suited for industrial applications. Based on the measured raw data it is possible to extract specific slug flow parameters of interest, such as mean void fraction and characteristic frequency. The experiments were performed in the Thermal Sciences Laboratory (LACIT) at UTFPR, Brazil, where an experimental two-phase flow loop is available. The experimental flow loop comprises a horizontal acrylic pipe of 26 mm diameter and 9 m length. Water and air were used to produce the two-phase flow under controlled conditions. The results show good agreement between the techniques. (author)
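
    A sketch of how the two named slug-flow parameters can be extracted from a void-fraction time series such as a wire-mesh sensor would produce; the data below are synthetic and the sampling rate is assumed.

      # Mean void fraction and characteristic slug frequency from a time series.
      import numpy as np

      fs = 100.0                            # sampling rate, Hz (assumed)
      t = np.arange(0, 60, 1 / fs)
      # synthetic signal: slugs passing at ~1.5 Hz over a liquid baseline
      alpha = (0.3 + 0.25 * (np.sin(2 * np.pi * 1.5 * t) > 0.6)
               + 0.02 * np.random.randn(t.size))

      mean_void_fraction = alpha.mean()

      # characteristic frequency: dominant peak of the one-sided spectrum
      spectrum = np.abs(np.fft.rfft(alpha - alpha.mean())) ** 2
      freqs = np.fft.rfftfreq(alpha.size, d=1 / fs)
      f_slug = freqs[np.argmax(spectrum)]

      print(f"mean void fraction = {mean_void_fraction:.3f}, "
            f"slug frequency = {f_slug:.2f} Hz")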

  8. Signed directed social network analysis applied to group conflict

    DEFF Research Database (Denmark)

    Zheng, Quan; Skillicorn, David; Walther, Olivier

    2015-01-01

    are both positive and negative), can be combined. This combination is particularly appropriate for intelligence, terrorism, and law enforcement applications. We illustrate by applying the novel embedding technique to datasets describing conflict in North-West Africa, and show how unusual interactions can...

  9. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management that is applicable to the IT universe, starting from the classical theory associated with the techniques of project management. It applies theoretical analysis to the context of information technology in enterprises as well as the classic literature of traditional project management, focusing on its application in business information technology. From the literature reviewed in the first part of the study, four propositions were prepared, which formed the basis for the development of field research with three large companies that develop Information Technology projects. The methodology adopted was the multiple case study. Empirical evidence suggests that the concept of success found in the classical project management literature applies to the management environment of IT projects. The study also showed that it is possible to create a standard IT project model and replicate it in future derivative projects; this depends on the learning acquired at the end of a long and continuous process and on the sponsorship of senior management, which ultimately results in the model's absorption into the company culture.

  10. Innovative vibration technique applied to polyurethane foam as a viable substitute for conventional fatigue testing

    Science.gov (United States)

    Peralta, Alexander; Just-Agosto, Frederick; Shafiq, Basir; Serrano, David

    2012-12-01

    Lifetime prediction using three-point bending (TPB) can at times be prohibitively time consuming and costly, whereas vibration testing at higher frequency may potentially save time and revenue. A vibration technique that obtains lifetimes that reasonably match those determined under flexural TPB fatigue is developed. The technique designs the specimen with a procedure based on shape optimization and finite element analysis. When the specimen is vibrated in resonance, a stress pattern that mimics the stress pattern observed under conventional TPB fatigue testing is obtained. The proposed approach was verified with polyurethane foam specimens, resulting in an average error of 4.5% when compared with TPB.

  11. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  12. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system of our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  13. Applied methods and techniques for mechatronic systems modelling, identification and control

    CERN Document Server

    Zhu, Quanmin; Cheng, Lei; Wang, Yongji; Zhao, Dongya

    2014-01-01

    Applied Methods and Techniques for Mechatronic Systems brings together the relevant studies in mechatronic systems with the latest research from interdisciplinary theoretical studies, computational algorithm development and exemplary applications. Readers can easily tailor the techniques in this book to accommodate their ad hoc applications. The clear structure of each paper, background - motivation - quantitative development (equations) - case studies/illustration/tutorial (curve, table, etc.) is also helpful. It is mainly aimed at graduate students, professors and academic researchers in related fields, but it will also be helpful to engineers and scientists from industry. Lei Liu is a lecturer at Huazhong University of Science and Technology (HUST), China; Quanmin Zhu is a professor at University of the West of England, UK; Lei Cheng is an associate professor at Wuhan University of Science and Technology, China; Yongji Wang is a professor at HUST; Dongya Zhao is an associate professor at China University o...

  14. Software engineering techniques applied to agricultural systems an object-oriented and UML approach

    CERN Document Server

    Papajorgji, Petraq J

    2014-01-01

    Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The focus is on the presentation of rigorous step-by-step approaches for modeling flexible agricultural and environmental systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequence and collaboration diagrams are used to explain the dynamic and static aspects of the software system. This second edition includes: a new chapter on the Object Constraint Language (OCL), a new section dedicated to the Model-View-Controller (MVC) design pattern, new chapters presenting details of two MDA-based tools – the Virtual Enterprise and Olivia Nova, and a new chapter with exercises on conceptual modeling. It may be highly useful to undergraduate and graduate students as t...

  15. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of knowledge available at an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often involved in making informed and correct business decisions. Traditional analysis methods based on statistics fail to help processing unstructured texts and the society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  16. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    Directory of Open Access Journals (Sweden)

    Amany AlShawi

    2016-01-01

    Full Text Available The popularity of cloud computing is increasing steadily. The purpose of this research was to enhance the security of the cloud using techniques such as data mining, with specific reference to the single cache system. From the findings of the research, it was observed that security in the cloud can be enhanced with the single cache system. In future work, an Apriori algorithm can be applied to the single cache system. This can be adopted by all cloud providers, vendors, data distributors, and others. Further, data objects entered into the single cache system can be extended into 12 components. Database and SPSS modelers can be used to implement the approach.

  17. A novel technique of in situ phase-shift interferometry applied for faint dissolution of bulky montmorillonite in alkaline solution

    International Nuclear Information System (INIS)

    The effect of alkaline pH on the dissolution rate of bulky aggregated montmorillonite samples at 23°C was investigated for the first time by using an enhanced phase-shift interferometry technique combined with an internal refraction interferometry method developed for this study. This technique was applied to provide a molecular resolution during the optical observation of the dissolution phenomena in real time and in situ while remaining noninvasive. A theoretical normal resolution limit of this technique was 0.78 nm in water for opaque material, but was limited to 6.6 nm for montmorillonite due to the transparency of the montmorillonite crystal. Normal dissolution velocities as low as 1 × 10-4 to 1 × 10-3 nm/s were obtained directly by using the measured temporal change in height of montmorillonite samples set in a reaction cell. The molar dissolution fluxes of montmorillonite obtained in this study gave considerably faster dissolution rates in comparison to those obtained in previous investigations by solution analysis methods. The pH dependence of montmorillonite dissolution rate determined in this study was qualitatively in good agreement with those reported in the previous investigations. The dissolution rates to be used in safety assessments of geological repositories for radioactive wastes should be obtained for bulky samples. This goal has been difficult to achieve using conventional powder experiment technique and solution analysis method, but has been shown to be feasible using the enhanced phase-shift interferometry. (author)

  18. Energy saving techniques applied over a nation-wide mobile network

    DEFF Research Database (Denmark)

    Perez, Eva; Frank, Philipp; Micallef, Gilbert;

    2014-01-01

    Traffic carried over wireless networks has grown significantly in recent years, and current forecasts show that this trend is expected to continue. However, the rapid mobile data explosion and the need for higher data rates come at a cost of increased complexity and energy consumption of the mobile networks. Although base station equipment is improving its energy efficiency by means of new power amplifiers and increased processing power, additional techniques are required to further reduce the energy consumption. In this paper, we evaluate different energy saving techniques and study their impact on the energy consumption based on a nation-wide network of a leading European operator. By means of an extensive analysis, we show that with the proposed techniques significant energy savings can be realized.

  19. Applying data mining techniques to medical time series: an empirical case study in electroencephalography and stabilometry.

    Science.gov (United States)

    Anguera, A; Barreiro, J M; Lara, J A; Lizcano, D

    2016-01-01

    One of the major challenges in the medical domain today is how to exploit the huge amount of data that this field generates. To do this, approaches are required that are capable of discovering knowledge that is useful for decision making in the medical field. Time series are data types that are common in the medical domain and require specialized analysis techniques and tools, especially if the information of interest to specialists is concentrated within particular time series regions, known as events. This research followed the steps specified by the so-called knowledge discovery in databases (KDD) process to discover knowledge from medical time series derived from stabilometric (396 series) and electroencephalographic (200) patient electronic health records (EHR). The view offered in the paper is based on the experience gathered as part of the VIIP project. Knowledge discovery in medical time series has a number of difficulties and implications that are highlighted by illustrating the application of several techniques that cover the entire KDD process through two case studies. This paper illustrates the application of different knowledge discovery techniques for the purposes of classification within the above domains. The accuracy of this application for the two classes considered in each case is 99.86% and 98.11% for epilepsy diagnosis in the electroencephalography (EEG) domain and 99.4% and 99.1% for early-age sports talent classification in the stabilometry domain. The KDD techniques achieve better results than other traditional neural network-based classification techniques. PMID:27293535

  20. Pulsed remote field eddy current technique applied to non-magnetic flat conductive plates

    Science.gov (United States)

    Yang, Binfeng; Zhang, Hui; Zhang, Chao; Zhang, Zhanbin

    2013-12-01

    Non-magnetic metal plates are widely used in aviation and industrial applications. The detection of cracks in thick plate structures, such as the multilayered structures of aircraft fuselages, has been challenging for the nondestructive evaluation community. The remote field eddy current (RFEC) technique has shown the advantages of deep penetration and high sensitivity to deeply buried anomalies. However, the RFEC technique is mainly used to evaluate ferromagnetic tubes, and several problems must be solved before it can be extended to the inspection of non-magnetic conductive plates. In this article, the pulsed remote field eddy current (PRFEC) technique for the detection of defects in non-magnetic conducting plates was investigated. First, the principle of the PRFEC technique was analysed, followed by an analysis of the differences between detecting defects in ferromagnetic and in non-magnetic planar structures. Three different models of the PRFEC probe were simulated using ANSYS. The location of the transition zone, the defect detection sensitivity and the ability to detect defects in thick plates using the three probes were analysed and compared. The simulation results showed that the probe with a ferrite core had the highest detection capability. The conclusions derived from the simulation study were also validated by experiments.

  1. Applying Data Mining Technique For The Optimal Usage Of Neonatal Incubator

    Directory of Open Access Journals (Sweden)

    Hagar Fady

    2012-07-01

    Full Text Available This research aims to provide an intelligent tool to predict the incubator Length of Stay (LOS) of infants, which shall increase the utilization and management of infant incubators. The data sets of the Egyptian Neonatal Network (EGNN) were employed, and the Oracle Data Miner (ODM) tool was used for the analysis and prediction of the data. The obtained results indicated that data mining is an appropriate and sufficiently sensitive technique to predict the required LOS of premature and ill infants.

  2. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
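
    A one-dimensional sketch of the transfer-function idea on synthetic data: the rear-side temperature is modelled as the convolution of the front-side energy flux with a thermal impulse response h(t), so dividing in the Fourier domain (with regularisation) recovers the flux. All numbers and the form of h(t) are assumptions, not the diagnostic's actual response.

      # FFT-based deconvolution with Tikhonov-style regularisation.
      import numpy as np

      fs = 10.0                                  # samples per second
      t = np.arange(0, 51.2, 1 / fs)
      h = np.exp(-t / 3.0) * (1 - np.exp(-t / 0.5))      # assumed impulse response
      flux = np.zeros_like(t)
      flux[(t > 5) & (t < 15)] = 1.0                     # beam-on window

      # forward model: temperature = flux (*) h
      temp = np.fft.irfft(np.fft.rfft(flux) * np.fft.rfft(h), n=t.size) / fs

      # inverse: regularised division in the frequency domain
      H = np.fft.rfft(h) / fs
      eps = 1e-3 * np.max(np.abs(H))
      flux_rec = np.fft.irfft(np.fft.rfft(temp) * H.conj()
                              / (np.abs(H) ** 2 + eps ** 2), n=t.size)
      print("peak of reconstructed flux:", flux_rec.max())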

  3. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  4. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  5. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
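
    A sketch of the two measurements named above, run on a simulated contrast profile rather than flash IR video: the normalised contrast against a sound-area reference, and the half-max rule for anomaly width. All values are synthetic.

      # Normalised IR contrast and half-max width on a synthetic profile.
      import numpy as np

      x = np.linspace(-20, 20, 401)                      # mm across the anomaly
      I_defect = 1.0 + 0.5 * np.exp(-(x / 5.0) ** 2)     # intensity over anomaly
      I_sound = np.ones_like(x)                          # sound-area reference

      contrast = (I_defect - I_sound) / I_sound          # normalised contrast

      # half-max width: distance between the two crossings of half the peak
      half = 0.5 * contrast.max()
      above = np.where(contrast >= half)[0]
      width = x[above[-1]] - x[above[0]]
      print(f"peak contrast = {contrast.max():.2f}, "
            f"half-max width = {width:.1f} mm")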

  6. ANIMAL RESEARCH IN THE JOURNAL OF APPLIED BEHAVIOR ANALYSIS

    OpenAIRE

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say–do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by...

  7. Negative reinforcement in applied behavior analysis: an emerging technology.

    OpenAIRE

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these area...

  8. Applied Methods for Analysis of Economic Structure and Change

    OpenAIRE

    Anderstig, Christer

    1988-01-01

    The thesis comprises five papers and an introductory overview of applied models and methods. The papers concern interdependences and interrelations in models applied to empirical analyses of various problems related to production, consumption, location and trade. Among different definitions of 'structural analysis', one refers to the study of the properties of economic models on the assumption of invariant structural relations; this definition is close to what is aimed at in the present case....

  9. A comparison of software- and hardware-gating techniques applied to near-field antenna measurements

    Directory of Open Access Journals (Sweden)

    M. M. Leibfritz

    2007-06-01

    Full Text Available It is well known that antenna measurements are error-prone with respect to reflections within an antenna measurement test facility. Their influence on near-field (NF) measurements with subsequent NF to far-field (FF) transformation can be significantly reduced by applying soft- or hard-gating techniques. Hard-gating systems are often used in compact range facilities employing fast PIN-diode switches (Hartmann, 2000), whereas soft-gating systems utilize a network analyzer to gather frequency samples and eliminate objectionable distortions in the time domain by means of Fourier-transformation techniques. Near-field antenna measurements are known to be sensitive to various errors concerning the measurement setup, such as the accuracy of the positioner, the measurement instruments and the quality of the anechoic chamber itself. Two different approaches employing soft- and hard-gating techniques are discussed with respect to practical applications. Signal generation for the antenna under test (AUT) is implemented using a newly developed hard-gating system based on digital signal synthesis allowing gate widths of 250 ps to 10 ns. Measurement results obtained with a Yagi-Uda antenna under test and a dual-polarized open-ended waveguide used as probe antenna are presented for the GSM 1800 frequency range.
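
    A minimal sketch of the soft-gating principle on a synthetic one-port response: frequency samples are transformed to the time domain, a gate keeps only the direct path, and the gated response is transformed back. The delays, gate width and echo level are assumed values for illustration.

      # Software time-gating of frequency-domain samples.
      import numpy as np

      f = np.linspace(1.7e9, 1.9e9, 401)                 # GSM 1800 band sweep
      tau_direct, tau_echo = 10e-9, 35e-9                # direct path and reflection
      S = (np.exp(-2j * np.pi * f * tau_direct)
           + 0.3 * np.exp(-2j * np.pi * f * tau_echo))   # synthetic measurement

      h = np.fft.ifft(S)                                 # band-limited impulse response
      t = np.fft.fftfreq(f.size, d=f[1] - f[0])          # time axis of the IFFT

      gate = np.abs(t - tau_direct) < 8e-9               # rectangular gate, 16 ns wide
      S_gated = np.fft.fft(h * gate)                     # echo suppressed

      i_echo = np.argmin(np.abs(t - tau_echo))
      print("echo level before/after gating:",
            np.abs(h)[i_echo], np.abs(np.fft.ifft(S_gated))[i_echo])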

  10. Photothermal Techniques Applied to the Thermal Characterization of l-Cysteine Nanofluids

    Science.gov (United States)

    Alvarado, E. Maldonado; Ramón-Gallegos, E.; Jiménez Pérez, J. L.; Cruz-Orea, A.; Hernández Rosas, J.

    2013-05-01

    Thermal-diffusivity (D) and thermal-effusivity (e) measurements were carried out in l-cysteine nanofluids (l-cysteine in combination with Au nanoparticles and protoporphyrin IX (PpIX)) by using thermal lens spectrometry (TLS) and photopyroelectric (PPE) techniques. The TLS technique was used in the mode-mismatched experimental configuration to obtain the thermal diffusivity of the samples. The sample thermal effusivity (e) was obtained by using the PPE technique, in which the temperature variation of a sample exposed to modulated radiation is measured with a pyroelectric sensor. From the obtained thermal-diffusivity and thermal-effusivity values, the thermal conductivity and specific heat capacity of the sample were calculated. The obtained thermal parameters were compared with the thermal parameters of water. The results of this study could be applied to the detection of tumors by using the l-cysteine in combination with Au nanoparticles and PpIX nanofluid, called a conjugate in this study.
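
    The step from the measured pair (D, e) to conductivity and heat capacity follows from the standard definitions of effusivity and diffusivity (quoted here for orientation, not taken from the paper):

      \[
      e = \sqrt{k\,\rho c_p}, \qquad D = \frac{k}{\rho c_p}
      \;\;\Longrightarrow\;\;
      k = e\,\sqrt{D}, \qquad \rho c_p = \frac{e}{\sqrt{D}} .
      \]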

  11. Applied Behaviour Analysis. It Works, It's Positive; Now What's the Problem?

    Science.gov (United States)

    Kerr, Ken P.; Mulhern, F.; McDowell, C.

    2000-01-01

    Describes key findings concerning the effectiveness of applied behavior analysis (ABA) for children with autism. Discusses obstacles present in Ireland to treating children with autism using ABA techniques. Describes the work of Parents' Education as Autism Therapists and the Irish Children's Autism Network for Developmental Opportunities to…

  12. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a time-domain operational modal identification global scheme and a frequency-domain scheme that work from output-only data by coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
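
    The property the scheme relies on is the standard natural-excitation (NExT) result: for stationary broadband excitation the cross-correlation of two responses decays as a sum of the system's complex modes, i.e. it has the same mathematical form as an impulse response (textbook form, quoted for orientation):

      \[
      R_{ij}(\tau) \;=\; E\big[x_i(t+\tau)\,x_j(t)\big]
      \;=\; \sum_{r=1}^{2N} A_{ijr}\, e^{\lambda_r \tau}, \qquad \tau \ge 0,
      \]
      % so standard time-domain modal estimators can be fed with R_ij(tau)
      % in place of the (unavailable) impulse responses h_ij(t).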

  13. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

    Full Text Available We describe how to use multivariate analysis of complex TOF-SIMS spectra, introducing the method of random projections. The technique allows us to perform full clustering and classification of the measured mass spectra; in this paper we use the tool for classification purposes. The presentation describes calibration experiments with 19 minerals on Ag and Au substrates using positive-mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen's κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
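
    A sketch of the random-projection-plus-clustering idea on stand-in data (synthetic "mineral" spectra replace real TOF-SIMS measurements; dimensions and counts are assumptions):

      # Random projection of high-dimensional spectra, then clustering.
      import numpy as np
      from sklearn.random_projection import GaussianRandomProjection
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      n_minerals, n_channels = 4, 5000
      centres = rng.random((n_minerals, n_channels))          # template spectra
      spectra = np.vstack([c + 0.05 * rng.standard_normal((30, n_channels))
                           for c in centres])                 # 30 spectra each

      # project the 5000-channel spectra down to 50 random dimensions
      low_dim = GaussianRandomProjection(n_components=50, random_state=0) \
          .fit_transform(spectra)

      labels = KMeans(n_clusters=n_minerals, n_init=10, random_state=0) \
          .fit_predict(low_dim)
      print("cluster sizes:", np.bincount(labels))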

  14. Review of Intelligent Techniques Applied for Classification and Preprocessing of Medical Image Data

    Directory of Open Access Journals (Sweden)

    H S Hota

    2013-01-01

    Full Text Available Medical image data such as ECG, EEG, MRI and CT-scan images are the most important means of diagnosing human disease precisely, and they are widely used by physicians. Problems can be clearly identified with the help of these medical images, and a robust model can classify the medical image data effectively. In this paper, intelligent techniques such as neural networks and fuzzy logic are explored for MRI medical image data to identify tumors in the human brain; the need for preprocessing of medical image data is also examined. Classification techniques have been used extensively in the field of medical imaging. The conventional method in medical science for classifying medical image data is human inspection, which may misclassify data; such inspection is impractical for large amounts of data and for noisy data, where noise may be produced by a technical fault of the machine or by human error and can lead to misclassification of medical image data. We have collected a number of papers based on neural networks and fuzzy logic, along with hybrid techniques, to explore the efficiency and robustness of these models for brain MRI data. The analysis indicates that an intelligent model combined with data preprocessing using principal component analysis (PCA) and segmentation may be the most competitive model in this domain.

  15. Pattern-recognition techniques applied to performance monitoring of the DSS 13 34-meter antenna control assembly

    Science.gov (United States)

    Mellstrom, J. A.; Smyth, P.

    1991-01-01

    The results of applying pattern recognition techniques to diagnose fault conditions in the pointing system of one of the Deep Space network's large antennas, the DSS 13 34-meter structure, are discussed. A previous article described an experiment whereby a neural network technique was used to identify fault classes by using data obtained from a simulation model of the Deep Space Network (DSN) 70-meter antenna system. Described here is the extension of these classification techniques to the analysis of real data from the field. The general architecture and philosophy of an autonomous monitoring paradigm is described and classification results are discussed and analyzed in this context. Key features of this approach include a probabilistic time-varying context model, the effective integration of signal processing and system identification techniques with pattern recognition algorithms, and the ability to calibrate the system given limited amounts of training data. Reported here are recognition accuracies in the 97 to 98 percent range for the particular fault classes included in the experiments.

  16. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
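
    A sketch of the Monte Carlo parameter-variation idea with a toy linear "unfold" (the real Dante unfold algorithm and calibration data are not reproduced; all numbers are assumptions):

      # Monte Carlo propagation of per-channel errors through an unfold.
      import numpy as np

      rng = np.random.default_rng(1)
      n_channels, n_trials = 18, 1000

      v_measured = rng.uniform(0.5, 2.0, n_channels)   # nominal channel voltages
      sigma = 0.05 * v_measured                        # 1-sigma combined error
      response = rng.uniform(0.8, 1.2, n_channels)     # toy channel response

      def unfold(v):
          """Toy stand-in for the unfold: flux ~ response-weighted sum."""
          return np.sum(v / response)

      # one thousand perturbed voltage sets -> distribution of unfolded fluxes
      fluxes = np.array([unfold(v_measured
                                + sigma * rng.standard_normal(n_channels))
                         for _ in range(n_trials)])

      print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std():.3f} (1 sigma)")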

  17. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...

  18. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; optimally weighted cross-correlations are used for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on the cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
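
    For orientation, the optimal (matched) filtering used for inspiral searches takes the textbook form below: the data s(t) are correlated with a template h(t), weighted by the detector noise spectral density S_n(f).

      \[
      \rho(t) \;=\; 4\,\mathrm{Re}\!\int_0^\infty
      \frac{\tilde{s}(f)\,\tilde{h}^{*}(f)}{S_n(f)}\, e^{2\pi i f t}\, df ,
      \qquad
      \langle \rho^2 \rangle_{\mathrm{opt}}
      \;=\; 4\int_0^\infty \frac{|\tilde{h}(f)|^{2}}{S_n(f)}\, df .
      \]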

  19. Geostatistical techniques applied to mapping limnological variables and quantify the uncertainty associated with estimates

    Directory of Open Access Journals (Sweden)

    Cristiano Cigagna

    2015-12-01

    Full Text Available Abstract Aim: This study aimed to map the concentrations of limnological variables in a reservoir employing semivariogram geostatistical techniques and Kriging estimates for unsampled locations, as well as to calculate the uncertainty associated with the estimates. Methods: We established twenty-seven points distributed in a regular mesh for sampling and determined the concentrations of chlorophyll-a, total nitrogen and total phosphorus. Subsequently, a spatial variability analysis was performed, the semivariogram function was modeled for all variables and the variographic mathematical models were established. The main geostatistical estimation technique was ordinary Kriging. A dense grid of points was estimated for each variable, forming the basis of the interpolated maps. Results: Through the semivariogram analysis it was possible to identify the random component as not significant for the estimation process of chlorophyll-a, and as significant for total nitrogen and total phosphorus. Geostatistical maps were produced from the Kriging estimates for each variable, and the respective standard deviations of the estimates were calculated. These measurements allowed us to map the concentrations of limnological variables throughout the reservoir; the calculated standard deviations indicated the quality of the estimates and, consequently, the reliability of the final product. Conclusions: The use of Kriging to estimate a dense mesh of points, together with the error dispersion (standard deviation of the estimate), made it possible to produce quality, reliable maps of the estimated variables. Concentrations of limnological variables were in general higher in the lacustrine zone and decreased towards the riverine zone. Chlorophyll-a and total nitrogen were correlated when comparing the grids generated by Kriging. Although the use of Kriging is more laborious compared to other interpolation methods, this
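
    A minimal ordinary-kriging sketch coded from the textbook equations (not from the paper's software), with an assumed exponential variogram and made-up station values; it returns both the estimate and its standard deviation, the two products discussed above.

      # Ordinary kriging at one unsampled location, with estimation variance.
      import numpy as np

      def variogram(h, sill=1.0, rng_=8.0, nugget=0.1):
          # exponential model; gamma(0) = 0 by definition
          return np.where(h == 0, 0.0,
                          nugget + (sill - nugget) * (1 - np.exp(-h / rng_)))

      # sampled locations and values (e.g. chlorophyll-a at 5 stations, made up)
      pts = np.array([[0, 0], [10, 0], [0, 10], [10, 10], [5, 5]], float)
      z = np.array([3.1, 2.4, 4.0, 2.8, 3.5])
      x0 = np.array([7.0, 3.0])                      # unsampled target location

      d = np.linalg.norm(pts[:, None] - pts[None], axis=2)
      n = len(z)
      A = np.ones((n + 1, n + 1))
      A[:n, :n] = variogram(d)
      A[-1, -1] = 0.0
      b = np.ones(n + 1)
      b[:n] = variogram(np.linalg.norm(pts - x0, axis=1))

      sol = np.linalg.solve(A, b)                    # weights + Lagrange multiplier
      lam, mu = sol[:n], sol[-1]
      z_hat = lam @ z
      kriging_var = lam @ b[:n] + mu                 # sigma^2 of the estimate

      print(f"estimate = {z_hat:.2f}, "
            f"std deviation = {np.sqrt(kriging_var):.2f}")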

  20. A Review on Clustering and Outlier Analysis Techniques in Datamining

    Directory of Open Access Journals (Sweden)

    S. Koteeswaran

    2012-01-01

    Full Text Available Problem statement: The modern world is based on using physical, biological and social systems more effectively through advanced computerized techniques. A great amount of data is generated by such systems, which leads to a paradigm shift from classical modeling and analyses based on first principles to developing models and the corresponding analyses directly from data. The ability to extract useful hidden knowledge from these data and to act on that knowledge is becoming increasingly important in today's competitive world. Approach: The entire process of applying a computer-based methodology, including new techniques, for discovering knowledge from data is called data mining. There are two primary goals in data mining: prediction and classification. The large data sets involved in data mining require clustering and outlier analysis for reduction as well as for collecting only the useful data. Results: This study reviews implementation techniques and recent research on clustering and outlier analysis. Conclusion: The study provides a review of clustering and outlier analysis techniques, and the discussion should guide researchers in improving their research direction.

  1. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  2. New Region Growing based on Thresholding Technique Applied to MRI Data

    Directory of Open Access Journals (Sweden)

    A. Afifi

    2015-06-01

    Full Text Available This paper proposes an optimal region growing threshold for the segmentation of magnetic resonance images (MRIs). The proposed algorithm combines a local search procedure with thresholded region growing to achieve better generic seeds and optimal thresholds for the region growing method. A procedure is used to detect the best possible seeds from a set of data points distributed all over the image, taken as high accumulators of the histogram. The output seeds are fed to the local search algorithm to extract the best seeds around the initial seeds. Optimal thresholds are used to overcome the limitations of the region growing algorithm and to select pixels sequentially in a random walk starting at the seed point. The proposed algorithm works automatically without any predefined parameters. It is applied to the challenging "gray matter/white matter" segmentation datasets. The experimental results, compared with other segmentation techniques, show that the proposed algorithm produces more accurate and stable results.
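
    A sketch of threshold-driven region growing from a seed (breadth-first version); the paper's automatic seed selection and optimal-threshold search are replaced here by fixed assumed values.

      # Region growing: accept neighbours within `thresh` of the region mean.
      import numpy as np
      from collections import deque

      def region_grow(img, seed, thresh):
          """Grow a region from `seed` (row, col), accepting neighbours whose
          intensity stays within `thresh` of the running region mean."""
          grown = np.zeros(img.shape, bool)
          grown[seed] = True
          total, count = float(img[seed]), 1
          q = deque([seed])
          while q:
              r, c = q.popleft()
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  rr, cc = r + dr, c + dc
                  if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                          and not grown[rr, cc]
                          and abs(img[rr, cc] - total / count) <= thresh):
                      grown[rr, cc] = True
                      total += img[rr, cc]
                      count += 1
                      q.append((rr, cc))
          return grown

      img = np.zeros((64, 64))
      img[20:40, 20:40] = 1.0                            # bright square "tissue"
      mask = region_grow(img, seed=(30, 30), thresh=0.5)
      print("region size:", mask.sum())                  # 400 pixels expected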

  3. A systematic review of applying modern software engineering techniques to developing robotic systems

    Directory of Open Access Journals (Sweden)

    Claudia Pons

    2012-04-01

    Full Text Available Robots have become collaborators in our daily life. While robotic systems become more and more complex, the need to engineer their software development grows as well. The traditional approaches used in developing these software systems are reaching their limits; currently used methodologies and tools fall short of addressing the needs of such complex software development. Separating robotics knowledge from short-cycled implementation technologies is essential to foster reuse and maintenance. This paper presents a systematic literature review (SLR) of the current use of modern software engineering techniques for developing robotic software systems and their actual automation level. The survey was aimed at summarizing existing evidence concerning the application of such technologies to the field of robotic systems, identifying any gaps in current research, suggesting areas for further investigation and providing a background for positioning new research activities.

  4. Electron Correlation Microscopy: A New Technique for Studying Local Atom Dynamics Applied to a Supercooled Liquid.

    Science.gov (United States)

    He, Li; Zhang, Pei; Besser, Matthew F; Kramer, Matthew Joseph; Voyles, Paul M

    2015-08-01

    Electron correlation microscopy (ECM) is a new technique that utilizes time-resolved coherent electron nanodiffraction to study dynamic atomic rearrangements in materials. It is the electron scattering equivalent of photon correlation spectroscopy with the added advantage of nanometer-scale spatial resolution. We have applied ECM to a Pd40Ni40P20 metallic glass, heated inside a scanning transmission electron microscope into a supercooled liquid, to measure the structural relaxation time τ between the glass transition temperature Tg and the crystallization temperature Tx. τ determined from the mean diffraction intensity autocorrelation function g2(t) decreases with temperature following an Arrhenius relationship between Tg and Tg + 25 K, and then increases as temperature approaches Tx. The distribution of τ determined from the g2(t) of single speckles is broad and changes significantly with temperature.
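
    The intensity autocorrelation function used here has the standard form below; a stretched-exponential fit is one common choice for extracting τ (quoted for orientation, not as the paper's exact fitting function):

      \[
      g_2(t) \;=\;
      \frac{\langle I(t')\, I(t'+t)\rangle_{t'}}{\langle I(t')\rangle_{t'}^{2}},
      \qquad
      g_2(t) - 1 \;\propto\; \exp\!\big[-2\,(t/\tau)^{\beta}\big].
      \]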

  5. Solar coronal magnetic fields derived using seismology techniques applied to omnipresent sunspot waves

    CERN Document Server

    Jess, D B; Ryans, R S I; Christian, D J; Keys, P H; Mathioudakis, M; Mackay, D H; Prasad, S Krishna; Banerjee, D; Grant, S D T; Yau, S; Diamond, C

    2016-01-01

    Sunspots on the surface of the Sun are the observational signatures of intense manifestations of tightly packed magnetic field lines, with near-vertical field strengths exceeding 6,000 G in extreme cases. It is well accepted that both the plasma density and the magnitude of the magnetic field strength decrease rapidly away from the solar surface, making high-cadence coronal measurements through traditional Zeeman and Hanle effects difficult since the observational signatures are fraught with low-amplitude signals that can become swamped with instrumental noise. Magneto-hydrodynamic (MHD) techniques have previously been applied to coronal structures, with single and spatially isolated magnetic field strengths estimated as 9-55 G. A drawback with previous MHD approaches is that they rely on particular wave modes alongside the detectability of harmonic overtones. Here we show, for the first time, how omnipresent magneto-acoustic waves, originating from within the underlying sunspot and propagating radially outwa...

  6. Applying Multi-Criteria Decision-Making Techniques to Prioritize Agility Drivers

    Directory of Open Access Journals (Sweden)

    Ahmad Jafarnejad

    2013-07-01

    Full Text Available Recognizing and classifying the factors affecting organizational agility, and specifying their relative importance for the organization, is essential to preserving survival and success in today's environment. This paper reviews the concept of agility and its breakdown into indicators comprising the drivers of organizational agility, which have been ranked in terms of their level of importance and influence by MCDM techniques. Internal complexity, suppliers, competition, customer needs, the market, technology and social factors are the most important factors affecting organizational agility; evaluating and applying these indicators, re-engineering processes, reviewing and predicting customer needs, and better understanding the competitive environment and the supply chain ultimately determine organizational agility and success.

  7. Full-field speckle correlation technique as applied to blood flow monitoring

    Science.gov (United States)

    Vilensky, M. A.; Agafonov, D. N.; Timoshina, P. A.; Shipovskaya, O. V.; Zimnyakov, D. A.; Tuchin, V. V.; Novikov, P. A.

    2011-03-01

    The results of an experimental study of monitoring the microcirculation in superficial tissue layers of internal organs during gastro-duodenal hemorrhage with the use of the laser speckle contrast analysis technique are presented. The microcirculation was monitored in real time in the course of laparotomy of the rat abdominal cavity. Microscopic hemodynamics was analyzed for the small intestine and stomach under different conditions (normal state, provoked ischemia, and administration of vasodilative agents such as papaverine and lidocaine). The prospects and problems of internal monitoring of microvascular flow in clinical conditions are discussed.
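
    A sketch of the speckle contrast computation that underlies this technique: the local contrast K = sigma/mean over a sliding window, where flow blurs the speckle pattern and lowers K. The "frame" below is synthetic noise standing in for a raw speckle image; the window size is a typical, assumed choice.

      # Laser speckle contrast (LASCA-style) map from a single frame.
      import numpy as np
      from scipy.ndimage import uniform_filter

      rng = np.random.default_rng(2)
      frame = rng.exponential(scale=100.0, size=(256, 256))  # developed speckle
      frame[100:156, :] = uniform_filter(frame[100:156, :], 5)  # "vessel": blur

      win = 7                                        # sliding-window size
      mean = uniform_filter(frame, win)
      mean_sq = uniform_filter(frame ** 2, win)
      K = np.sqrt(np.maximum(mean_sq - mean ** 2, 0)) / mean

      print("contrast in static area: %.2f, in flow area: %.2f"
            % (K[30:60, 30:60].mean(), K[120:140, 100:150].mean()))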

  8. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jeffs, S.P., E-mail: s.p.jeffs@swansea.ac.uk [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Lancaster, R.J. [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Garcia, T.E. [IUTA (University Institute of Industrial Technology of Asturias), University of Oviedo, Edificio Departamental Oeste 7.1.17, Campus Universitario, 33203 Gijón (Spain)

    2015-06-11

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep dominated applications, such as in the power generation industry and in particular nuclear, where large static loads are applied; equally, a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such, an equation has been developed, known as the k_SP method, which has been proven to be an effective tool across several material systems. The current work now explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results.
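
    For orientation, the two classical relations referred to take the textbook forms below: Monkman–Grant links the minimum creep rate to rupture time, and the Wilshire equation normalises stress by tensile strength for extrapolation (quoted as generic forms, not the paper's fitted constants):

      \[
      \dot{\varepsilon}_{\min}^{\,m}\, t_f = C_{MG},
      \qquad
      \frac{\sigma}{\sigma_{TS}}
      = \exp\!\Big\{-k_1\big[t_f\, e^{-Q_c^{*}/RT}\big]^{u}\Big\}.
      \]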

  9. How Can Synchrotron Radiation Techniques Be Applied for Detecting Microstructures in Amorphous Alloys?

    Directory of Open Access Journals (Sweden)

    Gu-Qing Guo

    2015-11-01

    Full Text Available In this work, how synchrotron radiation techniques can be applied for detecting the microstructure in metallic glass (MG) is studied. Unit cells are the basic structural units in crystals, whereas it has been suggested that the co-existence of various clusters may be the universal structural feature in MGs. It is therefore a challenge to detect the microstructure of an MG even at the short-range scale by directly using synchrotron radiation techniques such as X-ray diffraction and X-ray absorption methods. Here, a feasible scheme is developed in which state-of-the-art synchrotron radiation-based experiments are combined with simulations to investigate the microstructure in MGs. By studying a typical MG composition (Zr70Pd30), it is found that various clusters do co-exist in its microstructure and that icosahedral-like clusters are the dominant structural units. This is the structural origin of the precipitation of an icosahedral quasicrystalline phase prior to the glass-to-crystal transformation when Zr70Pd30 MG is heated.

  10. An acceleration technique for the Gauss-Seidel method applied to symmetric linear systems

    Directory of Open Access Journals (Sweden)

    Jesús Cajigas

    2014-06-01

    Full Text Available A preconditioning technique to improve the convergence of the Gauss-Seidel method applied to symmetric linear systems while preserving symmetry is proposed. The preconditioner is of the form I + K and can be applied an arbitrary number of times. It is shown that under certain conditions the application of the preconditioner a finite number of times reduces the matrix to a diagonal. A series of numerical experiments using matrices from spatial discretizations of partial differential equations demonstrates that both versions of the preconditioner, point and block, exhibit lower iteration counts than the non-symmetric version.
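
    A sketch of Gauss-Seidel with an I + K preconditioner. The paper's own symmetry-preserving K is not reproduced here; K below is the classical superdiagonal (Kohno-type) choice for a unit-diagonal M-matrix, purely to illustrate the mechanics of preconditioning the system before iterating.

      # Gauss-Seidel on a 1-D Poisson system, plain vs. (I + K)-preconditioned.
      import numpy as np

      def gauss_seidel(A, b, iters=500):
          x = np.zeros_like(b)
          L = np.tril(A)                       # lower triangle incl. diagonal
          U = A - L
          for _ in range(iters):
              x = np.linalg.solve(L, b - U @ x)
          return x

      n = 50
      A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)   # 1-D Poisson matrix
      b = np.ones(n)
      A, b = A / 2.0, b / 2.0                  # scale to unit diagonal, as this
                                               # classical choice of K requires
      K = np.zeros_like(A)
      idx = np.arange(n - 1)
      K[idx, idx + 1] = -A[idx, idx + 1]       # K built from the superdiagonal
      P = np.eye(n) + K                        # preconditioner I + K

      x_plain = gauss_seidel(A, b)
      x_prec = gauss_seidel(P @ A, P @ b)      # preconditioned system

      x_true = np.linalg.solve(A, b)
      print("error plain:", np.linalg.norm(x_plain - x_true),
            "error preconditioned:", np.linalg.norm(x_prec - x_true))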

  11. Personnel contamination protection techniques applied during the TMI-2 [Three Mile Island Unit 2] cleanup

    International Nuclear Information System (INIS)

    The severe damage to the Three Mile Island Unit 2 (TMI-2) core and the subsequent discharge of reactor coolant to the reactor and auxiliary buildings resulted in extremely hostile radiological environments in the TMI-2 plant. High fission product surface contamination and radiation levels necessitated the implementation of innovative techniques and methods in performing cleanup operations while assuring effective as low as reasonably achievable (ALARA) practices. The approach utilized by GPU Nuclear throughout the cleanup in applying protective clothing requirements was to consider the overall health risk to the worker including factors such as cardiopulmonary stress, visual and hearing acuity, and heat stress. In applying protective clothing requirements, trade-off considerations had to be made between preventing skin contaminations and possibly overprotecting the worker, thus impacting his ability to perform his intended task at maximum efficiency and in accordance with ALARA principles. The paper discusses the following topics: protective clothing-general use, beta protection, skin contamination, training, personnel access facility, and heat stress

  12. Use of statistical techniques in analysis of biological data

    Directory of Open Access Journals (Sweden)

    Farzana Perveen

    2012-07-01

    Full Text Available From ancient times to the modern day, hardly a single area can be found in which statistics does not play a vital role. Statistics has been recognized and universally accepted as an essential component of research in every branch of science. From agriculture, biology, education, economics, business, management, medicine, engineering and psychology to the environment and space, statistics plays a significant role, and it is used extensively in the biological sciences. Specifically, biostatistics is the branch of applied statistics that concerns the application of statistical methods to medical, genetic and biological problems. One important step in this process is the appropriate and careful analysis of statistical data to obtain precise results. It is pertinent to mention that the majority of statistical tests and techniques are applied under certain mathematical assumptions, so it is necessary to appreciate the importance of the relevant assumptions. Among others, the assumptions of normality (normal distribution of the population(s)) and variance homogeneity are the most important. If these assumptions are not satisfied, the results may be misleading. It is therefore suggested to check the relevant assumption(s) about the data before applying statistical test(s), in order to obtain valid results. In this study, a few techniques/tests are described for checking the normality of a given set of data. Since analysis of variance (ANOVA) models are extensively used in biological research, the assumptions underlying ANOVA are also discussed. Non-parametric statistics is also described to some extent.
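
    As a minimal sketch of the recommended workflow (the three treatment groups below are synthetic stand-ins for real biological data), SciPy can check normality and variance homogeneity before an ANOVA, with a non-parametric fallback:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        # Three hypothetical treatment groups (placeholders for real data).
        groups = [rng.normal(loc=m, scale=1.0, size=20) for m in (5.0, 5.5, 6.0)]

        # Shapiro-Wilk test of normality, per group.
        for i, g in enumerate(groups, 1):
            w, p = stats.shapiro(g)
            print(f"group {i}: Shapiro-Wilk W={w:.3f}, p={p:.3f}")

        # Levene's test for homogeneity of variances across groups.
        print("Levene:", stats.levene(*groups))

        # If both assumptions hold, a one-way ANOVA is justified...
        print("ANOVA:", stats.f_oneway(*groups))
        # ...otherwise a non-parametric alternative such as Kruskal-Wallis.
        print("Kruskal-Wallis:", stats.kruskal(*groups))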

  13. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching.

    Science.gov (United States)

    Joyce, B; Moxley, R A

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis.

  14. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  15. Microarray Analysis Techniques Singular Value Decomposition and Principal Component Analysis

    CERN Document Server

    Wall, Michael E.; Rechtsteiner, Andreas; Rocha, Luis M.

    2002-01-01

    This chapter describes gene expression analysis by Singular Value Decomposition (SVD), emphasizing initial characterization of the data. We describe SVD methods for visualization of gene expression data, representation of the data using a smaller number of variables, and detection of patterns in noisy gene expression data. In addition, we describe the precise relation between SVD analysis and Principal Component Analysis (PCA) when PCA is calculated using the covariance matrix, enabling our descriptions to apply equally well to either method. Our aim is to provide definitions, interpretations, examples, and references that will serve as resources for understanding and extending the application of SVD and PCA to gene expression analysis.
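
    The stated SVD-PCA relation is easy to verify numerically (a random matrix stands in for gene expression data): the eigenvalues of the covariance matrix equal the squared singular values of the centered matrix divided by n - 1, and the eigenvectors match the right singular vectors up to sign.

        import numpy as np

        rng = np.random.default_rng(1)
        # Toy expression matrix: 100 genes x 8 arrays (placeholder data).
        X = rng.normal(size=(100, 8))

        # PCA via the covariance matrix of the centered data...
        Xc = X - X.mean(axis=0)
        C = Xc.T @ Xc / (Xc.shape[0] - 1)
        evals, evecs = np.linalg.eigh(C)
        evals, evecs = evals[::-1], evecs[:, ::-1]    # descending order

        # ...is equivalent to the SVD of the centered matrix:
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        print(np.allclose(evals, s**2 / (Xc.shape[0] - 1)))   # same spectrum
        print(np.allclose(np.abs(evecs.T), np.abs(Vt)))       # same axes, up to sign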

  16. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan

    2016-04-01

    Full Text Available In credit card scoring and loan management, predicting the applicant's future behavior is an important decision support tool and a key factor in reducing the risk of loan default. Many data mining and classification approaches have been developed for credit scoring. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves on existing approaches by employing textual data analysis. The study uses a sample of loan application forms from a financial institution providing loan services in Yemen, representing a real-world credit scoring and loan management situation. The sample contains a set of Arabic textual attributes describing the applicants. A credit scoring model based on text mining pre-processing and logistic regression is proposed and evaluated through comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding the analysis of textual attributes achieves higher classification effectiveness and outperforms the traditional numeric-only techniques.
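
    A minimal sketch of the idea with scikit-learn (the loan fields and records are invented, and TF-IDF plus logistic regression stands in for the paper's Arabic text-mining pre-processing): textual and numeric attributes are combined in one scorecard-style model.

        import pandas as pd
        from sklearn.compose import ColumnTransformer
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import Pipeline

        # Tiny synthetic stand-in for application forms (hypothetical fields).
        df = pd.DataFrame({
            "income": [900, 400, 1500, 300, 1100, 250],
            "purpose_text": ["expand retail shop", "pay urgent debts",
                             "buy delivery truck", "cover living costs",
                             "open pharmacy branch", "repay older loan"],
            "default": [0, 1, 0, 1, 0, 1],
        })

        pre = ColumnTransformer([
            ("text", TfidfVectorizer(), "purpose_text"),   # textual attributes
            ("num", "passthrough", ["income"]),            # numeric attributes
        ])
        model = Pipeline([("features", pre),
                          ("clf", LogisticRegression(max_iter=1000))])
        model.fit(df[["income", "purpose_text"]], df["default"])
        print(model.predict_proba(df[["income", "purpose_text"]])[:, 1])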

  17. Nondestructive analysis of oil shales with PGNAA technique

    International Nuclear Information System (INIS)

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K show generally good linear proportions of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used where the response curve was not linear, but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improving neutron self-moderation in sample boxes of rich oil shales, as compared to poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed gradually with larger and larger amounts of water

  18. Nondestructive analysis of oil shales with PGNAA technique

    Energy Technology Data Exchange (ETDEWEB)

    Maly, J.; Bozorgmanesh, H.

    1984-02-01

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K show generally good linear proportions of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used where the response curve was not linear, but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improving neutron self-moderation in sample boxes of rich oil shales, as compared to poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed gradually with larger and larger amounts of water.
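
    The response curves described here are straight-line fits of corrected peak area against element weight percent; a small sketch with invented calibration points shows the fit and its inversion for an unknown sample:

        import numpy as np

        # Hypothetical calibration points: weight percent of one element
        # (from chemical analysis) versus corrected gamma-line peak area.
        wt_pct = np.array([0.5, 1.1, 2.0, 3.2, 4.5])
        peak_area = np.array([120., 270., 485., 790., 1105.])

        # Least-squares straight line: peak_area = a * wt_pct + b
        a, b = np.polyfit(wt_pct, peak_area, deg=1)
        print(f"slope={a:.1f}, intercept={b:.1f}")

        # Invert the response curve for an unknown sample's concentration.
        unknown_area = 600.0
        print(f"estimated wt%: {(unknown_area - b) / a:.2f}")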

  19. 2D and 3D optical diagnostic techniques applied to Madonna dei Fusi by Leonardo da Vinci

    Science.gov (United States)

    Fontana, R.; Gambino, M. C.; Greco, M.; Marras, L.; Materazzi, M.; Pampaloni, E.; Pelagotti, A.; Pezzati, L.; Poggi, P.; Sanapo, C.

    2005-06-01

    3D measurement and modelling have traditionally been applied to statues, buildings, archaeological sites or similar large structures, but rarely to paintings. Recently, however, 3D measurements have also been performed successfully on easel paintings, making it possible to detect and document the painting's surface. We used 3D models to integrate the results of various 2D imaging techniques on a common reference frame. These applications show how 3D shape information, complemented with 2D colour maps as well as with other types of sensory data, provides the most interesting information. The 3D data acquisition was carried out by means of two devices: a high-resolution laser micro-profilometer, composed of a commercial distance meter mounted on a scanning device, and a laser-line scanner. The 2D data acquisitions were carried out using a scanning device for simultaneous RGB colour imaging and IR reflectography, and a UV fluorescence multispectral image acquisition system. We present here the results of these techniques applied to the analysis of an important painting of the Italian Renaissance: 'Madonna dei Fusi', attributed to Leonardo da Vinci.

  20. Predicting Performance of Schools by Applying Data Mining Techniques on Public Examination Results

    Directory of Open Access Journals (Sweden)

    J. Macklin Abraham Navamani

    2015-02-01

    Full Text Available This study presents a systematic analysis of various features of higher grade school public examination results data in the state of Tamil Nadu, India, using different data mining classification algorithms to predict the performance of schools. Nowadays parents aim to select the right city and school, and to identify the factors that contribute to their children's success. Factors such as ethnic mix, medium of study and geography could make a difference in results. The proposed work has a twofold focus: applying machine learning algorithms to predict school performance with satisfactory accuracy, and evaluating which data mining technique gives the best accuracy among the learning algorithms. It was found that there exist some apparent and some less noticeable attributes that demonstrate a strong correlation with student performance. Data were collected from a credible source, followed by data preparation and correlation analysis. The findings revealed that public examination results data are a very helpful predictor of school performance, and that the overall accuracy improved with the help of the AdaBoost technique.
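
    A minimal sketch of the prediction step (synthetic features stand in for the encoded school records, since the original attributes are not available here), using scikit-learn's AdaBoost as named in the abstract:

        from sklearn.datasets import make_classification
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.metrics import accuracy_score
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for school records (attributes such as ethnic
        # mix, medium of study and geography would be encoded numerically).
        X, y = make_classification(n_samples=500, n_features=8, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
        print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))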

  1. Vibrational techniques applied to photosynthesis: Resonance Raman and fluorescence line-narrowing.

    Science.gov (United States)

    Gall, Andrew; Pascal, Andrew A; Robert, Bruno

    2015-01-01

    Resonance Raman spectroscopy may yield precise information on the conformation of, and the interactions assumed by, the chromophores involved in the first steps of the photosynthetic process. Selectivity is achieved via resonance with the absorption transition of the chromophore of interest. Fluorescence line-narrowing spectroscopy is a complementary technique, in that it provides the same level of information (structure, conformation, interactions), but in this case for the emitting pigment(s) only (whether isolated or in an ensemble of interacting chromophores). The selectivity provided by these vibrational techniques allows for the analysis of pigment molecules not only when they are isolated in solvents, but also when embedded in soluble or membrane proteins and even, as shown recently, in vivo. They can be used, for instance, to relate the electronic properties of these pigment molecules to their structure and/or the physical properties of their environment. These techniques are even able to follow subtle changes in chromophore conformation associated with regulatory processes. After a short introduction to the physical principles that govern resonance Raman and fluorescence line-narrowing spectroscopies, the information content of the vibrational spectra of chlorophyll and carotenoid molecules is described in this article, together with the experiments which helped in determining which structural parameter(s) each vibrational band is sensitive to. A selection of applications is then presented, in order to illustrate how these techniques have been used in the field of photosynthesis, and what type of information has been obtained. This article is part of a Special Issue entitled: Vibrational spectroscopies and bioenergetic systems. PMID:25268562

  2. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at the tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors, ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis.
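
    The corridor estimates amount to buffer-and-sum operations on the route and block centroids; a toy sketch with shapely (route coordinates, block locations and populations all invented, with units treated as miles):

        from shapely.geometry import LineString, Point

        # Hypothetical transport corridor and block-level population points.
        route = LineString([(0, 0), (10, 5), (20, 5)])
        blocks = [(Point(2, 1), 1200), (Point(9, 6), 800),
                  (Point(15, 9), 1500), (Point(19, 4), 600)]

        # Sum population inside buffers of increasing half-width.
        for width in (0.5, 1, 5, 10, 20):
            corridor = route.buffer(width)
            pop = sum(n for pt, n in blocks if corridor.contains(pt))
            print(f"corridor half-width {width:>4}: population {pop}")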

  3. Bone feature analysis using image processing techniques.

    Science.gov (United States)

    Liu, Z Q; Austin, T; Thomas, C D; Clement, J G

    1996-01-01

    In order to establish the correlation between bone structure and age, and to obtain information about age-related bone changes, it is necessary to study the microstructural features of human bone. Traditionally, in bone biology and forensic science, the analysis of bone cross-sections has been carried out manually. Such a process is known to be slow, inefficient and prone to human error, and consequently the results obtained so far have been unreliable. In this paper we present a new approach to quantitative analysis of cross-sections of human bones using digital image processing techniques. We demonstrate that such a system is able to extract various bone features consistently and is capable of providing more reliable data and statistics for bones. Consequently, we will be able to correlate features of bone microstructure with age, and possibly also with age-related bone diseases such as osteoporosis. The development of knowledge-based computer vision systems for automated bone image analysis can now be considered feasible.
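
    One flavor of measurement such a system automates can be sketched with scikit-image (the micrograph below is synthetic, and Otsu thresholding plus labeling is a plausible toolchain rather than the authors' published pipeline):

        import numpy as np
        from skimage import filters, measure

        rng = np.random.default_rng(0)
        # Synthetic stand-in for a bone cross-section: bright matrix with
        # dark circular "pores" (real work would load a calibrated image).
        img = rng.normal(0.7, 0.05, (256, 256))
        yy, xx = np.mgrid[:256, :256]
        for cy, cx in rng.integers(20, 236, size=(30, 2)):
            img[(yy - cy) ** 2 + (xx - cx) ** 2 < 25] = 0.2

        # Otsu threshold separates pores from matrix; label and measure.
        pores = img < filters.threshold_otsu(img)
        labels = measure.label(pores)
        props = measure.regionprops(labels)
        print("pore count:", labels.max())
        print("porosity fraction:", round(pores.mean(), 4))
        print("mean pore area (px):", round(np.mean([p.area for p in props]), 1))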

  4. The Split-Apply-Combine Strategy for Data Analysis

    Directory of Open Access Journals (Sweden)

    Hadley Wickham

    2011-04-01

    Full Text Available Many data analysis problems involve the application of a split-apply-combine strategy, where you break up a big problem into manageable pieces, operate on each piece independently and then put all the pieces back together. This insight gives rise to a new R package that allows you to smoothly apply this strategy, without having to worry about the type of structure in which your data is stored. The paper includes two case studies showing how these insights make it easier to work with batting records for veteran baseball players and a large 3D array of spatio-temporal ozone measurements.
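
    The paper's package targets R, but the strategy itself is language-independent; a pandas sketch in Python (the language used for the other examples in this document), with invented batting records:

        import pandas as pd

        batting = pd.DataFrame({
            "player": ["a", "a", "a", "b", "b", "c"],
            "year":   [1998, 1999, 2000, 1999, 2000, 2000],
            "hits":   [120, 140, 135, 80, 95, 160],
        })
        career = (batting
                  .groupby("player")                 # split by key
                  .agg(seasons=("year", "size"),
                       best=("hits", "max"))         # apply a summary
                  .reset_index())                    # combine the pieces
        print(career)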

  5. A Methods and procedures to apply probabilistic safety Assessment (PSA) techniques to the cobalt-therapy process. Cuban experience

    International Nuclear Information System (INIS)

    This paper presents the results of a Probabilistic Safety Analysis (PSA) of the cobalt therapy process, performed as part of the International Atomic Energy Agency's Coordinated Research Project (CRP) to Investigate Appropriate Methods and Procedures to Apply Probabilistic Safety Assessment (PSA) Techniques to Large Radiation Sources. The primary methodological tools used in the analysis were Failure Modes and Effects Analysis (FMEA), event trees and fault trees. These tools were used to evaluate occupational, public and medical exposures during cobalt therapy treatment, with the emphasis on the radiological protection of patients. During the course of the PSA, several findings concerning the cobalt treatment process were analysed. Regarding the undesired event probabilities, the lowest exposure probabilities correspond to public exposures during the treatment process (Z21), around 10^-10 per year, while worker exposures (Z11) are around 10^-4 per year. Regarding the patient, the Z33 (undesired dose to normal tissue) and Z34 (portion of the target volume not irradiated) probabilities prevail. Patient accidental exposures are also classified in terms of the extent to which the error is likely to affect individual treatments, individual patients, or all the patients treated on a specific unit. Sensitivity analyses were performed to determine the influence of certain tasks or critical stages on the results. As a conclusion, the study establishes that PSA techniques may effectively and reasonably determine the risk associated with the cobalt therapy treatment process, though there are some weaknesses in their methodological application for this kind of study requiring further research. These weaknesses are due to the fact that traditional PSA has mainly been applied to complex hardware systems designed to operate with a high level of automation, whilst cobalt therapy treatment is a relatively simple hardware system with a
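
    For independent basic events, fault-tree quantification reduces to AND/OR probability algebra; a toy sketch (the gate structure, event names and numbers are invented, not taken from the study):

        # Minimal fault-tree sketch: top-event probability from independent
        # basic events through AND/OR gates.
        def p_and(*ps):            # all inputs must fail
            out = 1.0
            for p in ps:
                out *= p
            return out

        def p_or(*ps):             # at least one input fails
            out = 1.0
            for p in ps:
                out *= (1.0 - p)
            return 1.0 - out

        p_interlock = 1e-3         # hypothetical basic-event probabilities
        p_timer = 5e-4
        p_operator = 1e-2

        # Undesired exposure: (interlock AND timer fail) OR operator error.
        p_top = p_or(p_and(p_interlock, p_timer), p_operator)
        print(f"top event probability per demand: {p_top:.3e}")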

  6. The x-rays fluorescence applied to the analysis of alloys

    International Nuclear Information System (INIS)

    This work is based on the use of X-ray fluorescence. The purpose of this non-destructive testing technique is to establish a routine method for checking the conformity of the industrial samples used. The analysis combines the Rasberry-Heinrich and Claisse-Thinh algorithms, together with the numerical implementation of techniques unusual in this type of analysis, such as linear programming applied to the solution of overdetermined systems of equations, and relaxation methods to facilitate convergence to the solutions. (author)
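
    An overdetermined system of calibration equations can be illustrated with a least-squares solve (coefficients invented); the paper itself uses linear programming, which minimises a different norm but addresses the same problem:

        import numpy as np

        # Overdetermined linear system (more equations than unknowns), as
        # when several fluorescence lines constrain a few concentrations.
        A = np.array([[1.0, 0.2],
                      [0.1, 1.0],
                      [0.9, 0.3],
                      [0.2, 1.1]])
        b = np.array([0.62, 0.41, 0.60, 0.46])

        x, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
        print("estimated concentrations:", x)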

  7. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed, stimulated by the power of computers and microprocessors which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied to trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples, provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. Indeed, in order to increase the responses and improve the selectivity, solid electrodes are the subject of rapidly growing research dedicated to surface modifications. Permselectivity, chelation, catalysis, etc. may be considered appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single-cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the

  8. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    This paper presents the different steps of the radio-labelling of a molecule for two purposes, radio-immuno-analysis and auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha, beta and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and preparation of labelled molecules: gamma emitters (125I, 57Co); beta emitters; preparation of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritium); purification of the labelled compound (dialysis, gel filtration or molecular exclusion chromatography, high performance liquid chromatography); quality assessment of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)
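
    The quantities under "definitions" follow from the exponential decay law A = A0 * exp(-ln 2 * t / T1/2); a small sketch (the initial activity is invented; 59.4 days is the standard half-life of 125I):

        import math

        def activity(a0_bq, t_days, half_life_days):
            # Remaining activity after time t, from the decay law.
            return a0_bq * math.exp(-math.log(2) * t_days / half_life_days)

        # 125I tracer, hypothetical initial activity of 3.7 MBq.
        print(f"{activity(3.7e6, 30, 59.4):.3e} Bq left after 30 days")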

  9. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  10. Applied research on air pollution using nuclear-related analytical techniques. Report on the second research co-ordination meeting

    International Nuclear Information System (INIS)

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which started in 1992, and is scheduled to run until early 1997. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in air particulate matter. The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural area). This document reports the discussions held during the second Research Co-ordination Meeting (RCM) for the CRP which took place at ANSTO in Menai, Australia. (author)

  11. Modern structure of methods and techniques of marketing research, applied by the world and Ukrainian research companies

    Directory of Open Access Journals (Sweden)

    Bezkrovnaya Yulia

    2015-08-01

    Full Text Available The article presents the results of an empirical study of the structure of marketing research methods and techniques for consumer decisions, as applied by global and Ukrainian research companies.

  12. Applying ABC analysis to the Navy's inventory management system

    OpenAIRE

    May, Benjamin

    2014-01-01

    Approved for public release; distribution is unlimited. ABC analysis is an inventory categorization technique used to classify and prioritize inventory items in an effort to better allocate business resources. A items are defined as the inventory items considered extremely important to the business, requiring strict oversight and control. B items are important to the business, but don't require the tight controls and oversight required of the A items. C items are marginally important to the...
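
    ABC classification is commonly operationalized by cumulative share of annual usage value; a short sketch (item values invented, and the 80%/95% cutoffs are the conventional Pareto choices, not necessarily the thesis's):

        import pandas as pd

        items = pd.DataFrame({
            "item": list("abcdefghij"),
            "annual_value": [9500, 4200, 300, 8100, 150, 90, 2600, 75, 60, 1200],
        })
        items = items.sort_values("annual_value", ascending=False)
        cum_share = items["annual_value"].cumsum() / items["annual_value"].sum()
        # A: top ~80% of value, B: next ~15%, C: the remainder.
        items["class"] = pd.cut(cum_share, bins=[0, 0.8, 0.95, 1.0],
                                labels=["A", "B", "C"])
        print(items)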

  13. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    Science.gov (United States)

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015 The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists. PMID:26853375

  14. Applying data fusion techniques for benthic habitat mapping and monitoring in a coral reef ecosystem

    Science.gov (United States)

    Zhang, Caiyun

    2015-06-01

    Accurate mapping and effective monitoring of benthic habitat in the Florida Keys are critical in developing management strategies for this valuable coral reef ecosystem. For this study, a framework was designed for automated benthic habitat mapping by combining multiple data sources (hyperspectral, aerial photography, and bathymetry data) and four contemporary imagery processing techniques (data fusion, Object-based Image Analysis (OBIA), machine learning, and ensemble analysis). In the framework, the 1-m digital aerial photograph was first merged with 17-m hyperspectral imagery and 10-m bathymetry data using a pixel/feature-level fusion strategy. The fused dataset was then preclassified by three machine learning algorithms (Random Forest, Support Vector Machines, and k-Nearest Neighbor). Final object-based habitat maps were produced through ensemble analysis of the outcomes from the three classifiers. The framework was tested for classifying group-level (3-class) and code-level (9-class) habitats in a portion of the Florida Keys. Informative and accurate habitat maps were achieved, with an overall accuracy of 88.5% and 83.5% for the group-level and code-level classifications, respectively.
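
    The preclassify-then-ensemble step can be approximated with scikit-learn's soft-voting combination of the same three classifier families (synthetic features stand in for the fused imagery, and soft voting is only one possible ensemble rule):

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier, VotingClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.neighbors import KNeighborsClassifier
        from sklearn.svm import SVC

        # Synthetic stand-in for fused pixel/object features, 3 habitat classes.
        X, y = make_classification(n_samples=600, n_features=10, n_classes=3,
                                   n_informative=6, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        ensemble = VotingClassifier([
            ("rf", RandomForestClassifier(random_state=0)),
            ("svm", SVC(probability=True, random_state=0)),
            ("knn", KNeighborsClassifier()),
        ], voting="soft").fit(X_tr, y_tr)
        print("overall accuracy:", ensemble.score(X_te, y_te))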

  15. Applying value stream mapping techniques to eliminate non-value-added waste for the procurement of endovascular stents

    International Nuclear Information System (INIS)

    Objectives: To eliminate non-value-adding (NVA) waste for the procurement of endovascular stents in interventional radiology services by applying value stream mapping (VSM). Materials and methods: The Lean manufacturing technique was used to analyze the process of material and information flow currently required to direct endovascular stents from external suppliers to patients. Based on a decision point analysis for the procurement of stents in the hospital, a present state VSM was drawn. After assessment of the current status VSM and progressive elimination of unnecessary NVA waste, a future state VSM was drawn. Results: The current state VSM demonstrated that out of 13 processes for the procurement of stents only 2 processes were value-adding. Out of the NVA processes 5 processes were unnecessary NVA activities, which could be eliminated. The decision point analysis demonstrated that the procurement of stents was mainly a forecast driven push system. The future state VSM applies a pull inventory control system to trigger the movement of a unit after withdrawal by using a consignment stock. Conclusion: VSM is a visualization tool for the supply chain and value stream, based on the Toyota Production System and greatly assists in successfully implementing a Lean system.

  16. Numerical continuation applied to landing gear mechanism analysis

    OpenAIRE

    Knowles, J.; Krauskopf, B; Lowenberg, MH

    2010-01-01

    A method of investigating quasi-static mechanisms is presented and applied to an overcentre mechanism and to a nose landing gear mechanism. The method uses static equilibrium equations along with equations describing the geometric constraints in the mechanism. In the spirit of bifurcation analysis, solutions to these steady-state equations are then continued numerically in parameters of interest. Results obtained from the bifurcation method agree with the equivalent results obtained from two ...

  17. Nonstandard Analysis Applied to Advanced Undergraduate Mathematics - Infinitesimal Modeling

    OpenAIRE

    Herrmann, Robert A.

    2003-01-01

    This is a Research and Instructional Development Project from the U.S. Naval Academy. In this monograph, the basic methods of nonstandard analysis for n-dimensional Euclidean spaces are presented. Specific rules are developed, and these methods and rules are applied to rigorous integral and differential modeling. The topics include Robinson infinitesimals, limited and infinite numbers; convergence theory, continuity, *-transfer, internal definition, hyperfinite summation, Riemann-Stieltjes int...

  18. An applied ethics analysis of best practice tourism entrepreneurs

    OpenAIRE

    Power, Susann

    2015-01-01

    Ethical entrepreneurship, and by extension wider best practice, are noble goals for the future of tourism. However, questions arise as to which concepts, such as values, motivations, actions and challenges, underpin these goals. This thesis seeks to answer these questions and, in so doing, to develop an applied ethics analysis for best practice entrepreneurs in tourism. The research is situated in sustainable tourism, which is ethically very complex and has thus far been dominated by the economic, social a...

  19. Recent reinforcement-schedule research and applied behavior analysis

    OpenAIRE

    Lattal, Kennon A.; Neef, Nancy A.

    1996-01-01

    Reinforcement schedules are considered in relation to applied behavior analysis by examining several recent laboratory experiments with humans and other animals. The experiments are drawn from three areas of contemporary schedule research: behavioral history effects on schedule performance, the role of instructions in schedule performance of humans, and dynamic schedules of reinforcement. All of the experiments are discussed in relation to the role of behavioral history in current schedule pe...

  20. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and the concentrations of most of the important elements were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the laboratories of the ININ Nuclear Center facilities. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were obtained for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  1. Applying Squeezing Technique to Clayrocks: Lessons Learned from Experiments at Mont Terri Rock Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, A. M.; Sanchez-Ledesma, D. M.; Tournassat, C.; Melon, A.; Gaucher, E.; Astudillo, E.; Vinsot, A.

    2013-07-01

    Knowledge of the pore water chemistry in clay rock formations plays an important role in determining radionuclide migration in the context of nuclear waste disposal. Among the different in situ and ex situ techniques for pore water sampling in clay sediments and soils, the squeezing technique dates back 115 years. Although different studies have examined the reliability and representativeness of squeezed pore waters, most of them were performed on high-porosity, high-water-content and unconsolidated clay sediments. Very few tackled the analysis of squeezed pore water from low-porosity, low-water-content and highly consolidated clay rocks. In this work, a specially designed and fabricated one-dimensional compression cell with two-directional fluid flow was used to extract and analyse the pore water composition of Opalinus Clay core samples from Mont Terri (Switzerland). The reproducibility of the technique is good, and no ionic ultrafiltration, chemical fractionation or anion exclusion was found in the range of pressures analysed: 70-200 MPa. Pore waters extracted in this range of pressures do not decrease in concentration; a decrease would indicate dilution by mixing of the free pore water with the outer layers of double-layer water (Donnan water). A threshold (safety) squeezing pressure of 175 MPa was established for avoiding membrane effects (ion filtering, anion exclusion, etc.) from clay particles induced by increasing pressures. Moreover, the pore waters extracted at these pressures are representative of the Opalinus Clay formation, based on a direct comparison against in situ collected borehole waters. (Author)

  2. Applying Squeezing Technique to Clayrocks: Lessons Learned from Experiments at Mont Terri Rock Laboratory

    International Nuclear Information System (INIS)

    Knowledge of the pore water chemistry in clay rock formations plays an important role in determining radionuclide migration in the context of nuclear waste disposal. Among the different in situ and ex situ techniques for pore water sampling in clay sediments and soils, the squeezing technique dates back 115 years. Although different studies have examined the reliability and representativeness of squeezed pore waters, most of them were performed on high-porosity, high-water-content and unconsolidated clay sediments. Very few tackled the analysis of squeezed pore water from low-porosity, low-water-content and highly consolidated clay rocks. In this work, a specially designed and fabricated one-dimensional compression cell with two-directional fluid flow was used to extract and analyse the pore water composition of Opalinus Clay core samples from Mont Terri (Switzerland). The reproducibility of the technique is good, and no ionic ultrafiltration, chemical fractionation or anion exclusion was found in the range of pressures analysed: 70-200 MPa. Pore waters extracted in this range of pressures do not decrease in concentration; a decrease would indicate dilution by mixing of the free pore water with the outer layers of double-layer water (Donnan water). A threshold (safety) squeezing pressure of 175 MPa was established for avoiding membrane effects (ion filtering, anion exclusion, etc.) from clay particles induced by increasing pressures. Moreover, the pore waters extracted at these pressures are representative of the Opalinus Clay formation, based on a direct comparison against in situ collected borehole waters. (Author)

  3. Harmonic and applied analysis from groups to signals

    CERN Document Server

    Mari, Filippo; Grohs, Philipp; Labate, Demetrio

    2015-01-01

    This contributed volume explores the connection between the theoretical aspects of harmonic analysis and the construction of advanced multiscale representations that have emerged in signal and image processing. It highlights some of the most promising mathematical developments in harmonic analysis in the last decade brought about by the interplay among different areas of abstract and applied mathematics. This intertwining of ideas is considered starting from the theory of unitary group representations and leading to the construction of very efficient schemes for the analysis of multidimensional data. After an introductory chapter surveying the scientific significance of classical and more advanced multiscale methods, chapters cover such topics as An overview of Lie theory focused on common applications in signal analysis, including the wavelet representation of the affine group, the Schrödinger representation of the Heisenberg group, and the metaplectic representation of the symplectic group An introduction ...

  4. Gamma absorption technique in elemental analysis of composite materials

    International Nuclear Information System (INIS)

    Highlights: ► Application of gamma-ray absorption technique in elemental analysis. ► Determination of elemental composition of some bronze and gold alloys. ► Determination of some heavy elements in water. - Abstract: Expressions for calculating the elemental concentrations of composite materials based on a gamma absorption technique are derived. These expressions provide quantitative information about the elemental concentrations of materials. Calculations are carried out to estimate the concentrations of copper and gold in some bronze and gold alloys. The method was also applied to estimate the concentrations of some heavy elements in a water matrix, highlighting the differences with photon attenuation measurements. Theoretical mass attenuation coefficient values were obtained using the WinXCom program. A high-resolution gamma-ray spectrometer based on a high-purity germanium (HPGe) detector was employed to measure the attenuation of a strongly collimated monoenergetic gamma beam through the samples.
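
    The underlying physics is the Beer-Lambert attenuation law I = I0 * exp(-mu_m * rho * x); a small sketch (intensities and the mass attenuation coefficient are illustrative numbers; real coefficients would come from WinXCom):

        import math

        def transmitted(i0, mu_m, rho, x_cm):
            # mu_m: mass attenuation coefficient (cm^2/g), rho: density (g/cm^3).
            return i0 * math.exp(-mu_m * rho * x_cm)

        def density_from_transmission(i, i0, mu_m, x_cm):
            # Invert a measured transmission for a known mu_m and thickness.
            return -math.log(i / i0) / (mu_m * x_cm)

        i0, mu_m, x = 10_000.0, 0.072, 2.0   # illustrative beam and coefficient
        i = transmitted(i0, mu_m, 8.96, x)   # sample with density 8.96 g/cm^3
        print(round(density_from_transmission(i, i0, mu_m, x), 2))  # recovers 8.96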

  5. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on the web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and making an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper presents experimental work in which web log data are used. We have taken the web log data from the "NASA" web server, which is analyzed with "Web Log Explorer". Web Log Explorer is a web usage mining tool which plays a vital role in carrying out this work.

  6. Conference report: summary of the 2010 Applied Pharmaceutical Analysis Conference.

    Science.gov (United States)

    Unger, Steve E

    2011-01-01

    This year, the Applied Pharmaceutical Analysis meeting changed its venue to the Grand Tremont Hotel in Baltimore, MD, USA. Proximity to Washington presented the opportunity to have four speakers from the US FDA. The purpose of the 4-day conference is to provide a forum in which pharmaceutical and CRO scientists can discuss and develop best practices for scientific challenges in bioanalysis and drug metabolism. This year's theme was 'Bioanalytical and Biotransformation Challenges in Meeting Global Regulatory Expectations & New Technologies for Drug Discovery Challenges'. Applied Pharmaceutical Analysis continued its tradition of highlighting new technologies and its impact on drug discovery, drug metabolism and small molecule-regulated bioanalysis. This year, the meeting included an integrated focus on metabolism in drug discovery and development. Middle and large molecule (biotherapeutics) drug development, immunoassay, immunogenicity and biomarkers were also integrated into the forum. Applied Pharmaceutical Analysis offered an enhanced diversity of topics this year while continuing to share experiences of discovering and developing new medicines. PMID:21175361

  7. Nuclear analytical techniques applied to forensic chemistry; Aplicacion de tecnicas analiticas nucleares en quimica forense

    Energy Technology Data Exchange (ETDEWEB)

    Nicolau, Veronica; Montoro, Silvia [Universidad Nacional del Litoral, Santa Fe (Argentina). Facultad de Ingenieria Quimica. Dept. de Quimica Analitica; Pratta, Nora; Giandomenico, Angel Di [Consejo Nacional de Investigaciones Cientificas y Tecnicas, Santa Fe (Argentina). Centro Regional de Investigaciones y Desarrollo de Santa Fe

    1999-11-01

    Gunshot residues produced by firing guns are mainly composed of visible particles. The individual characterization of these particles allows distinguishing those containing heavy metals, originating from gunshot residues, from those having a different origin or history. In this work, the results obtained from the study of gunshot residue particles collected from hands are presented. The aim of the analysis is to establish whether a person has fired a gun or has been in contact with one after the shot was produced. As reference samples, particles collected from the hands of persons engaged in different activities were studied for comparison. The complete study was based on the application of nuclear analytical techniques such as Scanning Electron Microscopy, Energy Dispersive X-Ray Electron Probe Microanalysis and Graphite Furnace Atomic Absorption Spectrometry. The assays can be completed within a time compatible with forensic requirements. (author) 5 refs., 3 figs., 1 tab.; e-mail: csedax e adigian at arcride.edu.ar

  8. Vibroacoustic Modeling of Mechanically Coupled Structures: Artificial Spring Technique Applied to Light and Heavy Mediums

    Directory of Open Access Journals (Sweden)

    L. Cheng

    1996-01-01

    Full Text Available This article deals with the modeling of vibrating structures immersed in both light and heavy fluids, and possible applications to noise control problems and industrial vessels containing fluids. A theoretical approach, using artificial spring systems to characterize the mechanical coupling between substructures, is extended to include fluid loading. A structure consisting of a plate-ended cylindrical shell and its enclosed acoustic cavity is analyzed. After a brief description of the proposed technique, a number of numerical results are presented. The analysis addresses the following specific issues: the coupling between the plate and the shell; the coupling between the structure and the enclosure; the possibilities and difficulties regarding internal soundproofing through modifications of the joint connections; and the effects of fluid loading on the vibration of the structure.

  9. Inverting travel times with a triplication. [spline fitting technique applied to lunar seismic data reduction

    Science.gov (United States)

    Jarosch, H. S.

    1982-01-01

    A method based on the use of constrained spline fits is used to overcome the difficulties arising when body-wave data in the form of T-delta are reduced to the tau-p form in the presence of cusps. In comparison with unconstrained spline fits, the method proposed here tends to produce much smoother models which lie approximately in the middle of the bounds produced by the extremal method. The method is noniterative and, therefore, computationally efficient. The method is applied to the lunar seismic data, where at least one triplication is presumed to occur in the P-wave travel-time curve. It is shown, however, that because of an insufficient number of data points for events close to the antipode of the center of the lunar network, the present analysis is not accurate enough to resolve the problem of a possible lunar core.

  10. The Study of Mining Activities and their Influences in the Almaden Region Applying Remote Sensing Techniques

    International Nuclear Information System (INIS)

    This scientific-technical report is part of an ongoing research work carried out by Celia Rico Fraile in order to obtain the Diploma of Advanced Studies as part of her PhD studies. This work has been developed in collaboration with the Faculty of Science at the Universidad Autonoma de Madrid and the Department of Environment at CIEMAT. The main objective of this work was the characterization and classification of land use in Almaden (Ciudad Real) during cinnabar mineral exploitation and after mining activities ceased in 2002, developing a methodology focused on the integration of remote sensing techniques applying multispectral and hyperspectral satellite data. By means of preprocessing and processing of data from the satellite images, as well as data obtained from field campaigns, a spectral library was compiled in order to characterize representative land surfaces within the study area. Monitoring results show that the extent of the areas affected by mining activities has diminished rapidly in recent years. (Author) 130 refs.

  11. Advanced examination techniques applied to the qualification of critical welds for the ITER correction coils

    CERN Document Server

    Sgobba, Stefano; Libeyre, Paul; Marcinek, Dawid Jaroslaw; Piguiet, Aline; Cécillon, Alexandre

    2015-01-01

    The ITER correction coils (CCs) consist of three sets of six coils located in between the toroidal (TF) and poloidal field (PF) magnets. The CCs rely on a Cable-in-Conduit Conductor (CICC), whose supercritical cooling at 4.5 K is provided by helium inlets and outlets. The assembly of the nozzles to the stainless steel conductor conduit includes fillet welds requiring full penetration through the thickness of the nozzle. Static and cyclic stresses have to be sustained by the inlet welds during operation. The entire volume of the helium inlet and outlet welds, which are subject to the most stringent quality levels for imperfections according to the standards in force, is virtually uninspectable with sufficient resolution by conventional or computed radiography or by ultrasonic testing. On the other hand, X-ray computed tomography (CT) was successfully applied to inspect the full weld volume of several dozen helium inlet qualification samples. The extensive use of CT techniques allowed significant progress in the ...

  12. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FI) is briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS, in combination with conventional analytical data in an integrated chemometric approach to investigate the dynamics of plant lipids, are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damage in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as of ivy cuticles treated in vivo with air pollutants, surfactants and pesticides, are given. (orig.)

  13. Comparison of motion correction techniques applied to functional near-infrared spectroscopy data from children

    Science.gov (United States)

    Hu, Xiao-Su; Arredondo, Maria M.; Gomba, Megan; Confer, Nicole; DaSilva, Alexandre F.; Johnson, Timothy D.; Shalinsky, Mark; Kovelman, Ioulia

    2015-12-01

    Motion artifacts are the most significant sources of noise in pediatric brain imaging designs and data analyses, especially in applications of functional near-infrared spectroscopy (fNIRS), where they can completely determine the quality of the acquired data. Different methods have been developed to correct motion artifacts in fNIRS data, but the relative effectiveness of these methods for data from child and infant subjects (which is often found to be significantly noisier than adult data) remains largely unexplored. The issue is further complicated by the heterogeneity of fNIRS data artifacts. We compared the efficacy of the six most prevalent motion artifact correction techniques on fNIRS data acquired from children participating in a language acquisition task: wavelet filtering, spline interpolation, principal component analysis, moving average (MA), correlation-based signal improvement, and a combination of wavelet filtering and MA. The evaluation of five predefined metrics suggests that the MA and wavelet methods yield the best outcomes. These findings elucidate the varied nature of fNIRS data artifacts and the efficacy of artifact correction methods with pediatric populations, as well as help inform both the theory and practice of optical brain imaging analysis.
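
    Of the six techniques, the moving average is the simplest to sketch; below, a synthetic signal with spikes standing in for motion artifacts is smoothed by a centered window:

        import numpy as np

        def moving_average(signal, window=11):
            # Centered moving average; a simple motion-artifact smoother.
            kernel = np.ones(window) / window
            return np.convolve(signal, kernel, mode="same")

        rng = np.random.default_rng(0)
        t = np.linspace(0, 60, 600)                  # 10 Hz, 60 s recording
        hemo = 0.5 * np.sin(2 * np.pi * t / 20)      # slow hemodynamic signal
        spikes = np.zeros_like(t)
        spikes[rng.choice(t.size, 5)] = 3.0          # abrupt motion spikes
        noisy = hemo + spikes + rng.normal(0, 0.05, t.size)

        cleaned = moving_average(noisy)
        print(f"peak before/after: {noisy.max():.2f} / {cleaned.max():.2f}")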

  14. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics, and computer vision as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the importance of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  15. Joint regression analysis and AMMI model applied to oat improvement

    Science.gov (United States)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, Joint Regression Analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board for a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in the R software for the AMMI model analysis.
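
    The AMMI decomposition itself is compact enough to sketch: fit the additive main effects, then apply a singular value decomposition to the residual genotype-by-environment interaction matrix. The sketch below uses NumPy rather than the R agricolae package cited in the abstract, and the yield matrix is a random placeholder.

    ```python
    import numpy as np

    def ammi_decompose(Y: np.ndarray, n_terms: int = 2):
        """AMMI on a genotype x environment mean-yield matrix Y.

        Additive part: grand mean plus genotype and environment main effects.
        Multiplicative part: SVD of the residual interaction matrix.
        """
        grand = Y.mean()
        g_eff = Y.mean(axis=1) - grand          # genotype main effects
        e_eff = Y.mean(axis=0) - grand          # environment main effects
        residual = Y - grand - g_eff[:, None] - e_eff[None, :]
        U, s, Vt = np.linalg.svd(residual, full_matrices=False)
        # IPCA scores of the kind plotted in AMMI biplots
        g_scores = U[:, :n_terms] * np.sqrt(s[:n_terms])
        e_scores = Vt[:n_terms, :].T * np.sqrt(s[:n_terms])
        return grand, g_eff, e_eff, s, g_scores, e_scores

    # 22 genotypes x 6 locations, as in the oat trial (values random here)
    Y = np.random.default_rng(0).normal(50, 5, size=(22, 6))
    grand, g_eff, e_eff, s, g_scores, e_scores = ammi_decompose(Y)
    ```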

  16. Investigation about the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils

    Directory of Open Access Journals (Sweden)

    Adriano Pinto Mariano

    2009-10-01

    This work investigated the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils collected at three service stations. Batch biodegradation experiments were carried out in Bartha biometer flasks (250 mL) used to measure the microbial CO2 production. Biodegradation efficiency was also measured by quantifying the concentration of hydrocarbons. In addition to the biodegradation experiments, the capability of the studied cultures and of the native microorganisms to biodegrade diesel oil purchased from a local service station was verified using a technique based on the redox indicator 2,6-dichlorophenol indophenol (DCPIP). Results obtained with this test showed that the inocula used in the biodegradation experiments were able to degrade the diesel oil, and the tests carried out with the native microorganisms indicated that these soils had a microbiota adapted to degrade the hydrocarbons. In general, no gain was obtained with the addition of microorganisms, and in some cases negative effects were observed in the biodegradation experiments.

  17. A comparison of new, old and future densitometric techniques as applied to volcanologic study.

    Science.gov (United States)

    Pankhurst, Matthew; Moreland, William; Dobson, Kate; Þórðarson, Þorvaldur; Fitton, Godfrey; Lee, Peter

    2015-04-01

    The density of any material imposes a primary control upon its potential or actual physical behaviour in relation to its surrounds. It follows that a thorough understanding of the physical behaviour of dynamic, multi-component systems, such as active volcanoes, requires knowledge of the density of each component. If we are to accurately predict the physical behaviour of synthesized or natural volcanic systems, quantitative densitometric measurements are vital. The theoretical density of melt, crystal and bubble phases may be calculated using composition, structure, temperature and pressure inputs. However, measuring the density of natural, non-ideal, poly-phase materials remains problematic, especially if phase-specific measurement is important. Here we compare three methods: the Archimedes principle, He-displacement pycnometry and X-ray micro computed tomography (XMT), and discuss the utility and drawbacks of each in the context of modern volcanologic study. We have measured tephra, ash and lava from the 934 AD Eldgjá eruption (Iceland) and the 2010 AD Eyjafjallajökull eruption (Iceland) using each technique. These samples exhibit a range of particle sizes, phases and textures. We find that while the Archimedes method remains a useful, low-cost technique to generate whole-rock density data, relative precision is problematic at small particle sizes. Pycnometry offers a more precise whole-rock density value, at a comparable cost per sample. However, this technique is based upon the assumption that pore spaces within the sample are equally available for gas exchange, which may or may not be the case. XMT produces 3D images, at resolutions from nm to tens of µm per voxel, where X-ray attenuation is a qualitative measure of relative electron density, expressed as greyscale number/brightness (usually 16-bit). Phases and individual particles can be digitally segmented according to their greyscale and other characteristics. This represents a distinct advantage over both
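
    The Archimedes method reduces to a two-weighing computation: the buoyancy force equals the weight of displaced fluid, so the volume, and hence the density, follows from the difference between dry and submerged masses. A minimal sketch, with illustrative masses and water as the assumed immersion fluid:

    ```python
    def archimedes_density(m_dry_g: float, m_submerged_g: float,
                           rho_fluid: float = 0.9982) -> float:
        """Whole-rock density (g/cm^3) from dry and submerged weighings.

        rho_fluid defaults to water at ~20 C. The apparent mass loss when
        submerged equals the mass of displaced fluid, giving the volume.
        """
        volume_cm3 = (m_dry_g - m_submerged_g) / rho_fluid
        return m_dry_g / volume_cm3

    # An illustrative 12.8 g clast that weighs 7.1 g when submerged in water:
    print(round(archimedes_density(12.8, 7.1), 2))  # ~2.24 g/cm^3
    ```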

  18. Thermal Analysis Applied to Verapamil Hydrochloride Characterization in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    Maria Irene Yoshida

    2010-04-01

    Thermogravimetry (TG) and differential scanning calorimetry (DSC) are useful techniques that have been successfully applied in the pharmaceutical industry to reveal important information regarding the physicochemical properties of drug and excipient molecules, such as polymorphism, stability, purity and formulation compatibility, among others. Verapamil hydrochloride shows thermal stability up to 180 °C and melts at 146 °C, followed by total degradation. The drug is compatible with all the excipients evaluated. The drug showed degradation when subjected to oxidizing conditions, suggesting that the degradation product is 3,4-dimethoxybenzoic acid derived from alkyl side chain oxidation. Verapamil hydrochloride does not present the phenomenon of polymorphism under the conditions evaluated. Assessment of the drug degradation kinetics gave a shelf life (t90) of 56.7 years for the drug, while a pharmaceutical formulation showed a t90 of 6.8 years, demonstrating their high stability.
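
    Shelf-life figures such as the t90 values above follow from degradation kinetics. For the common first-order case, t90 = ln(10/9)/k, with the rate constant extrapolated to storage temperature via the Arrhenius equation; the kinetic order and parameters used in the paper are not stated here, so the numbers below are purely illustrative.

    ```python
    import math

    def t90_first_order(k_per_year: float) -> float:
        """Time to 90% potency for first-order degradation.

        C(t) = C0 * exp(-k t)  =>  t90 = ln(10/9) / k.
        """
        return math.log(10 / 9) / k_per_year

    def arrhenius_k(A: float, Ea_J_mol: float, T_K: float) -> float:
        """Rate constant from the Arrhenius equation k = A exp(-Ea / (R T))."""
        R = 8.314  # J mol^-1 K^-1
        return A * math.exp(-Ea_J_mol / (R * T_K))

    # Illustrative numbers only: extrapolate an assumed accelerated-study
    # prefactor and activation energy to 25 C storage.
    k_25C = arrhenius_k(A=2.0e8, Ea_J_mol=1.0e5, T_K=298.15)
    print(t90_first_order(k_25C))  # shelf life in years, for these inputs
    ```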

  19. Imaging techniques applied to quality control of civil manufactured goods obtained starting from ready-to-use mixtures

    Science.gov (United States)

    Bonifazi, Giuseppe; Castaldi, Federica

    2003-05-01

    Concrete materials obtained from the utilization of pre-mixed, ready-to-use products (central mix concrete) are more and more widely used, and they represent a large portion of the civil construction market. Such products are used at different scales, ranging from small-scale works, such as those commonly realized inside a house or an apartment, to large civil or industrial works. In both cases, control of the mixtures and of the final work is usually realized through the analysis of properly collected samples. Appropriate sampling yields objective parameters, such as the size class distribution and composition of the constituting particulate matter, or the mechanical characteristics of the sample itself. An important parameter not considered by the previously mentioned approach is "segregation", that is, the possibility that some particulate materials migrate preferentially into some zones of the mixture and/or of the final product. Such behaviour dramatically influences the quality of the product and of the final manufactured good. At present this behaviour is studied only by a human-based visual approach; no repeatable analytical procedures or quantitative data processing exist. In this paper a procedure fully based on image processing techniques is described and applied. Results are presented and analyzed with reference to industrial products. A comparison is also made between the newly proposed digital imaging based techniques and the analyses usually carried out at industrial laboratory scale for standard quality control.
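
    One way to make such a segregation assessment repeatable is to quantify how mean brightness (a proxy for particle type) varies across horizontal bands of a cross-section image. The index below is a hypothetical illustration of this idea, not the procedure proposed in the paper:

    ```python
    import numpy as np

    def vertical_segregation_index(gray: np.ndarray, n_bands: int = 10) -> float:
        """Score preferential migration of bright (e.g. coarse) particles.

        Splits a grayscale cross-section into horizontal bands and returns
        the coefficient of variation of the band means: ~0 for a homogeneous
        mixture, larger when particles concentrate in some zones.
        """
        bands = np.array_split(gray.astype(float), n_bands, axis=0)
        means = np.array([band.mean() for band in bands])
        return float(means.std() / means.mean())

    # Synthetic example: brighter material settled toward the bottom half
    img = np.vstack([np.full((50, 100), 80.0), np.full((50, 100), 160.0)])
    print(vertical_segregation_index(img))   # clearly greater than zero
    ```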

  20. Cladistic analysis applied to the classification of volcanoes

    Science.gov (United States)

    Hone, D. W. E.; Mahony, S. H.; Sparks, R. S. J.; Martin, K. T.

    2007-11-01

    Cladistics is a systematic method of classification that groups entities on the basis of sharing similar characteristics in the most parsimonious manner. Here cladistics is applied to the classification of volcanoes using a dataset of 59 Quaternary volcanoes and 129 volcanic edifices of the Tohoku region, Northeast Japan. Volcano and edifice characteristics recorded in the database include attributes of volcano size, chemical composition, dominant eruptive products, volcano morphology, dominant landforms, volcano age and eruptive history. Without characteristics related to time the volcanic edifices divide into two groups, with characters related to volcano size, dominant composition and edifice morphology being the most diagnostic. Analysis including time based characteristics yields four groups with a good correlation between these groups and the two groups from the analysis without time for 108 out of 129 volcanic edifices. Thus when characters are slightly changed the volcanoes still form similar groupings. Analysis of the volcanoes both with and without time yields three groups based on compositional, eruptive products and morphological characters. Spatial clusters of volcanic centres have been recognised in the Tohoku region by Tamura et al. (Earth Planet Sci Lett 197:105-106, 2002). The groups identified by cladistic analysis are distributed unevenly between the clusters, indicating a tendency for individual clusters to form similar kinds of volcanoes with distinctive but coherent styles of volcanism. Uneven distribution of volcano types between clusters can be explained by variations in dominant magma compositions through time, which are reflected in eruption products and volcanic landforms. Cladistic analysis can be a useful tool for elucidating dynamic igneous processes that could be applied to other regions and globally. Our exploratory study indicates that cladistics has promise as a method for classifying volcanoes and potentially elucidating dynamic

  1. Cross Validation and Discriminative Analysis Techniques in a College Student Attrition Application.

    Science.gov (United States)

    Smith, Alan D.

    1982-01-01

    Used a current attrition study to show the usefulness of discriminative analysis and a cross validation technique applied to student nonpersister questionnaire respondents and nonrespondents. Results of the techniques allowed delineation of several areas of sample under-representation and established the instability of the regression weights…
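
    A sketch of the general approach: fit a linear discriminant model to hypothetical questionnaire features and check it with k-fold cross-validation, the standard guard against the kind of unstable, sample-specific regression weights the study reports. The data below are synthetic placeholders.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    # Hypothetical questionnaire features: persisters (0) vs nonpersisters (1)
    rng = np.random.default_rng(42)
    X = np.vstack([rng.normal(0.0, 1.0, (100, 5)),
                   rng.normal(0.7, 1.0, (100, 5))])
    y = np.repeat([0, 1], 100)

    lda = LinearDiscriminantAnalysis()
    # Cross-validation exposes weights that do not generalize beyond the sample
    scores = cross_val_score(lda, X, y, cv=5)
    print(scores.mean(), scores.std())
    ```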

  2. A dynamic mechanical analysis technique for porous media

    Science.gov (United States)

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate it by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 to 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency, with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
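
    For context, the traditional viscoelastic DMA reduction estimates a complex modulus from the amplitude ratio and phase lag between stress and strain at the drive frequency. A minimal sketch of that classical model, on synthetic 5 Hz records rather than the tofu measurements:

    ```python
    import numpy as np

    def complex_modulus(stress, strain, t, freq_hz):
        """Estimate |E*|, storage E' and loss E'' from steady-state records.

        Each record is projected onto sin/cos at the drive frequency to get
        amplitude and phase; the phase lag delta between stress and strain
        gives E' = |E*| cos(delta) and E'' = |E*| sin(delta).
        """
        w = 2 * np.pi * freq_hz
        basis = np.vstack([np.sin(w * t), np.cos(w * t)]).T
        (a_s, b_s), *_ = np.linalg.lstsq(basis, stress, rcond=None)
        (a_e, b_e), *_ = np.linalg.lstsq(basis, strain, rcond=None)
        amp_s, ph_s = np.hypot(a_s, b_s), np.arctan2(b_s, a_s)
        amp_e, ph_e = np.hypot(a_e, b_e), np.arctan2(b_e, a_e)
        E_star = amp_s / amp_e
        delta = ph_s - ph_e
        return E_star, E_star * np.cos(delta), E_star * np.sin(delta)

    t = np.linspace(0, 2, 4000)
    strain = 1e-3 * np.sin(2 * np.pi * 5 * t)           # 5 Hz test
    stress = 12e3 * np.sin(2 * np.pi * 5 * t + 0.1)     # small phase lag
    print(complex_modulus(stress, strain, t, 5.0))
    ```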

  3. Erasing the Milky Way: new cleaning technique applied to GBT intensity mapping data

    CERN Document Server

    Wolz, L; Abdalla, F B; Anderson, C M; Chang, T -C; Li, Y -C; Masui, K W; Switzer, E; Pen, U -L; Voytek, T C; Yadav, J

    2015-01-01

    We present the first application of a new foreground removal pipeline to the current leading HI intensity mapping dataset, obtained by the Green Bank Telescope (GBT). We study the 15 hr and 1 hr field data of the GBT observations previously presented in Masui et al. (2013) and Switzer et al. (2013), covering about 41 square degrees at 0.6 < z < 1.0, which overlap with the WiggleZ galaxy survey employed for the cross-correlation with the maps. In the presented pipeline, we subtract the Galactic foreground continuum and the point-source contamination using an independent component analysis technique (fastica) and develop a description for a Fourier-based optimal weighting estimator to compute the temperature power spectrum of the intensity maps and the cross-correlation with the galaxy survey data. We show that fastica is a reliable tool to subtract diffuse and point-source emission by using the non-Gaussian nature of their probability functions. The power spectra of the intensity maps and the cross-correlation...
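
    Conceptually, the fastica cleaning step treats each line of sight as a mixture of a few spectrally smooth, strongly non-Gaussian foreground components: estimate the components, reconstruct the foreground, and subtract it. A toy sketch with scikit-learn's FastICA on synthetic maps; the number of components is an assumption:

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    # maps: (n_freq, n_pix) brightness-temperature cube, frequency by pixel.
    rng = np.random.default_rng(1)
    n_freq, n_pix = 64, 2048
    foreground = np.outer(np.linspace(10, 8, n_freq),   # smooth in frequency
                          rng.lognormal(size=n_pix))    # non-Gaussian on sky
    signal = 1e-3 * rng.standard_normal((n_freq, n_pix))
    maps = foreground + signal

    ica = FastICA(n_components=4, random_state=0)
    sources = ica.fit_transform(maps.T)             # (n_pix, n_components)
    reconstruction = ica.inverse_transform(sources).T
    # Subtracting the low-rank ICA reconstruction leaves the (nearly
    # Gaussian) cosmological signal plus noise.
    residual = maps - reconstruction
    ```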

  4. Hyperspectral imaging techniques applied to the monitoring of wine waste anaerobic digestion process

    Science.gov (United States)

    Serranti, Silvia; Fabbri, Andrea; Bonifazi, Giuseppe

    2012-11-01

    An anaerobic digestion process, aimed at biogas production, is characterized by different steps involving the variation of some chemical and physical parameters related to the presence of specific biomasses, such as pH, chemical oxygen demand (COD), volatile solids, nitrate (NO3-) and phosphate (PO43-). A correct process characterization requires periodic sampling of the organic mixture in the reactor and further analysis of the samples by traditional chemical-physical methods. Such an approach is discontinuous, time-consuming and expensive. A new analytical approach based on hyperspectral imaging in the NIR field (1000 to 1700 nm) is investigated and critically evaluated, with reference to the monitoring of the wine waste anaerobic digestion process. The application of the proposed technique was addressed to identify and demonstrate the correlation existing, in terms of quality and reliability of the results, between "classical" chemical-physical parameters and spectral features of the digestate samples. Good results were obtained, ranging from R2=0.68 and RMSECV=12.83 mg/l for nitrate to R2=0.90 and RMSECV=5495.16 mg O2/l for COD. The proposed approach seems very useful in setting up innovative control strategies allowing for full, continuous control of the anaerobic digestion process.
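
    Figures of merit such as R2 and RMSECV typically come from regressing the reference chemistry onto the spectra with a cross-validated multivariate model. The sketch below uses partial least squares as a representative choice (the paper's chemometric model is not specified here), on synthetic stand-ins for the spectra and COD values:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # X: NIR spectra of digestate samples (n_samples x n_wavelengths)
    # y: reference chemical values, e.g. COD in mg O2/l
    rng = np.random.default_rng(0)
    X = rng.normal(size=(60, 200))
    y = X[:, :5].sum(axis=1) * 1000 + rng.normal(scale=500, size=60)

    pls = PLSRegression(n_components=5)
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    r2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"R2={r2:.2f}, RMSECV={rmsecv:.1f}")
    ```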

  5. Digital Image Correlation Techniques Applied to Large Scale Rocket Engine Testing

    Science.gov (United States)

    Gradl, Paul R.

    2016-01-01

    Rocket engine hot-fire ground testing is necessary to understand component performance, reliability and engine system interactions during development. The J-2X upper stage engine completed a series of developmental hot-fire tests that derived performance of the engine and components, validated analytical models and provided the necessary data to identify where design changes, process improvements and technology development were needed. The J-2X development engines were heavily instrumented to provide the data necessary to support these activities, which enabled the team to investigate any anomalies experienced during the test program. This paper describes the development of an optical digital image correlation technique to augment the data provided by traditional strain gauges, which are prone to debonding at elevated temperatures and limited to localized measurements. The feasibility of this optical measurement system was demonstrated during full scale hot-fire testing of J-2X, during which a digital image correlation system, incorporating a pair of high speed cameras to measure three-dimensional, real-time displacements and strains, was installed and operated under the extreme environments present on the test stand. The camera and facility setup, pre-test calibrations, hot-fire test data collection, and post-test analysis and results are presented in this paper.

  6. Laser granulometry: A comparative study of the techniques of sieving and elutriation applied to pozzolanic materials

    Directory of Open Access Journals (Sweden)

    Frías, M.

    1990-03-01

    Laser granulometry is a rapid method for the determination of particle size distributions in both dry and wet phases. In the present paper, the laser-beam diffraction technique is applied to the granulometric study of pozzolanic materials in suspension. These granulometric analyses are compared to those obtained with the Alpine pneumatic siever and the Bahco elutriator-centrifuge.


  7. LAMQS analysis applied to ancient Egyptian bronze coins

    International Nuclear Information System (INIS)

    Some Egyptian bronze coins, dated to the VI-VII century A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and the type of manufacture. The investigations have been performed by using micro-invasive analysis, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences have been evidenced due to different constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins have been produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  8. LAMQS analysis applied to ancient Egyptian bronze coins

    Energy Technology Data Exchange (ETDEWEB)

    Torrisi, L., E-mail: lorenzo.torrisi@unime.i [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caridi, F.; Giuffrida, L.; Torrisi, A. [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Mondio, G.; Serafino, T. [Dipartimento di Fisica della Materia ed Ingegneria Elettronica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caltabiano, M.; Castrizio, E.D. [Dipartimento di Lettere e Filosofia dell' Universita di Messina, Polo Universitario dell' Annunziata, 98168 Messina (Italy); Paniz, E.; Salici, A. [Carabinieri, Reparto Investigazioni Scientifiche, S.S. 114, Km. 6, 400 Tremestieri, Messina (Italy)

    2010-05-15

    Some Egyptian bronze coins, dated to the VI-VII century A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and the type of manufacture. The investigations have been performed by using micro-invasive analysis, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences have been evidenced due to different constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins have been produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  9. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Baek, Sung Ryel; Kim, Young Gi; Jung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun; Lim, Jong Myoung

    2003-05-01

    The aims of this project are to establish a quality control system for Neutron Activation Analysis (NAA), in response to increasing industrial demand for a standard analytical method, and to prepare and validate standard operating procedures for NAA through practical testing of different analytical items. The R and D implementations of the analytical quality system, using the neutron irradiation facility and gamma-ray measurement system, and the automation of the NAA facility in the HANARO research reactor, are as follows: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrially applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology.

  10. Applied research and development of neutron activation analysis

    International Nuclear Information System (INIS)

    The aims of this project are to establish a quality control system for Neutron Activation Analysis (NAA), in response to increasing industrial demand for a standard analytical method, and to prepare and validate standard operating procedures for NAA through practical testing of different analytical items. The R and D implementations of the analytical quality system, using the neutron irradiation facility and gamma-ray measurement system, and the automation of the NAA facility in the HANARO research reactor, are as follows: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrially applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology.

  11. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.;

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation.
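
    A sketch of how such an evaluation matrix can be mined: standardize the criteria, project with PCA to expose hidden structure, and cluster the strategies. The matrix below is a random placeholder, and KMeans stands in for whichever clustering method the authors used; a discriminant analysis on the resulting labels would complete the CA/PCA-FA/DA sequence.

    ```python
    import numpy as np
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    # Evaluation matrix: rows = simulated control strategies,
    # columns = BSM2 performance criteria (effluent quality, cost, ...)
    rng = np.random.default_rng(3)
    evaluation = rng.normal(size=(30, 8))

    Z = StandardScaler().fit_transform(evaluation)   # criteria on equal footing
    scores = PCA(n_components=2).fit_transform(Z)    # PCA/FA step
    labels = KMeans(n_clusters=3, n_init=10,
                    random_state=0).fit_predict(Z)   # CA step
    print(labels)
    ```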

  12. Empirical modal decomposition applied to cardiac signals analysis

    Science.gov (United States)

    Beya, O.; Jalil, B.; Fauvet, E.; Laligant, O.

    2010-01-01

    In this article, we present the method of empirical modal decomposition (EMD) applied to the analysis and denoising of electrocardiogram (ECG) and phonocardiogram (PCG) signals. The objective of this work is to automatically detect cardiac anomalies of a patient. As these anomalies are localized in time, the localization of all events should be preserved precisely. Methods based on the Fourier transform lose the localization property [13]; the wavelet transform (WT) makes it possible to overcome the problem of localization, but the interpretation remains difficult when characterizing the signal precisely. In this work we propose to apply EMD, which has very significant properties for pseudo-periodic signals. The second section describes the EMD algorithm. In the third part we present the results obtained on PCG and ECG test signals; the analysis and interpretation of these signals are given in the same section. Finally, we introduce an adaptation of the EMD algorithm which seems to be very efficient for denoising.
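
    The core of EMD is the sifting loop: interpolate envelopes through the local maxima and minima, subtract the envelope mean, and repeat until an intrinsic mode function (IMF) emerges. A didactic sketch, omitting the boundary handling and formal stopping criteria that production implementations need:

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.signal import argrelextrema

    def extract_imf(x: np.ndarray, t: np.ndarray, n_sifts: int = 10) -> np.ndarray:
        """One IMF by repeated sifting: subtract the mean of the cubic-spline
        envelopes through the local maxima and minima."""
        h = x.copy()
        for _ in range(n_sifts):
            mx = argrelextrema(h, np.greater)[0]
            mn = argrelextrema(h, np.less)[0]
            if len(mx) < 4 or len(mn) < 4:      # too few extrema to envelope
                break
            upper = CubicSpline(t[mx], h[mx])(t)
            lower = CubicSpline(t[mn], h[mn])(t)
            h = h - (upper + lower) / 2
        return h

    t = np.linspace(0, 1, 2000)
    sig = np.sin(2 * np.pi * 50 * t) + np.sin(2 * np.pi * 5 * t)
    imf1 = extract_imf(sig, t)                  # roughly the fast component
    residue = sig - imf1                        # input to the next round
    ```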

  13. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined provide additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  14. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take several requirements into account. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, due to the precious nature of artworks, it is also necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composites or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of both phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author)

  15. Time-reversal imaging techniques applied to tremor waveforms near Cholame, California to locate tectonic tremor

    Science.gov (United States)

    Horstmann, T.; Harrington, R. M.; Cochran, E. S.

    2012-12-01

    Frequently, the lack of distinctive phase arrivals makes locating tectonic tremor more challenging than locating earthquakes. Classic location algorithms based on travel times cannot be directly applied because impulsive phase arrivals are often difficult to recognize. Traditional location algorithms are often modified to use phase arrivals identified from stacks of recurring low-frequency events (LFEs) observed within tremor episodes, rather than single events. Stacking the LFE waveforms improves the signal-to-noise ratio for the otherwise non-distinct phase arrivals. In this study, we apply a different method to locate tectonic tremor: a modified time-reversal imaging approach that potentially exploits the information from the entire tremor waveform instead of phase arrivals from individual LFEs. Time reversal imaging uses the waveforms of a given seismic source recorded by multiple seismometers at discrete points on the surface and a 3D velocity model to rebroadcast the waveforms back into the medium to identify the seismic source location. In practice, the method works by reversing the seismograms recorded at each of the stations in time, and back-propagating them from the receiver location individually into the sub-surface as a new source time function. We use a staggered-grid, finite-difference code with 2.5 ms time steps and a grid node spacing of 50 m to compute the rebroadcast wavefield. We calculate the time-dependent curl field at each grid point of the model volume for each back-propagated seismogram. To locate the tremor, we assume that the source time function back-propagated from each individual station produces a similar curl field at the source position. We then cross-correlate the time dependent curl field functions and calculate a median cross-correlation coefficient at each grid point. The highest median cross-correlation coefficient in the model volume is expected to represent the source location. For our analysis, we use the velocity model of
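
    The scoring step at the heart of this scheme, cross-correlating the back-propagated curl time series and taking the median coefficient at each grid node, is easy to sketch in isolation; synthetic traces stand in below for the finite-difference wavefields:

    ```python
    import numpy as np
    from itertools import combinations

    def median_curl_correlation(curl_traces: np.ndarray) -> float:
        """Score one grid node: median pairwise correlation of the curl
        time series back-propagated from each station. The true source
        location should maximize this score over the model volume."""
        n_stations = curl_traces.shape[0]
        cc = [np.corrcoef(curl_traces[i], curl_traces[j])[0, 1]
              for i, j in combinations(range(n_stations), 2)]
        return float(np.median(cc))

    # 8 stations, 500 time samples of the curl amplitude at a candidate node
    rng = np.random.default_rng(7)
    common = rng.standard_normal(500)                       # coherent part
    traces = common + 0.3 * rng.standard_normal((8, 500))   # station noise
    print(median_curl_correlation(traces))                  # close to 1
    ```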

  16. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  17. Applying morphologic techniques to evaluate hotdogs: what is in the hotdogs we eat?

    Science.gov (United States)

    Prayson, Brigid E; McMahon, James T; Prayson, Richard A

    2008-04-01

    Americans consume billions of hotdogs per year resulting in more than a billion dollars in retail sales. Package labels typically list some type of meat as the primary ingredient. The purpose of this study is to assess the meat and water content of several hotdog brands to determine if the package labels are accurate. Eight brands of hotdogs were evaluated for water content by weight. A variety of routine techniques in surgical pathology including routine light microscopy with hematoxylin-eosin-stained sections, special staining, immunohistochemistry, and electron microscopy were used to assess for meat content and for other recognizable components. Package labels indicated that the top-listed ingredient in all 8 brands was meat; the second listed ingredient was water (n = 6) and another type of meat (n = 2). Water comprised 44% to 69% (median, 57%) of the total weight. Meat content determined by microscopic cross-section analysis ranged from 2.9% to 21.2% (median, 5.7%). The cost per hotdog ($0.12-$0.42) roughly correlated with meat content. A variety of tissues were observed besides skeletal muscle including bone (n = 8), collagen (n = 8), blood vessels (n = 8), plant material (n = 8), peripheral nerve (n = 7), adipose (n = 5), cartilage (n = 4), and skin (n = 1). Glial fibrillary acidic protein immunostaining was not observed in any of the hotdogs. Lipid content on oil red O staining was graded as moderate in 3 hotdogs and marked in 5 hotdogs. Electron microscopy showed recognizable skeletal muscle with evidence of degenerative changes. In conclusion, hotdog ingredient labels are misleading; most brands are more than 50% water by weight. The amount of meat (skeletal muscle) in most brands comprised less than 10% of the cross-sectional surface area. More expensive brands generally had more meat. All hotdogs contained other tissue types (bone and cartilage) not related to skeletal muscle; brain tissue was not present. PMID:18325469

  18. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Information provided by any application system in an organization is vital for obtaining a decision. Because of this, the quality of the data provided by the Data Warehouse (DW) is very important if an organization is to produce the best solutions to move forward. DWs are complex systems that have to deliver highly aggregated, high quality data from heterogeneous sources to decision makers, and they involve a great deal of integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques perform comparisons between target values and current values obtained from the systems. A prototype using PHP was developed to support the Base Analysis Technique. A sample schema from an Oracle database was then used to study the differences between applying the framework and not applying it. The prototype was demonstrated to selected organizations to identify whether it helps to reduce DQ problems, and questionnaires were given to respondents. Results: The results show users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: Implementation of the suggested framework in a real situation needs to be conducted to obtain more accurate results.

  19. A multiblock grid generation technique applied to a jet engine configuration

    Science.gov (United States)

    Stewart, Mark E. M.

    1992-01-01

    Techniques are presented for quickly finding a multiblock grid for a 2D geometrically complex domain from geometrical boundary data. An automated technique for determining a block decomposition of the domain is explained. Techniques for representing this domain decomposition and transforming it are also presented. Further, a linear optimization method may be used to solve the equations which determine grid dimensions within the block decomposition. These algorithms automate many stages in the domain decomposition and grid formation process and limit the need for human intervention and inputs. They are demonstrated for the meridional or throughflow geometry of a bladed jet engine configuration.

  20. Calorimetric techniques applied to the thermodynamic study of interactions between proteins and polysaccharides

    Directory of Open Access Journals (Sweden)

    Monique Barreto Santos

    2016-08-01

    The interactions between biological macromolecules are important for biotechnology, but further understanding is needed to maximize the utility of these interactions. Calorimetric techniques provide information regarding these interactions through the thermal energy that is produced or consumed during the interaction. Notable techniques include differential scanning calorimetry, which generates a thermodynamic profile from temperature scanning, and isothermal titration calorimetry, which provides the thermodynamic parameters directly related to the interaction. This review describes how calorimetric techniques can be used to study interactions between proteins and polysaccharides, and provides valuable insight into the thermodynamics of their interaction.

  1. Quantitative thoracic CT techniques in adults: can they be applied in the pediatric population?

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Soon Ho [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Goo, Jin Mo [Seoul National University Medical Research Center, Department of Radiology, Seoul National University College of Medicine, and Institute of Radiation Medicine, Seoul (Korea, Republic of); Seoul National University College of Medicine, Cancer Research Institute, Jongno-gu, Seoul (Korea, Republic of); Goo, Hyun Woo [University of Ulsan College of Medicine, Department of Radiology and Research Institute of Radiology, Asan Medical Center, Seoul (Korea, Republic of)

    2013-03-15

    With the rapid evolution of the multidetector row CT technique, quantitative CT has begun to be used in clinical studies to reveal the heterogeneous nature of airflow limitation in chronic obstructive pulmonary disease, which is caused by a combination of lung parenchymal destruction and remodeling of the small airways in adults. There is growing evidence of a good correlation between quantitative CT findings and pathological findings, pulmonary function test results and other clinical parameters. This article provides an overview of current quantitative thoracic CT techniques used in adults, and of how to translate these CT techniques to the pediatric population. (orig.)

  2. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  3. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  4. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has a similar appearance, but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analyses, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and the Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
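
    The compressed-sensing idea can be illustrated with an l1-penalized fit over a dictionary of sinusoids: a sparse solver selects the few frequencies that jointly explain the unevenly sampled velocities. The toy sketch below (scikit-learn's Lasso on synthetic one-planet data) is not the authors' algorithm, which additionally handles correlated noise through Gaussian processes:

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(5)
    t = np.sort(rng.uniform(0, 200, 80))            # irregular epochs (days)
    rv = 3.0 * np.sin(2 * np.pi * t / 14.0) + 0.5 * rng.standard_normal(80)

    freqs = np.linspace(1 / 100, 1 / 2, 400)        # trial frequencies
    D = np.hstack([np.sin(2 * np.pi * freqs * t[:, None]),
                   np.cos(2 * np.pi * freqs * t[:, None])])

    # The l1 penalty drives most coefficients to zero: a sparse periodogram
    coef = Lasso(alpha=0.1, max_iter=50000).fit(D, rv).coef_
    power = np.hypot(coef[:400], coef[400:])        # per-frequency amplitude
    print(1 / freqs[power.argmax()])                # ~14 days recovered
    ```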

  5. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has a similar appearance, but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analyses, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and the Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  6. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  7. Downside Risk analysis applied to Hedge Funds universe

    CERN Document Server

    Perello, J

    2006-01-01

    Hedge Funds are considered one of the portfolio management sectors that has shown the fastest growth over the past decade. Optimal Hedge Fund management requires high precision risk evaluation and an appropriate risk metric. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the simplicity of the CAPM, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is: returns greater (or lower) than the investor's goal. We study several risk indicators using the Gaussian case as a benchmark and apply them to the Credit Suisse/Tremont Investable Hedge Fund Index data.
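
    The distinguishing computation is simple: only returns below the investor's goal enter the risk denominator. A minimal sketch using the Sortino ratio as a representative downside-risk indicator, on synthetic fat-tailed returns:

    ```python
    import numpy as np

    def downside_deviation(returns: np.ndarray, goal: float = 0.0) -> float:
        """Root-mean-square of shortfalls below the goal: only 'bad'
        returns contribute, unlike ordinary volatility."""
        shortfall = np.minimum(returns - goal, 0.0)
        return float(np.sqrt(np.mean(shortfall ** 2)))

    def sortino_ratio(returns: np.ndarray, goal: float = 0.0) -> float:
        """Downside-risk counterpart of the Sharpe ratio."""
        return (np.mean(returns) - goal) / downside_deviation(returns, goal)

    # Skewed, fat-tailed monthly returns as a stand-in for a hedge-fund series
    rng = np.random.default_rng(11)
    r = (0.008 + 0.02 * rng.standard_t(df=4, size=240)
         - 0.01 * rng.exponential(0.2, 240))
    print(sortino_ratio(r, goal=0.0))
    ```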

  8. Evaluating the effectiveness of teacher training in Applied Behaviour Analysis.

    Science.gov (United States)

    Grey, Ian M; Honan, Rita; McClean, Brian; Daly, Michael

    2005-09-01

    Interventions for children with autism based upon Applied Behaviour Analysis (ABA) have been repeatedly shown to be related both to educational gains and to reductions in challenging behaviours. However, to date, comprehensive training in ABA for teachers and others has been limited. Over 7 months, 11 teachers undertook 90 hours of classroom instruction and supervision in ABA. Each teacher conducted a comprehensive functional assessment and designed a behaviour support plan targeting one behaviour for one child with an autistic disorder. Target behaviours included aggression, non-compliance and specific educational skills. Teachers recorded observational data on the target behaviour for both baseline and intervention sessions. Support plans produced an average 80 percent change in the frequency of occurrence of target behaviours. Questionnaires completed by parents and teachers at the end of the course indicated a beneficial effect for the children and the educational environment. The potential benefits of teacher-implemented behavioural intervention are discussed. PMID:16144826

  9. Applied analysis of recurrent events: a practical overview

    NARCIS (Netherlands)

    Twisk, J.W.R.; Smidt, N.; Vente, de W.

    2005-01-01

    STUDY OBJECTIVE: The purpose of this paper is to give an overview and comparison of different easily applicable statistical techniques to analyse recurrent event data. SETTING: These techniques include naive techniques and longitudinal techniques such as Cox regression for recurrent events, generali

  10. NDE of Advanced Automotive Composite Materials that Apply Ultrasound Infrared Thermography Technique

    Science.gov (United States)

    Choi, Seung-Hyun; Park, Soo-Keun; Kim, Jae-Yeol

    The infrared thermographic nondestructive inspection technique is a quality inspection and stability assessment method used to diagnose physical characteristics and defects by detecting the infrared radiation emitted from an object without destroying it. Recently, nondestructive inspection and assessment using the ultrasound-infrared thermography technique have been widely adopted in diverse areas. The ultrasound-infrared thermography technique exploits the phenomenon that ultrasound waves incident on an object with cracks or defects on its mating surfaces generate local heat at the surface. The car industry increasingly uses composite materials for their light weight, strength, and environmental resistance. In this study, a car piston, as an example of a composite-material car part, was subjected to the ultrasound-infrared thermography technique for nondestructive testing. This study also examined the effects of frequency and power in order to optimize the nondestructive inspection.

  11. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C. [and others]

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  12. Flipped parameter technique applied on source localization in energy constraint sensor arrays

    OpenAIRE

    Pavlović Vlastimir D.; Veličković Zoran S.

    2009-01-01

    In this paper a novel flipped parameter technique (FPT) for time delay estimation (TDE) in the source localization problem is described. We propose a passive source localization technique based on the development of an energy-efficient algorithm that can reduce intersensor and interarray communication. We propose a flipped parameter (FP) which can be defined for any sensor in distributed sensor subarrays during the observation period. Unlike classical TDE methods that evaluate cross-correlation funct...
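
    For reference, the classical cross-correlation TDE baseline that the abstract contrasts itself with fits in a few lines: choose the lag that maximizes the cross-correlation between two sensor records. Synthetic data below:

    ```python
    import numpy as np

    def estimate_delay(x: np.ndarray, y: np.ndarray, fs: float) -> float:
        """Classical TDE: lag (in seconds) maximizing the cross-correlation
        of two sensor records; positive means x lags y."""
        corr = np.correlate(x - x.mean(), y - y.mean(), mode="full")
        lag_samples = corr.argmax() - (len(y) - 1)
        return lag_samples / fs

    fs = 1000.0                                   # Hz
    t = np.arange(0, 1, 1 / fs)
    src = np.sin(2 * np.pi * 12 * t) * np.exp(-3 * t)
    delayed = np.roll(src, 25)                    # 25 samples = 25 ms later
    print(estimate_delay(delayed, src, fs))       # ~0.025 s
    ```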

  13. Measurement of the magnitude of force applied by students when learning a mobilisation technique

    OpenAIRE

    Smit, E.; Conradie, M; Wessels, J.; I. Witbooi; Otto, R.

    2003-01-01

    Passive accessory intervertebral movements (PAIVM’s) are frequently used by physiotherapists in the assessment and management of patients. Studies investigating the reliability of passive mobilisation techniques have shown conflicting results. Therefore, standardisation of PAIVM’s is essential for research and teaching purposes, which could result in better clinical management. In order to standardise graded passive mobilisation techniques, a reliable, easy-to-use, objective measurement tool...

  14. Using Link Analysis Technique with a Modified Shortest-Path Algorithm to Fight Money Laundering

    Institute of Scientific and Technical Information of China (English)

    CHEN Yunkai; MAI Quanwen; LU Zhengding

    2006-01-01

    Effective link analysis techniques are needed to help law enforcement and intelligence agencies fight money laundering. This paper presents a link analysis technique that uses a modified shortest-path algorithm to identify the strongest association paths between entities in a money laundering network. Based on the two-tree Dijkstra and Priority-First-Search (PFS) algorithms, a modified algorithm is presented. To apply the algorithm, a network representation transformation is made first.
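
    Finding a strongest association path reduces to a standard shortest-path problem once each edge strength s in (0, 1] is mapped to a length of -log s, since maximizing a product of strengths is the same as minimizing a sum of lengths. The sketch below uses plain single-tree Dijkstra, a simplification of the paper's two-tree variant, on a toy network:

    ```python
    import heapq
    import math

    def strongest_association_path(graph, source, target):
        """Strongest path when each edge carries a strength in (0, 1].

        graph: dict mapping node -> list of (neighbor, strength) pairs.
        Returns (path, product of strengths along it).
        """
        dist, prev = {source: 0.0}, {}
        heap, seen = [(0.0, source)], set()
        while heap:
            d, u = heapq.heappop(heap)
            if u in seen:
                continue
            seen.add(u)
            if u == target:
                break
            for v, strength in graph.get(u, []):
                nd = d - math.log(strength)        # -log turns product into sum
                if nd < dist.get(v, math.inf):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(heap, (nd, v))
        path, node = [], target
        while node != source:
            path.append(node)
            node = prev[node]
        return [source] + path[::-1], math.exp(-dist[target])

    # Toy money-laundering network: accounts linked by association strength
    g = {"A": [("B", 0.9), ("C", 0.4)], "B": [("D", 0.8)], "C": [("D", 0.9)]}
    print(strongest_association_path(g, "A", "D"))  # (['A', 'B', 'D'], 0.72)
    ```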

  15. Validation of Design and Analysis Techniques of Tailored Composite Structures

    Science.gov (United States)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed form analysis based on a theoretical model of a single cell tailored box beam and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed form code consistently predicts the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed form analysis. Differences between the closed form code representation and the wing box specimen caused large errors in the twist prediction. The closed form analysis prediction of twist has not been validated from this test.

  16. Performance values for non destructive assay (NDA) techniques applied to safeguards: the 2002 evaluation by the ESARDA NDA Working Group

    International Nuclear Information System (INIS)

    The first evaluation of NDA performance values undertaken by the ESARDA Working Group for Standards and Non Destructive Assay Techniques (WGNDA) was published in 1993. Almost 10 years later the Working Group decided to review those values, to report on improvements and to issue new performance values for techniques which were not applied in the early nineties, or were at that time only emerging. Non-Destructive Assay techniques have become more and more important in recent years, and they are used to a large extent in nuclear material accountancy and control both by operators and control authorities. As a consequence, the performance evaluation for NDA techniques is of particular relevance to safeguards authorities in optimising Safeguards operations and reducing costs. Performance values are important also for NMAC regulators, to define detection levels, limits for anomalies, goal quantities and to negotiate basic audit rules. This paper presents the latest evaluation of ESARDA Performance Values (EPVs) for the most common NDA techniques currently used for the assay of nuclear materials for Safeguards purposes. The main topics covered by the document are: techniques for plutonium bearing materials: PuO2 and MOX; techniques for U-bearing materials; techniques for U and Pu in liquid form; techniques for spent fuel assay. This issue of the performance values is the result of specific international round robin exercises, field measurements and ad hoc experiments, evaluated and discussed in the ESARDA NDA Working Group. (author)

  17. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    Directory of Open Access Journals (Sweden)

    Árpád Gyéresi

    2013-02-01

    Full Text Available Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; consequently, micelles are formed, which undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.
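
    The quantitative core of MEKC is the retention factor of a neutral analyte, conventionally computed from three migration times (Terabe's equation). A minimal sketch, with illustrative times rather than values from the paper:

```python
def mekc_retention_factor(t_r, t_0, t_mc):
    """Retention factor k of a neutral analyte in MEKC (Terabe's equation):
    t_0  = migration time of an unretained (EOF) marker,
    t_mc = migration time of a micelle marker,
    t_r  = migration time of the analyte (t_0 <= t_r <= t_mc)."""
    return (t_r - t_0) / (t_0 * (1.0 - t_r / t_mc))

# Illustrative migration times in minutes (assumed, not from the paper):
print(round(mekc_retention_factor(t_r=6.0, t_0=4.0, t_mc=12.0), 2))  # -> 1.0
```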

  18. Feasibility to apply the steam assisted gravity drainage (SAGD) technique in the country's heavy crude-oil fields

    International Nuclear Information System (INIS)

    The steam assisted gravity drainage (SAGD) processes are among the most efficient and profitable technologies for the production of heavy crude oils and oil sands. These processes involve the drilling of a pair of parallel horizontal wells, separated by a vertical distance and located near the base of the oil field. The upper well is used to continuously inject steam into the zone of interest, while the lower well collects all resulting fluids (oil, condensate and formation water) and takes them to the surface (Butler, 1994). This technology has been successfully implemented in countries such as Canada, Venezuela and the United States, reaching recovery factors in excess of 50%. This article provides an overview of the technique's operating mechanism and the process's most relevant characteristics, as well as the various categories this technology is divided into, including their advantages and limitations. Furthermore, the article sets out the minimum reservoir conditions under which the SAGD process is efficient; integrated into a series of mathematical models, these conditions allow forecasts of production, thermal efficiency (ODR) and recoverable oil, provided it is technically feasible to apply this technique to a given oil field. The information and concepts compiled during this research prompted the development of software, which may be used as an information, analysis and interpretation tool to predict and quantify this technology's performance. Based on the article, preliminary studies were started for the country's heavy crude-oil fields, identifying which ones provide the minimum conditions for the successful development of a pilot project
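
    The drainage-rate estimate that underlies most SAGD screening work is Butler's closed-form expression, from the reference cited above. A minimal sketch with illustrative reservoir numbers (assumptions, not data from the article):

```python
import math

def butler_drainage_rate(L, k, alpha, phi, dSo, h, m, nu_s, g=9.81):
    """Butler's steady-state SAGD drainage-rate estimate (m3/s):
        q = 2 L sqrt(2 k g alpha phi dSo h / (m nu_s))
    L: well length (m), k: effective permeability (m2), alpha: thermal
    diffusivity (m2/s), phi: porosity, dSo: mobile oil saturation,
    h: drainage height (m), m: viscosity-temperature exponent (~3-5),
    nu_s: oil kinematic viscosity at steam temperature (m2/s)."""
    return 2.0 * L * math.sqrt(2.0 * k * g * alpha * phi * dSo * h / (m * nu_s))

# Illustrative Athabasca-like inputs (assumed, not field data):
q = butler_drainage_rate(L=500, k=2e-12, alpha=7e-7, phi=0.33,
                         dSo=0.6, h=20, m=4, nu_s=1e-5)
print(f"estimated drainage rate: {q * 86400:.0f} m3/day")
```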

  19. Applying Conjoint Analysis to Study Attitudes of Thai Government Organisations

    Directory of Open Access Journals (Sweden)

    Natee Suriyanon

    2012-11-01

    Full Text Available This article presents the application of choice-based conjoint analysis to analyse the attitude of Thai government organisations towards the restriction of the contractor’s right to claim compensation for unfavourable effects from undesirable events. The analysis reveals that the organisations want to restrict only 6 out of 14 types of the claiming rights that were studied. The right that they want to restrict most is the right to claim for additional direct costs due to force majeure. They are willing to pay between 0.087% - 0.210% of the total project direct cost for restricting each type of contractor right. The total additional cost for restricting all six types of rights that the organisations are willing to pay is 0.882%. The last section of this article applies the knowledge gained from a choice based conjoint analysis experiment to the analysis of the standard contract of the Thai government. The analysis reveals three types of rights where Thai government organisations are willing to forego restrictions, but the present standard contract does not grant such rights.
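
    Choice-based conjoint data of this kind are typically analysed with a conditional (multinomial) logit model. The sketch below estimates part-worth utilities from synthetic choice sets; it illustrates the general method only, not the authors' survey design or data.

```python
import numpy as np
from scipy.optimize import minimize

# Synthetic choice sets: each respondent picks 1 of 3 contract alternatives
# described by 3 attributes (e.g. rights restricted, price premium).
rng = np.random.default_rng(0)
true_beta = np.array([1.0, -0.5, 0.8])
X = [rng.normal(size=(3, 3)) for _ in range(300)]
y = [rng.choice(3, p=np.exp(x @ true_beta) / np.exp(x @ true_beta).sum()) for x in X]

def neg_log_lik(beta):
    """Conditional logit: log-probability of the chosen alternative per set."""
    nll = 0.0
    for x, choice in zip(X, y):
        u = x @ beta
        nll -= u[choice] - (u.max() + np.log(np.exp(u - u.max()).sum()))
    return nll

beta_hat = minimize(neg_log_lik, np.zeros(3), method="BFGS").x
print("true part-worths:", true_beta, "estimated:", beta_hat.round(2))
```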

  20. Analysis of Jugular Foramen Exposure in the Fallopian Bridge Technique

    OpenAIRE

    Satar, Bulent; Yazar, Fatih; Aykut CEYHAN; Arslan, Hasan Huseyin; Aydin, Sedat

    2009-01-01

    Objective: To analyze the exposure of the jugular foramen afforded by the fallopian bridge technique. Method: The jugular foramen exposure was obtained using the jugular foramen approach combined with the fallopian bridge technique. We applied this technique using 10 temporal bone specimens at a tertiary referral center. The exposure was assessed by means of depth of the dissection field and two separate dissection spaces that were created anteriorly and posteriorly to the facial nerve. Anter...

  1. Statistical learning techniques applied to epidemiology: a simulated case-control comparison study with logistic regression

    Directory of Open Access Journals (Sweden)

    Land Walker H

    2011-01-01

    Full Text Available Abstract Background When investigating covariate interactions and group associations with standard regression analyses, the relationship between the response variable and exposure may be difficult to characterize. When the relationship is nonlinear, linear modeling techniques do not capture the nonlinear information content. Statistical learning (SL) techniques with kernels are capable of addressing nonlinear problems without making parametric assumptions. However, these techniques do not produce findings relevant for epidemiologic interpretations. A simulated case-control study was used to contrast the information embedding characteristics and separation boundaries produced by a specific SL technique with logistic regression (LR) modeling representing a parametric approach. The SL technique consisted of a kernel mapping in combination with a perceptron neural network. Because the LR model has an important epidemiologic interpretation, the SL method was modified to produce the analogous interpretation and generate odds ratios for comparison. Results The SL approach is capable of generating odds ratios for main effects and risk factor interactions that better capture nonlinear relationships between exposure variables and outcome in comparison with LR. Conclusions The integration of SL methods in epidemiology may improve both the understanding and interpretation of complex exposure/disease relationships.
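
    As a loose sketch of the contrast drawn here, the snippet below fits a plain logistic regression (whose exponentiated coefficients are odds ratios) and a kernelized variant on synthetic nonlinear data. The kernel map plus linear classifier stands in for the paper's kernel-plus-perceptron model, which is not reproduced exactly; Nystroem and make_moons are standard scikit-learn utilities used purely for illustration.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.kernel_approximation import Nystroem
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Synthetic nonlinear "case-control" data.
X, y = make_moons(n_samples=500, noise=0.3, random_state=0)

# Parametric baseline: exponentiated LR coefficients are odds ratios.
lr = LogisticRegression().fit(X, y)
print("LR odds ratios:", np.exp(lr.coef_).round(2))

# SL-style model: kernel feature map followed by a linear classifier.
kernel_lr = make_pipeline(Nystroem(kernel="rbf", gamma=1.0, random_state=0),
                          LogisticRegression(max_iter=1000)).fit(X, y)
print(f"LR accuracy: {lr.score(X, y):.3f}, "
      f"kernel LR accuracy: {kernel_lr.score(X, y):.3f}")
```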

  2. Flipped parameter technique applied on source localization in energy constraint sensor arrays

    Directory of Open Access Journals (Sweden)

    Pavlović Vlastimir D.

    2009-01-01

    Full Text Available In this paper a novel flipped parameter technique (FPT) for time delay estimation (TDE) in the source localization problem is described. We propose a passive source localization technique based on the development of an energy-efficient algorithm that can reduce intersensor and interarray communication. We propose a flipped parameter (FP), which can be defined for any sensor in distributed sensor subarrays during the observation period. Unlike classical TDE methods that evaluate a cross-correlation function, FPT requires evaluation based upon a single sensor's signal. The computed cross-correlation between a signal and its analytic 'flipped' pair (the flipped correlation) is a smooth function whose peak (the time delay) can be accurately detected. Flipped parameters are sufficient to determine all differential delays of the signals related to the same source. The flipped parameter technique can be used successfully in two-step methods of passive source localization with significantly less energy than classic cross-correlation. The FPT method is especially significant for energy-constrained distributed sensor subarrays. Using synthetic seismic signals, we illustrate the source localization error of the classical and proposed methods in the presence of noise. Using real signals, we demonstrate the performance improvement of the proposed technique over classic methods in noisy environments. The proposed technique gives accurate results for both coherent and non-coherent signals.
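
    One plausible reading of the construction (an interpretation, not the authors' code): correlating a signal with its time-reversed copy is an auto-convolution, whose peak each sensor can compute locally from its own data; differencing two sensors' peak positions and halving yields their differential delay. A sketch with a smooth synthetic wavelet:

```python
import numpy as np

def flipped_parameter(x):
    """Peak position of the correlation of x with its time-reversed copy,
    i.e. the auto-convolution; computable from a single sensor's signal."""
    return int(np.argmax(np.convolve(x, x)))

t = np.arange(64)
s = np.exp(-0.5 * ((t - 32) / 4.0) ** 2)        # smooth source wavelet
d1, d2 = 10, 25                                  # unknown arrival delays (samples)
x1 = np.concatenate([np.zeros(d1), s, np.zeros(40 - d1)])
x2 = np.concatenate([np.zeros(d2), s, np.zeros(40 - d2)])

# Each sensor transmits one scalar; differencing and halving gives the TDE.
p1, p2 = flipped_parameter(x1), flipped_parameter(x2)
print("estimated differential delay:", (p2 - p1) // 2, "true:", d2 - d1)
```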

  3. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique with different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods in the univariate case, but poorer results in the bivariate case. In particular the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
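
    The kernel idea can be stated compactly: weight every observation pair by a Gaussian in the difference between its time offset and the trial lag, instead of interpolating onto a regular grid. A minimal sketch in that spirit (synthetic data, not the speleothem records):

```python
import numpy as np

def gaussian_kernel_corr(tx, x, ty, y, lag, sigma):
    """Kernel-weighted cross-correlation of two irregularly sampled,
    standardized series at a given lag (no interpolation involved)."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None] - lag       # all pairwise time offsets
    w = np.exp(-0.5 * (dt / sigma) ** 2)       # Gaussian weights around the lag
    return np.sum(w * np.outer(x, y)) / np.sum(w)

rng = np.random.default_rng(42)
tx = np.sort(rng.uniform(0, 100, 120))         # irregular observation times
x = np.sin(0.3 * tx) + 0.2 * rng.normal(size=120)
ty = np.sort(rng.uniform(0, 100, 90))
y = np.sin(0.3 * (ty - 5.0)) + 0.2 * rng.normal(size=90)  # y lags x by ~5 units

lags = np.arange(-15, 16)
ccf = [gaussian_kernel_corr(tx, x, ty, y, h, sigma=1.0) for h in lags]
print("CCF peak at lag:", lags[int(np.argmax(ccf))])      # expect ~ +5
```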

  4. Digital filtering techniques applied to electric power systems protection; Tecnicas de filtragem digital aplicadas a protecao de sistemas eletricos de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Brito, Helio Glauco Ferreira

    1996-12-31

    This work introduces an analysis and a comparative study of some of the techniques for digital filtering of the voltage and current waveforms from faulted transmission lines. This study is of fundamental importance for the development of algorithms applied to the digital protection of electric power systems. The techniques studied are based on Discrete Fourier Transform theory, the Walsh functions and Kalman filter theory. Two aspects were emphasized in this study: firstly, the non-recursive techniques were analyzed, with the implementation of filters based on Fourier theory and the Walsh functions; secondly, recursive techniques were analyzed, with the implementation of filters based on Kalman theory and once more on Fourier theory. (author) 56 refs., 25 figs., 16 tabs.
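
    As a concrete example of the non-recursive Fourier family discussed here, the sketch below implements the full-cycle DFT filter commonly used in digital relays to extract the fundamental phasor from one cycle of samples; the test signal is illustrative, not taken from the thesis.

```python
import numpy as np

def full_cycle_dft_phasor(window):
    """Fundamental-frequency phasor from one cycle of N samples
    (the non-recursive full-cycle Fourier filter used in digital relays)."""
    N = len(window)
    n = np.arange(N)
    X = (2.0 / N) * np.sum(window * np.exp(-2j * np.pi * n / N))
    return np.abs(X), np.degrees(np.angle(X))

# 50 Hz test signal, 16 samples/cycle, with a decaying DC offset and a
# 3rd harmonic (the harmonic is exactly rejected by the filter).
N = 16
t = np.arange(N) / (N * 50.0)
x = (100 * np.cos(2 * np.pi * 50 * t + np.radians(30))
     + 20 * np.exp(-t / 0.02)
     + 10 * np.cos(2 * np.pi * 150 * t))
mag, ph = full_cycle_dft_phasor(x)
print(f"|X| = {mag:.1f} (true 100), phase = {ph:.1f} deg (true 30)")
```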

  5. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    CERN Document Server

    von Hippel, Ted

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants a...

  6. UQ and V&V techniques applied to experiments and simulations of heated pipes pressurized to failure.

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose; Dempsey, J. Franklin; Antoun, Bonnie R.

    2014-05-01

    This report demonstrates versatile and practical model validation and uncertainty quantification techniques applied to the accuracy assessment of a computational model of heated steel pipes pressurized to failure. The Real Space validation methodology segregates aleatory and epistemic uncertainties to form straightforward model validation metrics especially suited for assessing models to be used in the analysis of performance and safety margins. The methodology handles difficulties associated with representing and propagating interval and/or probabilistic uncertainties from multiple correlated and uncorrelated sources in the experiments and simulations including: material variability characterized by non-parametric random functions (discrete temperature dependent stress-strain curves); very limited (sparse) experimental data at the coupon testing level for material characterization and at the pipe-test validation level; boundary condition reconstruction uncertainties from spatially sparse sensor data; normalization of pipe experimental responses for measured input-condition differences among tests and for random and systematic uncertainties in measurement/processing/inference of experimental inputs and outputs; numerical solution uncertainty from model discretization and solver effects.

  7. Reformulation linearization technique based branch-and-reduce approach applied to regional water supply system planning

    Science.gov (United States)

    Lan, Fujun; Bayraksan, Güzin; Lansey, Kevin

    2016-03-01

    A regional water supply system design problem that determines pipe and pump design parameters and water flows over a multi-year planning horizon is considered. A non-convex nonlinear model is formulated and solved by a branch-and-reduce global optimization approach. The lower bounding problem is constructed via a three-pronged effort that involves transforming the space of certain decision variables, polyhedral outer approximations, and the Reformulation Linearization Technique (RLT). Range reduction techniques are employed systematically to speed up convergence. Computational results demonstrate the efficiency of the proposed algorithm; in particular, the critical role range reduction techniques could play in RLT based branch-and-bound methods. Results also indicate using reclaimed water not only saves freshwater sources but is also a cost-effective non-potable water source in arid regions. Supplemental data for this article can be accessed at http://dx.doi.org/10.1080/0305215X.2015.1016508.
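
    The basic RLT step for a bilinear term can be illustrated with the McCormick envelope: multiplying pairs of bound constraints and dropping the nonlinear product yields four linear cuts that bracket w = x*y. A small numerical check (illustrative bounds, not the water-network model):

```python
import numpy as np

def mccormick_envelope(xl, xu, yl, yu):
    """Linear under/over-estimators for w = x*y obtained by multiplying the
    bound constraints (x - xl) >= 0, (xu - x) >= 0, etc. -- the basic RLT step.
    Returns lower-bound and upper-bound functions for w."""
    lower = [lambda x, y: xl * y + x * yl - xl * yl,
             lambda x, y: xu * y + x * yu - xu * yu]
    upper = [lambda x, y: xu * y + x * yl - xu * yl,
             lambda x, y: xl * y + x * yu - xl * yu]
    return lower, upper

# Verify on a grid that the envelope brackets the true product.
xl, xu, yl, yu = 0.0, 2.0, 1.0, 3.0
lower, upper = mccormick_envelope(xl, xu, yl, yu)
xs, ys = np.meshgrid(np.linspace(xl, xu, 21), np.linspace(yl, yu, 21))
w = xs * ys
lo = np.maximum(*[f(xs, ys) for f in lower])
hi = np.minimum(*[f(xs, ys) for f in upper])
print("envelope valid:", bool(np.all(lo <= w + 1e-12) and np.all(w <= hi + 1e-12)))
```

    Range reduction then tightens (xl, xu, yl, yu), which is exactly why the paper finds it so effective: tighter boxes make these cuts, and hence the lower bounds, sharper.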

  8. New Control Technique Applied in Dynamic Voltage Restorer for Voltage Sag Mitigation

    Directory of Open Access Journals (Sweden)

    Rosli Omar

    2010-01-01

    Full Text Available The Dynamic Voltage Restorer (DVR) is a power-electronics device able to compensate voltage sags on critical loads dynamically. The DVR consists of a VSC, injection transformers, passive filters and energy storage (a lead-acid battery). By injecting an appropriate voltage, the DVR restores the voltage waveform and ensures a constant load voltage. Many types of control techniques are used in DVRs for mitigating voltage sags; the efficiency of the DVR depends on the efficiency of the control technique involved in switching the inverter. Problem statement: Simulation and experimental investigation toward new algorithm development based on SVPWM, and understanding of the nature of the DVR and performance comparisons between the various controller technologies available. The proposed controller using space vector modulation techniques obtains higher amplitude modulation indexes compared with conventional SPWM techniques. Moreover, space vector modulation techniques can be easily implemented using digital processors. Space vector PWM can produce about 15% higher output voltage than standard sinusoidal PWM. Approach: The purpose of this research was to study the implementation of SVPWM in a DVR. The proposed control algorithm was investigated through computer simulation using PSCAD/EMTDC software. Results: Simulation and experimental results showed the effectiveness and efficiency of the proposed controller based on SVPWM in mitigating voltage sags in low-voltage distribution systems. It was concluded that the controller also works well under both balanced and unbalanced voltage conditions. Conclusion/Recommendations: The simulation and experimental results of a DVR using PSCAD/EMTDC software based on the SVPWM technique showed clearly the performance of the DVR in mitigating voltage sags. The DVR operates without any difficulties and injects the appropriate voltage component to correct rapidly any anomaly in the supply voltage to keep the
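
    For reference, the core SVPWM computation is short: locate the sector of the reference vector and split the switching period between the two adjacent active vectors and the zero vectors. A minimal sketch of the textbook formulation (not the authors' PSCAD/EMTDC implementation):

```python
import math

def svpwm_dwell_times(v_ref, theta_deg, v_dc, t_s):
    """Dwell times of the two active vectors and the zero vectors in one
    switching period T_s (linear range: v_ref <= v_dc / sqrt(3))."""
    sector = int(theta_deg // 60) % 6 + 1          # sectors 1..6
    theta = math.radians(theta_deg % 60)           # angle inside the sector
    m = math.sqrt(3) * v_ref / v_dc                # modulation index
    t1 = t_s * m * math.sin(math.pi / 3 - theta)   # first active vector
    t2 = t_s * m * math.sin(theta)                 # second active vector
    t0 = t_s - t1 - t2                             # zero vectors (split V0/V7)
    return sector, t1, t2, t0

# Example: 230 V reference at 75 deg, 600 V DC link, 100 us switching period.
print(svpwm_dwell_times(v_ref=230.0, theta_deg=75.0, v_dc=600.0, t_s=100e-6))
```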

  9. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  10. Effect of the reinforcement bar arrangement on the efficiency of electrochemical chloride removal technique applied to reinforced concrete structures

    Energy Technology Data Exchange (ETDEWEB)

    Garces, P. [Dpto. Ing. de la Construccion, Obras Publicas e Infraestructura Urbana, Universidad de Alicante. Apdo. 99, E-03080 Alicante (Spain)]. E-mail: pedro.garces@ua.es; Sanchez de Rojas, M.J. [Dpto. Ing. de la Construccion, Obras Publicas e Infraestructura Urbana, Universidad de Alicante. Apdo. 99, E-03080 Alicante (Spain); Climent, M.A. [Dpto. Ing. de la Construccion, Obras Publicas e Infraestructura Urbana, Universidad de Alicante. Apdo. 99, E-03080 Alicante (Spain)

    2006-03-15

    This paper reports on the research done to find out the effect that different bar arrangements may have on the efficiency of the electrochemical chloride removal (ECR) technique when applied to a reinforced concrete structural member. Five different types of bar arrangements were considered, corresponding to typical structural members such as columns (with single and double bar reinforcing), slabs, beams and footings. ECR was applied in several steps. We observe that the extraction efficiency depends on the reinforcing bar arrangement. A uniform layer set-up favours chloride extraction. Electrochemical techniques were also used to estimate the reinforcing bar corrosion states, as well as measure the corrosion potential, and instant corrosion rate based on the polarization resistance technique. After ECR treatment, a reduction in the corrosion levels is observed falling short of the depassivation threshold.

  11. U P1, an example for advanced techniques applied to high level activity dismantling

    International Nuclear Information System (INIS)

    The UP1 plant on the CEA Marcoule site was dedicated to the processing of spent fuel from the G1, G2 and G3 plutonium-producing reactors. This plant represents 20,000 m2 of workshops housing about 1000 hot cells. In 1998, a huge program for the dismantling and clean-up of the UP1 plant was launched. CEA has developed new techniques to face the complexity of the dismantling operations. These techniques include immersive virtual reality, laser cutting, a specific manipulator arm called MAESTRO, and remote handling. (A.C.)

  12. Applied predictive analytics principles and techniques for the professional data analyst

    CERN Document Server

    Abbott, Dean

    2014-01-01

    Learn the art and science of predictive analytics - techniques that get results Predictive analytics is what translates big data into meaningful, usable business information. Written by a leading expert in the field, this guide examines the science of the underlying algorithms as well as the principles and best practices that govern the art of predictive analytics. It clearly explains the theory behind predictive analytics, teaches the methods, principles, and techniques for conducting predictive analytics projects, and offers tips and tricks that are essential for successful predictive mode

  13. Development of Characterization Techniques of Thermodynamic and Physical Properties Applied to the CO2-DMSO Mixture

    OpenAIRE

    Calvignac, Brice; Rodier, Elisabeth; Letourneau, Jean-Jacques; Fages, Jacques

    2009-01-01

    This work is focused on the development of new characterization techniques of physical and thermodynamic properties. These techniques have been validated using the binary system DMSO-CO2 for which several studies of characterization have been well documented. We focused on the DMSO-rich phase and we carried out measurements of volumetric expansion, density, viscosity and CO2 solubility at 298.15, 308.15 and 313.15 K and pressures up to 9 MPa. The experimental procedu...

  14. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.

  15. Common cause evaluations in applied risk analysis of nuclear power plants

    International Nuclear Information System (INIS)

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights in the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system
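
    The basic beta-factor arithmetic that the paper extends is easy to state: a fraction beta of each component's failure probability is assumed common to all trains at once. A minimal sketch for the one-out-of-three case (illustrative numbers; the paper's improved model is not reproduced):

```python
def one_out_of_three_unavailability(q, beta):
    """Beta-factor model: a component's failure probability q splits into an
    independent part (1-beta)*q and a common-cause part beta*q that fails all
    three trains together. A 1-out-of-3 system fails only if all trains fail."""
    q_ind = (1.0 - beta) * q
    q_ccf = beta * q
    return q_ind ** 3 + q_ccf

q = 1e-2
for beta in (0.0, 0.05, 0.10):
    print(f"beta={beta:.2f}: Q(1oo3) = {one_out_of_three_unavailability(q, beta):.2e}")
```

    Even a small beta dominates the cube of the independent term, which is why CCF treatment changes 1-out-of-3 results so drastically.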

  16. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support.

    Science.gov (United States)

    Anderson, Cynthia M; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  17. X-ray micro-beam techniques and phase contrast tomography applied to biomaterials

    Science.gov (United States)

    Fratini, Michela; Campi, Gaetano; Bukreeva, Inna; Pelliccia, Daniele; Burghammer, Manfred; Tromba, Giuliana; Cancedda, Ranieri; Mastrogiacomo, Maddalena; Cedola, Alessia

    2015-12-01

    A deeper comprehension of the biomineralization (BM) process is at the basis of tissue engineering and regenerative medicine developments. Several in-vivo and in-vitro studies were dedicated to this purpose via the application of 2D and 3D diagnostic techniques. Here, we develop a new methodology, based on different complementary experimental techniques (X-ray phase contrast tomography, micro-X-ray diffraction and micro-X-ray fluorescence scanning technique) coupled to new analytical tools. A qualitative and quantitative structural investigation, from the atomic to the micrometric length scale, is obtained for engineered bone tissues. The high spatial resolution achieved by X-ray scanning techniques allows us to monitor the bone formation at the first-formed mineral deposit at the organic-mineral interface within a porous scaffold. This work aims at providing a full comprehension of the morphology and functionality of the biomineralization process, which is of key importance for developing new drugs for preventing and healing bone diseases and for the development of bio-inspired materials.

  18. Study of Phase Reconstruction Techniques applied to Smith-Purcell Radiation Measurements

    CERN Document Server

    Delerue, Nicolas; Vieille-Grosjean, Mélissa; Bezshyyko, Oleg; Khodnevych, Vitalii

    2014-01-01

    Measurements of coherent radiation at accelerators typically give the absolute value of the beam profile Fourier transform but not its phase. Phase reconstruction techniques such as the Hilbert transform or Kramers-Kronig reconstruction are used to recover this phase. We report a study of the performance of these methods and of how to optimize the reconstructed profiles.
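
    A common discrete form of this reconstruction takes the phase to be minus the Hilbert transform of the log spectral magnitude (the minimum-phase / Kramers-Kronig assumption). A sketch on a synthetic minimum-phase profile, not actual Smith-Purcell data:

```python
import numpy as np
from scipy.signal import hilbert

def phase_from_magnitude(mag):
    """Minimum-phase (Kramers-Kronig) reconstruction: the spectral phase is
    minus the Hilbert transform of the log of the spectral magnitude."""
    log_mag = np.log(np.maximum(mag, 1e-12))   # guard against log(0)
    return -np.imag(hilbert(log_mag))

# A decaying-exponential profile is minimum-phase, so the reconstruction
# should recover it from the magnitude alone.
n = 1024
profile = np.exp(-np.arange(n) / 30.0)
mag = np.abs(np.fft.fft(profile))
rebuilt = np.real(np.fft.ifft(mag * np.exp(1j * phase_from_magnitude(mag))))
print("max reconstruction error:", float(np.max(np.abs(rebuilt - profile))))
```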

  19. Study of Phase Reconstruction Techniques applied to Smith-Purcell Radiation Measurements

    CERN Document Server

    Delerue, Nicolas; Bezshyyko, Oleg; Khodnevych, Vitalii

    2015-01-01

    Measurements of coherent radiation at accelerators typically give the absolute value of the beam profile Fourier transform but not its phase. Phase reconstruction techniques such as the Hilbert transform or Kramers-Kronig reconstruction are used to recover this phase. We report a study of the performance of these methods and of how to optimize the reconstructed profiles.

  20. Do trained practice nurses apply motivational interviewing techniques in primary care consultations?

    NARCIS (Netherlands)

    Noordman, J.; Lee, I. van der; Nielen, M.; Vlek, H.; Weijden, T. van der; Dulmen, S. van

    2012-01-01

    Background: Reducing the prevalence of unhealthy lifestyle behaviour could positively influence health. Motivational interviewing (MI) is used to promote change in unhealthy lifestyle behaviour as part of primary or secondary prevention. Whether MI is actually applied as taught is unknown. Practice

  1. Do trained practice nurses apply motivational interviewing techniques in primary care consultations?

    NARCIS (Netherlands)

    Noordman, J.; Lee, I. van; Nielen, M.; Vlek, H.; Weijden, T. van; Dulmen, S. van

    2012-01-01

    BACKGROUND: Reducing the prevalence of unhealthy lifestyle behaviour could positively influence health. Motivational interviewing (MI) is used to promote change in unhealthy lifestyle behaviour as part of primary or secondary prevention. Whether MI is actually applied as taught is unknown. Practice

  2. Applying the Management-by-Objectives Technique in an Industrial Library

    Science.gov (United States)

    Stanton, Robert O.

    1975-01-01

    An experimental "management-by-objectives" performance system was operated by the Libraries and Information Systems Center of Bell Laboratories during 1973. It was found that, though the system was very effective for work planning and the development of people, difficulties were encountered in applying it to certain classes of employees. (Author)

  3. Time-lapse motion picture technique applied to the study of geological processes

    Science.gov (United States)

    Miller, R.D.; Crandell, D.R.

    1959-01-01

    Light-weight, battery-operated timers were built and coupled to 16-mm motion-picture cameras having apertures controlled by photoelectric cells. The cameras were placed adjacent to Emmons Glacier on Mount Rainier. The film obtained confirms the view that exterior time-lapse photography can be applied to the study of slow-acting geologic processes.

  4. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…
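
    The association-rule machinery referred to here reduces to computing support, confidence and lift over transactions. A tiny self-contained sketch with invented error categories (not the study's data):

```python
from itertools import combinations

# Toy "error pattern" transactions: each set lists error types one student made.
transactions = [
    {"relative_clause", "passive_voice"},
    {"relative_clause", "passive_voice", "word_order"},
    {"word_order"},
    {"relative_clause", "passive_voice"},
    {"passive_voice", "word_order"},
]

def support(itemset):
    return sum(itemset <= t for t in transactions) / len(transactions)

# One-antecedent rules A -> B with their support, confidence and lift.
items = sorted(set().union(*transactions))
for a, b in combinations(items, 2):
    for ant, cons in ((a, b), (b, a)):
        s = support({ant, cons})
        if s == 0:
            continue
        conf = s / support({ant})
        lift = conf / support({cons})
        print(f"{ant} -> {cons}: support={s:.2f} confidence={conf:.2f} lift={lift:.2f}")
```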

  5. Correlation network analysis applied to complex biofilm communities.

    Directory of Open Access Journals (Sweden)

    Ana E Duran-Pinedo

    Full Text Available The complexity of the human microbiome makes it difficult to reveal organizational principles of the community and even more challenging to generate testable hypotheses. It has been suggested that in the gut microbiome species such as Bacteroides thetaiotaomicron are keystone in maintaining the stability and functional adaptability of the microbial community. In this study, we investigate the interspecies associations in a complex microbial biofilm applying systems biology principles. Using correlation network analysis we identified bacterial modules that represent important microbial associations within the oral community. We used dental plaque as a model community because of its high diversity and the well known species-species interactions that are common in the oral biofilm. We analyzed samples from healthy individuals as well as from patients with periodontitis, a polymicrobial disease. Using results obtained by checkerboard hybridization on cultivable bacteria we identified modules that correlated well with microbial complexes previously described. Furthermore, we extended our analysis using the Human Oral Microbe Identification Microarray (HOMIM), which includes a large number of bacterial species, among them uncultivated organisms present in the mouth. Two distinct microbial communities appeared in healthy individuals while there was one major type in disease. Bacterial modules in all communities did not overlap, indicating that bacteria were able to effectively re-associate with new partners depending on the environmental conditions. We then identified hubs that could act as keystone species in the bacterial modules. Based on those results we then cultured a not-yet-cultivated microorganism, Tannerella sp. OT286 (clone BU063). After two rounds of enrichment by a selected helper (Prevotella oris OT311) we obtained colonies of Tannerella sp. OT286 growing on blood agar plates. This system-level approach would open the possibility of
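
    In outline, such a correlation network is built by thresholding pairwise correlations of abundance profiles, then searching for modules and hubs. A minimal sketch with synthetic abundances (species labels, sample sizes and cut-offs are illustrative, not the paper's):

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
n_samples, taxa = 40, [f"sp{i}" for i in range(8)]
abund = rng.lognormal(size=(n_samples, len(taxa)))
abund[:, 1] = abund[:, 0] * rng.lognormal(sigma=0.2, size=n_samples)  # sp1 tracks sp0
abund[:, 2] = abund[:, 0] * rng.lognormal(sigma=0.2, size=n_samples)  # sp2 tracks sp0

r = np.corrcoef(np.log(abund), rowvar=False)   # species-species correlations

# Keep only strong associations as edges of the co-occurrence network.
G = nx.Graph()
G.add_nodes_from(taxa)
for i in range(len(taxa)):
    for j in range(i + 1, len(taxa)):
        if abs(r[i, j]) > 0.6:
            G.add_edge(taxa[i], taxa[j], weight=float(r[i, j]))

modules = list(nx.algorithms.community.greedy_modularity_communities(G))
hubs = sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:2]
print("modules:", [sorted(m) for m in modules])
print("candidate keystone hubs:", hubs)
```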

  6. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Directory of Open Access Journals (Sweden)

    Ted von Hippel

    Full Text Available We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  7. Fractographic principles applied to Y-TZP mechanical behavior analysis.

    Science.gov (United States)

    Ramos, Carla Müller; Cesar, Paulo Francisco; Bonfante, Estevam Augusto; Rubo, José Henrique; Wang, Linda; Borges, Ana Flávia Sanches

    2016-04-01

    The purpose of this study was to evaluate the use of fractography principles to determine the fracture toughness of Y-TZP dental ceramic in which KIc was measured fractographically using controlled-flaw beam bending techniques and to correlate the flaw distribution with the mechanical properties. The Y-TZP blocks studied were: Zirconia Zirklein (ZZ); Zirconcad (ZCA); IPS e.max ZirCad (ZMAX); and In Ceram YZ (ZYZ). Samples were prepared (16mm×4mm×2mm) according to ISO 6872 specifications and subjected to three-point bending at a crosshead speed of 0.5mm/min. Weibull probability curves (95% confidence bounds) were calculated and a contour plot with the Weibull modulus (m) versus characteristic strength (σ0) was used to examine the differences among groups. The fractured surface of each specimen was inspected in a scanning electron microscope (SEM) for qualitative and quantitative fractographic analysis. The critical defect size (c) and fracture toughness (KIc) were estimated. The fractured surfaces of the samples from all groups showed similar fractographic characteristics, except ZCA showed pores and defects. Fracture toughness and the flexural strength values were not different among the groups except for ZCA. The characteristic strength (pzirconia polycrystalline ceramics. PMID:26722988
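
    The fractographic route to toughness rests on the relation K_Ic = Y * sigma_f * sqrt(c). A minimal sketch; the geometry factor and inputs below are textbook-style assumptions, not the study's measurements:

```python
import math

def fracture_toughness(sigma_f, c, Y=1.26):
    """Fractographic estimate K_Ic = Y * sigma_f * sqrt(c):
    sigma_f = flexural strength (Pa), c = critical flaw radius (m) measured
    on the fracture surface, Y = geometry factor (~1.26 is assumed here for
    a small semicircular surface flaw; it varies with flaw shape)."""
    return Y * sigma_f * math.sqrt(c)

# Illustrative Y-TZP-like numbers (assumptions, not the study's data):
k_ic = fracture_toughness(sigma_f=900e6, c=20e-6)
print(f"K_Ic ~ {k_ic / 1e6:.1f} MPa.m^0.5")
```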

  8. Magnetic resonance techniques applied to the diagnosis and treatment of Parkinson’s disease

    Directory of Open Access Journals (Sweden)

    Benito de Celis Alonso

    2015-07-01

    Full Text Available Parkinson’s disease affects at least 10 million people worldwide. It is a neurodegenerative disease which is currently diagnosed by neurological examination. No neuroimaging investigation or blood biomarker is available to aid diagnosis and prognosis. Most effort toward diagnosis using magnetic resonance has been focused on the use of structural/anatomical neuroimaging and diffusion tensor imaging. However, deep brain stimulation, a current strategy for treating Parkinson’s disease, is guided by magnetic resonance imaging. For clinical prognosis, diagnosis and follow-up investigations, blood oxygen level–dependent magnetic resonance imaging, diffusion tensor imaging, spectroscopy and transcranial magnetic stimulation have been used. These techniques represent the state of the art in the last five years. Here, we focus on magnetic resonance techniques for the diagnosis and treatment of Parkinson’s disease.

  9. Modelling laser speckle photographs of decayed teeth by applying a digital image information technique

    Science.gov (United States)

    Ansari, M. Z.; da Silva, L. C.; da Silva, J. V. P.; Deana, A. M.

    2016-09-01

    We report on the application of a digital image model to assess early carious lesions on teeth. While decay was in its early stages, the lesions were illuminated with a laser and laser speckle images were obtained. Due to the differences in optical properties between healthy and carious tissue, the two regions produced different scatter patterns. The digital image information technique allowed us to produce colour-coded 3D surface plots of the intensity information in the speckle images, where the height (on the z-axis) and the colour in the rendering correlate with the intensity of a pixel in the image. The quantitative changes in colour component density enhance the contrast between decayed and sound tissue, and visualization of the carious lesions becomes significantly clearer. Therefore, the proposed technique may be adopted in the early diagnosis of carious lesions.
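
    The visualization described, with height and colour both encoding pixel intensity, can be reproduced in a few lines. A sketch with a synthetic speckle-like image (illustrative, not the clinical data):

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic speckle-like intensity image: a "lesion" patch is modelled as a
# region with higher mean intensity and contrast than the sound tissue.
rng = np.random.default_rng(3)
img = rng.exponential(scale=40.0, size=(128, 128))
img[40:80, 50:90] = rng.exponential(scale=90.0, size=(40, 40))
img = np.clip(img, 0, 255)

# Colour-coded 3D surface: height and colour both encode pixel intensity.
y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_surface(x, y, img, cmap="jet", linewidth=0)
ax.set(xlabel="x (px)", ylabel="y (px)", zlabel="intensity")
plt.show()
```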

  10. Magnetic Resonance Techniques Applied to the Diagnosis and Treatment of Parkinson’s Disease

    Science.gov (United States)

    de Celis Alonso, Benito; Hidalgo-Tobón, Silvia S.; Menéndez-González, Manuel; Salas-Pacheco, José; Arias-Carrión, Oscar

    2015-01-01

    Parkinson’s disease (PD) affects at least 10 million people worldwide. It is a neurodegenerative disease, which is currently diagnosed by neurological examination. No neuroimaging investigation or blood biomarker is available to aid diagnosis and prognosis. Most effort toward diagnosis using magnetic resonance (MR) has been focused on the use of structural/anatomical neuroimaging and diffusion tensor imaging (DTI). However, deep brain stimulation, a current strategy for treating PD, is guided by MR imaging (MRI). For clinical prognosis, diagnosis, and follow-up investigations, blood oxygen level-dependent MRI, DTI, spectroscopy, and transcranial magnetic stimulation have been used. These techniques represent the state of the art in the last 5 years. Here, we focus on MR techniques for the diagnosis and treatment of Parkinson’s disease. PMID:26191037

  11. Wavelet-based Adaptive Techniques Applied to Turbulent Hypersonic Scramjet Intake Flows

    CERN Document Server

    Frauholz, Sarah; Reinartz, Birgit U; Müller, Siegfried; Behr, Marek

    2013-01-01

    The simulation of hypersonic flows is computationally demanding due to large gradients of the flow variables caused by strong shock waves and thick boundary or shear layers. The resolution of those gradients imposes the use of extremely small cells in the respective regions. Taking turbulence into account intensifies the variation in scales even more. Furthermore, hypersonic flows have been shown to be extremely grid sensitive. For the simulation of three-dimensional configurations of engineering applications, this results in a huge amount of cells and prohibitive computational time. Therefore, modern adaptive techniques can provide a gain with respect to computational costs and accuracy, allowing the generation of locally highly resolved flow regions where they are needed and retaining an otherwise smooth distribution. An h-adaptive technique based on wavelets is employed for the solution of hypersonic flows. The compressible Reynolds averaged Navier-Stokes equations are solved using a differential Reynolds s...
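
    The h-adaptation criterion sketched below is generic rather than the authors' multiscale RANS solver: decompose the field and flag locations whose detail (wavelet) coefficients exceed a threshold, marking them for refinement.

```python
import numpy as np
import pywt

# 1D flow-like field with a sharp "shock": smooth wave plus a steep jump.
x = np.linspace(0, 1, 512)
u = np.tanh((x - 0.6) * 200) + 0.3 * np.sin(2 * np.pi * x)

# Multiscale wavelet decomposition; large detail coefficients flag regions
# that are under-resolved and should receive finer cells.
coeffs = pywt.wavedec(u, "db4", level=4)
threshold = 1e-3 * max(np.abs(c).max() for c in coeffs[1:])
for lvl, detail in enumerate(coeffs[1:], start=1):
    flagged = np.abs(detail) > threshold
    print(f"level {lvl}: refine {flagged.sum()} of {detail.size} positions")
```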

  12. The radiation techniques of tomotherapy & intensity-modulated radiation therapy applied to lung cancer

    OpenAIRE

    Zhu, Zhengfei; Fu, Xiaolong

    2015-01-01

    Radiotherapy (RT) plays an important role in the management of lung cancer. Development of radiation techniques is a possible way to improve the effect of RT by reducing toxicities through better sparing the surrounding normal tissues. This article will review the application of two forms of intensity-modulated radiation therapy (IMRT), fixed-field IMRT and helical tomotherapy (HT) in lung cancer, including dosimetric and clinical studies. The advantages and potential disadvantages of these t...

  13. Improving throughput and user experience for information intensive websites by applying HTTP compression technique.

    Science.gov (United States)

    Malla, Ratnakar

    2008-11-06

    HTTP compression is a technique specified as part of the W3C HTTP 1.0 standard. It allows HTTP servers to take advantage of GZIP compression technology that is built into the latest browsers. A brief survey of medical informatics websites shows that compression is not enabled. With compression enabled, downloaded file sizes are reduced by more than 50% and typical transaction time is also reduced from 20 to 8 minutes, thus providing a better user experience.
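
    The mechanism is easy to demonstrate offline: text-heavy payloads compress dramatically, and on the wire the server only has to set one response header. A minimal sketch (the payload is invented; actual ratios vary with content):

```python
import gzip
import json

# A text-heavy payload similar to what an information-intensive page returns.
payload = json.dumps(
    [{"record": i, "title": "sample record " * 5} for i in range(500)]
).encode()

compressed = gzip.compress(payload)
print(f"original: {len(payload)} bytes, gzipped: {len(compressed)} bytes "
      f"({100 * (1 - len(compressed) / len(payload)):.0f}% smaller)")

# On the wire the server advertises this with the response header
#   Content-Encoding: gzip
# and should only do so when the client sent: Accept-Encoding: gzip
```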

  14. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  15. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  16. Applying the sterile insect technique to the control of insect pests

    International Nuclear Information System (INIS)

    The sterile insect technique involves the mass-rearing of insects, which are sterilized by gamma rays from a 60Co source before being released in a controlled fashion into nature. Matings between the released sterile insects and native insects produce no progeny, and so if enough of these matings occur the pest population can be controlled or even eradicated. A modification of the technique, especially suitable for the suppression of moths and butterflies, is called the F1, or inherited sterility, method. In this, lower radiation doses are used such that the released males are only partially sterile (30-60%) and the females are fully sterile. When released males mate with native females some progeny are produced, but they are completely sterile. Thus, full expression of the sterility is delayed by one generation. This article describes the use of the sterile insect technique in controlling the screwworm fly, the tsetse fly, the medfly, the pink bollworm and the melon fly, and of the F1 sterility method in the eradication of local gypsy moth infestations. 18 refs, 5 figs, 1 tab

  17. Quantification of material slippage in the iliotibial tract when applying the partial plastination clamping technique.

    Science.gov (United States)

    Sichting, Freddy; Steinke, Hanno; Wagner, Martin F-X; Fritsch, Sebastian; Hädrich, Carsten; Hammer, Niels

    2015-09-01

    The objective of this study was to evaluate the potential of the partial plastination technique in minimizing material slippage and to discuss the effects on the tensile properties of thin dense connective tissue. The ends of twelve iliotibial tract samples were primed with polyurethane resin and covered by plastic plates to provide sufficient grip between the clamps. The central part of the samples remained in an anatomically unfixed condition. Strain data of twelve partially plastinated samples and ten samples in a completely anatomically unfixed state were obtained using uniaxial crosshead displacement and an optical image tracking technique. Testing of agreement between the strain data revealed ongoing but markedly reduced material slippage in partially plastinated samples compared to the unfixed samples. The mean measurement error introduced by material slippage was up to 18.0% in partially plastinated samples. These findings might complement existing data on measurement errors during material testing and highlight the importance of individual quantitative evaluation of errors that come along with self-made clamping techniques. PMID:26005842

  18. IPR techniques applied to a multimedia environment in the HYPERMEDIA project

    Science.gov (United States)

    Munoz, Alberto; Ribagorda, Arturo; Sierra, Jose M.

    1999-04-01

    Watermarking techniques have proved to be a good method of protecting intellectual property rights in digital formats. However, the ease of processing information on digital platforms also offers many chances to eliminate marks embedded in the data, owing to the wide variety of techniques for modifying information in digital formats. This paper analyzes a selection of the most interesting methods for image watermarking in order to test their qualities. The comparison of these watermarking techniques has shown new interesting lines of work. Some changes and extensions to these methods are proposed to increase their robustness against some usual attacks and specific watermark attacks. This work was carried out in order to provide the HYPERMEDIA project with an efficient tool for protecting IPR. The objective of this project is to establish an experimental stage on the handling and delivery of continuous multimedia material (audiovisuals) in a multimedia service environment, allowing the user to navigate the hyperspace through databases which belong to actors of the service chain, while protecting the IPR of authors or owners.
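
    As a baseline for the robustness comparisons discussed, the sketch below embeds a watermark in pixel least-significant bits and shows how trivially re-quantisation destroys it, which is exactly why more robust (e.g. transform-domain) schemes are studied. Illustrative only, not one of the paper's evaluated methods:

```python
import numpy as np

def embed_lsb(image, bits):
    """Embed a bit sequence into the least significant bits of the first
    len(bits) pixels (fragile and illustrative; robust schemes differ)."""
    out = image.copy().ravel()
    out[:len(bits)] = (out[:len(bits)] & 0xFE) | bits
    return out.reshape(image.shape)

def extract_lsb(image, n_bits):
    return image.ravel()[:n_bits] & 1

rng = np.random.default_rng(7)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
mark = rng.integers(0, 2, size=256, dtype=np.uint8)

stego = embed_lsb(cover, mark)
print("recovered intact:", bool(np.array_equal(extract_lsb(stego, 256), mark)))
print("destroyed by re-quantisation:",
      not np.array_equal(extract_lsb((stego // 2) * 2, 256), mark))
```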

  19. Applying Data-mining techniques to study drought periods in Spain

    Science.gov (United States)

    Belda, F.; Penades, M. C.

    2010-09-01

    Data-mining is a technique that can be used to interact with large databases and to help discover relations between parameters by extracting information from massive and multiple data archives. Drought affects many economic and social sectors, from agriculture to transportation, as well as urban water deficits and the development of modern industries. Given these impacts and the geographical and temporal distribution of drought, it is difficult to find a single definition of drought. A better understanding of climatic indices is necessary to reduce the impacts of drought and to facilitate quick decisions regarding this problem. The main objective is to analyze drought periods from 1950 to 2009 in Spain. We use several kinds of information, in different formats, from different sources and transmission modes. We use satellite-based vegetation indices and dryness indices for several temporal periods. We use daily and monthly precipitation and temperature data and soil moisture data from a numerical weather model. We mainly calculate the Standardized Precipitation Index (SPI), which has been used widely in the literature. We use OLAP-Mining techniques to discover association rules between remote-sensing, numerical weather model and climatic indices. Time-series data-mining techniques organize data as a sequence of events, with each event having a time of recurrence, to cluster the data into groups of records or clusters with similar characteristics. A prior climatological classification is necessary if we want to study drought periods over all of Spain.
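
    The SPI computation itself is compact: fit a gamma distribution to accumulated precipitation and map the fitted CDF through the standard normal quantile function. A simplified sketch (synthetic data; the usual zero-precipitation correction is omitted):

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index: fit a gamma distribution to the
    accumulated precipitation series and map its CDF to a standard normal.
    Simplified: assumes strictly positive totals (no zero-rain handling)."""
    shape, loc, scale = stats.gamma.fit(precip, floc=0)
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(5)
monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=720)  # synthetic 60-y record
index = spi(monthly_precip)
print("droughts (SPI <= -1.5):", int(np.sum(index <= -1.5)),
      "of", index.size, "months")
```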

  20. An efficient permeability scaling-up technique applied to the discretized flow equations

    Energy Technology Data Exchange (ETDEWEB)

    Urgelli, D.; Ding, Yu [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    Grid-block permeability scaling-up for numerical reservoir simulations has been discussed for a long time in the literature. It is now recognized that a full permeability tensor is needed to get an accurate reservoir description at large scale. However, two major difficulties are encountered: (1) grid-block permeability cannot be properly defined because it depends on boundary conditions; (2) discretization of flow equations with a full permeability tensor is not straightforward and little work has been done on this subject. In this paper, we propose a new method, which allows us to get around both difficulties. As the two major problems are closely related, a global approach will preserve the accuracy. So, in the proposed method, the permeability up-scaling technique is integrated in the discretized numerical scheme for flow simulation. The permeability is scaled-up via the transmissibility term, in accordance with the fluid flow calculation in the numerical scheme. A finite-volume scheme is particularly studied, and the transmissibility scaling-up technique for this scheme is presented. Some numerical examples are tested for flow simulation. This new method is compared with some published numerical schemes for full permeability tensor discretization where the full permeability tensor is scaled-up through various techniques. Comparing the results with fine grid simulations shows that the new method is more accurate and more efficient.
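
    The flavour of transmissibility-based upscaling can be shown in one dimension, where the harmonic average is the exact single-phase answer for flow across layers in series. The paper's contribution is the full-tensor generalization inside the finite-volume scheme, which this sketch does not reproduce:

```python
import numpy as np

def harmonic_keff(k_fine, dx):
    """Exact 1D upscaling for single-phase flow along a series of fine cells."""
    return len(k_fine) * dx / np.sum(dx / k_fine)

k_fine = np.array([500.0, 50.0, 5.0, 200.0])   # mD, layered heterogeneity
k_eff = harmonic_keff(k_fine, dx=10.0)
T = k_eff * 100.0 / (10.0 * len(k_fine))       # transmissibility T = k_eff*A/L
print(f"k_eff = {k_eff:.1f} mD, T = {T:.2f} mD.m")   # low-perm layer dominates
```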

  1. A neuro-evolutive technique applied for predicting the liquid crystalline property of some organic compounds

    Science.gov (United States)

    Drăgoi, Elena-Niculina; Curteanu, Silvia; Lisa, Cătălin

    2012-10-01

    A simple self-adaptive version of the differential evolution algorithm was applied for simultaneous architectural and parametric optimization of feed-forward neural networks, used to classify the liquid crystalline property of a series of organic compounds. The developed optimization methodology was called self-adaptive differential evolution neural network (SADE-NN) and has the following characteristics: the base vector used is chosen as the best individual in the current population, two differential terms participate in the mutation process, the crossover type is binomial, a simple self-adaptive mechanism is employed to determine the near-optimal control parameters of the algorithm, and the integration of the neural network into the differential evolution algorithm is performed using a direct encoding scheme. It was found that a network with one hidden layer is able to make accurate predictions, indicating that the proposed methodology is efficient and, owing to its flexibility, can be applied to a large range of problems.
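
    A minimal loop matching the recipe described (best base vector, two difference terms, binomial crossover) is sketched below on a toy objective; the jDE-style resampling of F and CR is an assumption standing in for the paper's "simple self-adaptive mechanism", and the neural-network encoding is not reproduced.

```python
import numpy as np

def sade_best2bin(obj, bounds, pop_size=30, gens=200, seed=0):
    """DE/best/2/bin with self-adaptive F and CR (jDE-style, assumed)."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, dim))
    F = np.full(pop_size, 0.5)
    CR = np.full(pop_size, 0.9)
    fit = np.array([obj(p) for p in pop])
    for _ in range(gens):
        best = pop[fit.argmin()]
        for i in range(pop_size):
            if rng.random() < 0.1: F[i] = rng.uniform(0.1, 0.9)   # self-adapt F
            if rng.random() < 0.1: CR[i] = rng.random()           # self-adapt CR
            r = rng.choice(pop_size, 4, replace=False)
            mutant = best + F[i] * (pop[r[0]] - pop[r[1]] + pop[r[2]] - pop[r[3]])
            cross = rng.random(dim) < CR[i]
            cross[rng.integers(dim)] = True                       # binomial crossover
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            f = obj(trial)
            if f <= fit[i]:
                pop[i], fit[i] = trial, f
    return pop[fit.argmin()], fit.min()

x, f = sade_best2bin(lambda v: np.sum(v ** 2), bounds=[(-5, 5)] * 4)
print("minimum found:", x.round(3), f)
```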

  2. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Full Text Available Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication based curriculum contexts, they can be used as analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  3. A Reinforcement Plate for Partially Thinned Pressure Vessel Designed to Measure the Thickness of Vessel Wall Applying Ultrasonic Technique

    International Nuclear Information System (INIS)

    It is very hard to preserve the wall thickness of a vessel over time because of erosion and corrosion. Therefore, the wall thicknesses of heaters in power plants are periodically measured using ultrasonic testing. If the integrity of the wall thickness is judged to be insufficient, a reinforcement plate is welded onto the thinned area of the vessel. The overlay weld of the reinforcement plate on the thinned vessel is normally a fillet weld. As shown by the references, a reinforcement plate of adequate thickness does its job very well until the vessel wall is perforated due to thinning. However, once the vessel wall is perforated, the integrity of the shell cannot be ensured, because the weldment is then directly subjected to the shell-side pressure. Therefore, the thickness of the thinned area under the reinforcement plate must be measured continuously, both to preserve integrity and to plan the fabrication of a replacement vessel; yet the conventional ultrasonic thickness measurement technique cannot be applied after the reinforcement plate is welded onto the shell. In this paper a new reinforcement plate, which makes it possible to measure the wall thickness under the reinforcement plate using the ultrasonic technique, is introduced, together with a method to evaluate the structural integrity of a fillet weldment for a reinforcement plate welded onto a pressure vessel.
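
    The underlying measurement is plain pulse-echo ultrasonics: the wall thickness is half the round-trip time of flight multiplied by the material's sound velocity. A minimal sketch with an assumed carbon-steel velocity:

```python
def pulse_echo_thickness(time_of_flight_us, velocity_m_per_s=5900.0):
    """Pulse-echo wall-thickness estimate: the ultrasonic pulse crosses the
    wall twice, so d = v * t / 2. 5900 m/s is a typical longitudinal-wave
    velocity for carbon steel (an assumption; calibrate per material)."""
    return velocity_m_per_s * time_of_flight_us * 1e-6 / 2.0

print(f"{pulse_echo_thickness(4.1) * 1000:.2f} mm")   # ~12 mm wall
```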

  4. Assessment of ground-based monitoring techniques applied to landslide investigations

    Science.gov (United States)

    Uhlemann, S.; Smith, A.; Chambers, J.; Dixon, N.; Dijkstra, T.; Haslam, E.; Meldrum, P.; Merritt, A.; Gunn, D.; Mackay, J.

    2016-01-01

    A landslide complex in the Whitby Mudstone Formation at Hollin Hill, North Yorkshire, UK is periodically re-activated in response to rainfall-induced pore-water pressure fluctuations. This paper compares long-term measurements (i.e., 2009-2014) obtained from a combination of monitoring techniques that have been employed together for the first time on an active landslide. The results highlight the relative performance of the different techniques, and can provide guidance for researchers and practitioners for selecting and installing appropriate monitoring techniques to assess unstable slopes. Particular attention is given to the spatial and temporal resolutions offered by the different approaches that include: Real Time Kinematic-GPS (RTK-GPS) monitoring of a ground surface marker array, conventional inclinometers, Shape Acceleration Arrays (SAA), tilt meters, active waveguides with Acoustic Emission (AE) monitoring, and piezometers. High spatial resolution information has allowed locating areas of stability and instability across a large slope. This has enabled identification of areas where further monitoring efforts should be focused. High temporal resolution information allowed the capture of 'S'-shaped slope displacement-time behaviour (i.e. phases of slope acceleration, deceleration and stability) in response to elevations in pore-water pressures. This study shows that a well-balanced suite of monitoring techniques that provides high temporal and spatial resolutions on both measurement and slope scale is necessary to fully understand failure and movement mechanisms of slopes. In the case of the Hollin Hill landslide it enabled detailed interpretation of the geomorphological processes governing landslide activity. It highlights the benefit of regularly surveying a network of GPS markers to determine areas for installation of movement monitoring techniques that offer higher resolution both temporally and spatially. The small sensitivity of tilt meter measurements …

  5. Psychoanalytic technique and 'analysis terminable and interminable'.

    Science.gov (United States)

    Sandler, J

    1988-01-01

    Some of the implications for psychoanalytic technique of the papers given at the plenary sessions of the Montreal Congress are considered. Emphasis is placed on the role of affects in development and in current psychic functioning. Motivation for unconscious wishes arises from many sources, and affects should not only be thought of as drive derivatives. There is a substantial gap between the (largely) implicit clinico-technical theories in the analytic work presented, which do in fact show great sensitivity to the patients' affects, and the formal 'official' general psychoanalytic theory used. This discrepancy in our theories should be faced. Freud's tripartite structural theory of the mind (the 'second topography') seems now to have limitations for clinical purposes. PMID:3063676

  6. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    Science.gov (United States)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI from launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established and more speculative techniques; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  7. Nuclear fuel lattice performance analysis by data mining techniques

    International Nuclear Information System (INIS)

    Highlights: • This paper shows a data mining application to analyse nuclear fuel lattice designs. • Data mining methods were used to predict whether fuel lattices could operate adequately in the BWR reactor core. • Data mining methods learned from fuel lattice datasets simulated with SIMULATE-3. • Results show high recognition percentages of adequate or inadequate fuel lattice performance. - Abstract: In this paper a data mining analysis of BWR nuclear fuel lattice performance is shown. In a typical three-dimensional simulation of reactor operation, the simulator gives the core performance for a fuel lattice configuration, measured by thermal limits, shutdown margin and produced energy. Based on these results we can determine the number of fulfilled parameters of a fuel lattice configuration. It is interesting to establish a relationship between the fuel lattice properties and the number of fulfilled core parameters in steady-state reactor operation, and data mining techniques were used for this purpose. Results indicate that these techniques are able to predict with sufficient accuracy (greater than 75%) whether a given fuel lattice configuration will have either a "good" or a "bad" performance according to the reactor core simulation. In this way, they could be coupled with an optimization process to discard fuel lattice configurations with poor performance, thereby accelerating the optimization. Data mining techniques also apply filter methods to discard those variables with lower influence on the number of fulfilled core parameters. From this, it was also possible to identify a set of variables to be used in new optimization codes with objective functions different from those normally used.
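
    The classification task described here can be illustrated with a generic data mining pipeline. The sketch below trains a random forest on made-up lattice descriptors with a placeholder labelling rule; the feature names, data, and threshold are hypothetical and only stand in for the SIMULATE-3 derived dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Hypothetical lattice descriptors standing in for the SIMULATE-3 dataset:
# average enrichment, gadolinia pin count, local power peaking factor.
n = 500
X = np.column_stack([
    rng.uniform(2.0, 5.0, n),    # average U-235 enrichment [%]
    rng.integers(0, 15, n),      # number of gadolinia-bearing pins
    rng.uniform(1.0, 1.6, n),    # local power peaking factor
])
# Placeholder rule for "good" (1) vs "bad" (0) core performance labels.
y = ((X[:, 2] < 1.4) & (X[:, 0] > 2.5)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("mean CV accuracy: %.2f" % scores.mean())  # the paper reports > 75%
```

    Feature importances from such a model play the same role as the filter methods mentioned above: variables with little influence on the fulfilled-parameter count can be dropped before optimization.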

  8. Enhanced nonlinear iterative techniques applied to a non-equilibrium plasma flow

    Energy Technology Data Exchange (ETDEWEB)

    Knoll, D.A.; McHugh, P.R. [Idaho National Engineering Lab., Idaho Falls, ID (United States)

    1996-12-31

    We study the application of enhanced nonlinear iterative methods to the steady-state solution of a system of two-dimensional convection-diffusion-reaction partial differential equations that describe the partially-ionized plasma flow in the boundary layer of a tokamak fusion reactor. This system of equations is characterized by multiple time and spatial scales, and contains highly anisotropic transport coefficients due to a strong imposed magnetic field. We use Newton's method to linearize the nonlinear system of equations resulting from an implicit, finite volume discretization of the governing partial differential equations, on a staggered Cartesian mesh. The resulting linear systems are neither symmetric nor positive definite, and are poorly conditioned. Preconditioned Krylov iterative techniques are employed to solve these linear systems. We investigate both a modified and a matrix-free Newton-Krylov implementation, with the goal of reducing the CPU cost associated with the numerical formation of the Jacobian. A combination of a damped iteration, one-way multigrid and a pseudo-transient continuation technique is used to enhance global nonlinear convergence and CPU efficiency. GMRES is employed as the Krylov method with Incomplete Lower-Upper (ILU) factorization preconditioning. The goal is to construct a combination of nonlinear and linear iterative techniques for this complex physical problem that optimizes trade-offs between robustness, CPU time, memory requirements, and code complexity. It is shown that a one-way multigrid implementation provides significant CPU savings for fine grid calculations. Performance comparisons of the modified Newton-Krylov and matrix-free Newton-Krylov algorithms will be presented.
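
    A matrix-free Newton-Krylov iteration of the kind discussed above can be sketched with SciPy's `newton_krylov`, which applies GMRES to finite-difference approximations of Jacobian-vector products. The 1-D convection-diffusion-reaction problem below is a toy surrogate, not the tokamak edge-plasma system.

```python
import numpy as np
from scipy.optimize import newton_krylov

# Toy surrogate for the edge-plasma equations: steady 1-D
# convection-diffusion-reaction, -u'' + c*u' + u**2 - s = 0,
# with homogeneous Dirichlet boundaries (coefficients are placeholders).
N, c = 100, 5.0
h = 1.0 / (N + 1)
s = np.ones(N)

def residual(u):
    up = np.zeros(N + 2)                 # pad with boundary values u = 0
    up[1:-1] = u
    diff = (up[:-2] - 2.0 * up[1:-1] + up[2:]) / h**2
    conv = c * (up[2:] - up[:-2]) / (2.0 * h)
    return -diff + conv + u**2 - s

# Matrix-free Newton-Krylov: the Jacobian is never formed explicitly;
# GMRES works from finite-difference Jacobian-vector products.
sol = newton_krylov(residual, np.zeros(N), method='gmres', f_tol=1e-8)
print("max |residual|: %.2e" % np.abs(residual(sol)).max())
```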

  9. Evaluation of Bending Strength in Friction Welded Alumina/mild Steel Joints by Applying Factorial Technique

    Science.gov (United States)

    Jesudoss Hynes, N. Rajesh; Nagaraj, P.; Vivek Prabhu, M.

    Joining of metals with ceramics has become significant in many applications, because it combines properties like ductility with high hardness and wear resistance. By the friction welding technique, alumina can be joined to mild steel with a 1 mm thick AA1100 sheet as an interlayer. In the present work, the effect of friction time on interlayer thickness reduction and bending strength is investigated by factorial design. Regression modelling is carried out using ANOVA, a statistical tool. The regression model predicts the bending strength of the welded ceramic/metal joints accurately, with ±2% deviation from the experimental values.
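
    A minimal sketch of regression modelling over a factorial design, assuming a 2^2 design with coded factor levels and illustrative strength values (not the paper's data):

```python
import numpy as np

# Hypothetical 2^2 factorial design with coded levels (-1, +1) for friction
# time (A) and friction pressure (B); the responses are illustrative bending
# strengths in MPa, not the paper's data.
A = np.array([-1.0, +1.0, -1.0, +1.0])
B = np.array([-1.0, -1.0, +1.0, +1.0])
y = np.array([118.0, 131.0, 122.0, 140.0])

# Regression model y = b0 + b1*A + b2*B + b12*A*B fitted by least squares.
Xd = np.column_stack([np.ones(4), A, B, A * B])
coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
print("intercept, A, B, AB effects:", np.round(coef, 2))
print("model reproduces design points:", np.round(Xd @ coef, 1))
```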

  10. Photon Counting Optical Time Domain Reflectometry Applying a Single Photon Modulation Technique

    Institute of Scientific and Technical Information of China (English)

    WANG Xiao-Bo; WANG Jing-Jing; HE Bo; XIAO Lian-Tuan; JIA Suo-Tang

    2011-01-01

    Photon-counting optical time domain reflectometry (ν-OTDR) is typically used in a mode with spatial resolution in the centimeter range. Here we demonstrate a 1550 nm ν-OTDR system that optimizes the discriminator voltage of a single-photon avalanche detector using a single-photon modulation and demodulation technique, which shows an obvious improvement in signal intensity. The signal intensity is doubled when the discriminator voltage is optimized from 184 mV to 162 mV.

  11. Optimization techniques applied to passive measures for in-orbit spacecraft survivability

    Science.gov (United States)

    Mog, Robert A.; Helba, Michael J.; Hill, Janeil B.

    1992-01-01

    The purpose of this research is to provide Space Station Freedom protective structures design insight through the coupling of design/material requirements, hypervelocity impact phenomenology, meteoroid and space debris environment sensitivities, optimization techniques and operations research strategies, and mission scenarios. The goals of the research are: (1) to develop a Monte Carlo simulation tool which will provide top level insight for Space Station protective structures designers; (2) to develop advanced shielding concepts relevant to Space Station Freedom using unique multiple bumper approaches; and (3) to investigate projectile shape effects on protective structures design.

  12. Nuclear analytical techniques applied to characterization of atmospheric aerosols in Amazon Region

    International Nuclear Information System (INIS)

    This work presents the characterization of atmospheric aerosols in different regions of the Amazon basin. The biogenic aerosol emission by the forest, as well as the atmospheric emission of particulate matter due to biomass burning, were analyzed. Samples of aerosol particles were collected over three years at two different locations in the Amazon region using stacked filter units. Several nuclear analytical techniques were used to study these samples. High aerosol concentrations resulting from the biomass burning process were observed in the June-September period.

  13. Applying Intelligent Computing Techniques to Modeling Biological Networks from Expression Data

    Institute of Scientific and Technical Information of China (English)

    Wei-Po Lee; Kung-Cheng Yang

    2008-01-01

    Constructing biological networks is one of the most important issues in systems biology. However, constructing a network from data manually takes a considerably large amount of time; therefore an automated procedure is advocated. To automate the procedure of network construction, in this work we use two intelligent computing techniques, genetic programming and neural computation, to infer two kinds of network models that use continuous variables. To verify the presented approaches, experiments have been conducted, and the preliminary results show that both approaches can be used to infer networks successfully.

  14. Accelerator based techniques for aerosol analysis

    International Nuclear Information System (INIS)

    At the 3 MV Tandetron accelerator of the LABEC laboratory of INFN (Florence, Italy) an external beam facility is fully dedicated to PIXE-PIGE measurements of the elemental composition of atmospheric aerosols. Examples regarding recent monitoring campaigns, performed in urban and remote areas, both on a daily basis and with high time resolution, as well as with size selection, will be presented. It will be shown how PIXE can provide unique information in aerosol studies or can play a complementary role to traditional chemical analysis. Finally, a short presentation of 14C analysis of the atmospheric aerosol by Accelerator Mass Spectrometry (AMS) for the evaluation of the contributions from either fossil fuel combustion or modern sources (wood burning, biogenic activity) will be given. (author)

  15. Time-of-arrival analysis applied to ELF/VLF wave generation experiments at HAARP

    Science.gov (United States)

    Moore, R. C.; Fujimaru, S.

    2012-12-01

    Time-of-arrival (TOA) analysis is applied to observations performed during ELF/VLF wave generation experiments at the High-frequency Active Auroral Research Program (HAARP) HF transmitter in Gakona, Alaska. In 2012, a variety of ELF/VLF wave generation techniques were employed to identify the dominant source altitude for each case. Observations were performed for beat-wave modulation, AM modulation, STF modulation, ICD modulation, and cubic frequency modulation, among others. For each of these cases, we identify the dominant ELF/VLF source altitude and compare the experimental results with theoretical HF heating predictions.
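
    The core of a TOA estimate is the lag of the cross-correlation peak between the transmitted modulation waveform and the received ELF/VLF signal. The sketch below uses a synthetic waveform; the sampling rate, tone frequency, delay, and noise level are all assumptions for illustration.

```python
import numpy as np

fs = 100_000.0                      # sample rate [Hz], an assumption
t = np.arange(0.0, 0.02, 1.0 / fs)  # 20 ms record
true_delay = 1.37e-3                # propagation delay to recover [s]

rng = np.random.default_rng(2)
# Synthetic ELF tone burst standing in for the modulation waveform.
tx = np.sin(2 * np.pi * 2075.0 * t) * np.exp(-((t - 5e-3) / 2e-3) ** 2)
# "Received" signal: delayed copy plus noise.
rx = np.interp(t - true_delay, t, tx, left=0.0)
rx += 0.05 * rng.normal(size=t.size)

# TOA estimate: lag of the cross-correlation peak.
xc = np.correlate(rx, tx, mode='full')
lag = np.argmax(xc) - (t.size - 1)
print("estimated delay: %.2f ms (true: %.2f ms)"
      % (1e3 * lag / fs, 1e3 * true_delay))
```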

  16. Advantages and Drawbacks of Applying Periodic Time-Variant Modal Analysis to Spur Gear Dynamics

    DEFF Research Database (Denmark)

    Pedersen, Rune; Santos, Ilmar; Hede, Ivan Arthur

    2010-01-01

    A simplified torsional model with a reduced number of degrees-of-freedom is used in order to investigate the potential of the technique. A time-dependent gear mesh stiffness function is introduced and expanded in a Fourier series. The necessary number of Fourier terms is determined in order to ensure sufficient accuracy of the results. The method of time-variant modal analysis is applied, and the changes in the fundamental and the parametric resonance frequencies as a function of the rotational speed of the gears are found. By obtaining the stationary and parametric parts of the time…

  17. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  18. DNA ANALYSIS OF RICIN USING RAPD TECHNIQUE

    OpenAIRE

    Martin Vivodík; Želmíra Balážová; Zdenka Gálová

    2014-01-01

    Castor (Ricinus communis L.) is an important plant for the production of industrial oil. The systematic evaluation of the molecular diversity encompassed in castor inbreds or parental lines offers an efficient means of exploiting heterosis in castor as well as of managing biodiversity. The aim of this work was to detect genetic variability among a set of 30 castor genotypes using 5 RAPD markers. Amplification of the genomic DNA of the 30 genotypes using RAPD analysis yielded 35 fragments, w...

  19. ANALYSIS AND COMPARATIVE STUDY OF SEARCHING TECHNIQUES

    OpenAIRE

    Yuvraj Singh Chandrawat*

    2015-01-01

    We live in the age of technology and it is quite obvious that it is advancing endlessly, day by day. In this technical era researchers are focusing on the development of existing technologies. Software engineering is the dominant branch of Computer Science that deals with the development and analysis of software. The objective of this study is to analyze and compare the existing searching algorithms (linear search and binary search). In this paper, we will discuss both thes...
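
    For reference, a minimal side-by-side of the two algorithms being compared; binary search presumes sorted input, which is the source of its O(log n) advantage.

```python
from bisect import bisect_left

def linear_search(a, x):
    """O(n): scan the list until the key is found."""
    for i, v in enumerate(a):
        if v == x:
            return i
    return -1

def binary_search(a, x):
    """O(log n): requires sorted input; halves the interval each step."""
    i = bisect_left(a, x)
    return i if i < len(a) and a[i] == x else -1

data = sorted([7, 3, 42, 19, 88, 5])
assert linear_search(data, 42) == binary_search(data, 42)
print("index of 42:", binary_search(data, 42))
```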

  20. Applying stakeholder Delphi techniques for planning sustainable use of aquatic resources

    DEFF Research Database (Denmark)

    Lund, Søren; Banta, Gary Thomas; Bunting, Stuart W

    2015-01-01

    The HighARCS (Highland Aquatic Resources Conservation and Sustainable Development) project was a participatory research effort to map and better understand the patterns of resource use and livelihoods of communities who utilize highland aquatic resources in five sites across China, India and Vietnam. The purpose of this paper is to give an account of how the stakeholder Delphi method was adapted and applied to support the participatory integrated action planning for sustainable use of aquatic resources facilitated within the HighARCS project. An account of the steps taken and results recorded…

  1. Fragrance composition of Dendrophylax lindenii (Orchidaceae using a novel technique applied in situ

    Directory of Open Access Journals (Sweden)

    James J. Sadler

    2012-02-01

    Full Text Available The ghost orchid, Dendrophylax lindenii (Lindley) Bentham ex Rolfe (Orchidaceae), is one of North America's rarest and best-known orchids. Native to Cuba and SW Florida, where it frequents shaded swamps as an epiphyte, the species has experienced steady decline. Little information exists on D. lindenii's biology in situ, raising conservation concerns. During the summer of 2009, at an undisclosed population in Collier County, FL, a substantial number (ca. 13) of plants initiated anthesis, offering a unique opportunity to study this species in situ. We report a new technique aimed at capturing the floral headspace of D. lindenii in situ, and identified volatile compounds using gas chromatography-mass spectrometry (GC/MS). All components of the floral scent were identified as terpenoids with the exception of methyl salicylate. The most abundant compound was the sesquiterpene (E,E)-α-farnesene (71%), followed by (E)-β-ocimene (9%) and methyl salicylate (8%). Other compounds were: linalool (5%), sabinene (4%), (E)-α-bergamotene (2%), α-pinene (1%), and 3-carene (1%). Interestingly, (E,E)-α-farnesene has previously been associated with pestiferous insects (e.g., Hemiptera). The other compounds are common floral scent constituents in other angiosperms, suggesting that our in situ technique was effective. Volatile capture was, therefore, possible without imposing physical harm (e.g., inflorescence detachment) to this rare orchid.

  2. A New Astrometric Technique Applied to the Likely Tidal Disruption Event, Swift J1644+57

    Science.gov (United States)

    Alianora Hounsell, Rebekah; Fruchter, Andrew S.; Levan, Andrew J.

    2015-01-01

    We have developed a new technique to align Hubble Space Telescope (HST) data using background galaxies as astrometric markers. This technique involves the cross-correlation of cutouts of regions about individual galaxies from different epochs, enabling the determination of an astrometric solution. The method avoids errors introduced by proper motion when the locations of stars are used to transform the images. We have used this approach to investigate the nature of the unusual gamma-ray source Sw J1644+57, which was initially classified as a long gamma-ray burst (LGRB). However, due to the object's atypical behavior in the X-ray and optical, along with its location within the host (150 ± 150 pc, see Levan et al. 2011), it has been suggested that the transient may be caused by a tidal disruption event (TDE). Additional theories for its origin, still based on the collapsar model for a long burst, have also been suggested, such as the collapse of a red giant rather than a stripped star as is typical in LGRBs, or the creation of a magnetar. Precise astrometry of the transient with respect to the galaxy can potentially distinguish between these scenarios. Here we show that our method of alignment dramatically reduces the astrometric error of the position of the transient with respect to the nucleus of the host. We therefore discuss the implication of our result for the astrophysical nature of the object.
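
    The alignment idea, reduced to its essence, is a cross-correlation of per-galaxy cutouts between epochs. The sketch below recovers an integer-pixel shift from synthetic Gaussian "galaxies"; a real pipeline would fit the correlation peak to sub-pixel precision and combine many galaxies into a full astrometric solution.

```python
import numpy as np

def cutout_shift(img_a, img_b):
    """Integer-pixel offset of img_a relative to img_b from the peak of
    their FFT-based cross-correlation."""
    cc = np.fft.ifft2(np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))).real
    peak = np.array(np.unravel_index(np.argmax(cc), cc.shape), dtype=float)
    shape = np.array(cc.shape)
    peak[peak > shape / 2] -= shape[peak > shape / 2]  # wrap to signed shifts
    return peak

# Synthetic 64x64 "galaxy" cutouts: epoch 2 is epoch 1 shifted by (+3, -2).
y, x = np.mgrid[:64, :64]
epoch1 = np.exp(-((x - 30) ** 2 + (y - 32) ** 2) / 20.0)
epoch2 = np.exp(-((x - 28) ** 2 + (y - 35) ** 2) / 20.0)
print("recovered (dy, dx):", cutout_shift(epoch2, epoch1))  # ~ [3, -2]
```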

  3. Therapeutic techniques applied in the heavy-ion therapy at IMP

    Science.gov (United States)

    Li, Qiang; Sihver, Lembit

    2011-04-01

    Superficially-placed tumors have been treated with carbon ions at the Institute of Modern Physics (IMP), Chinese Academy of Sciences (CAS), since November 2006. Up to now, 103 patients have been irradiated in the therapy terminal of the heavy ion research facility in Lanzhou (HIRFL) at IMP, where carbon-ion beams with energies up to 100 MeV/u can be supplied and a passive beam delivery system has been developed and commissioned. A number of therapeutic and clinical experiences concerning heavy-ion therapy have been acquired at IMP. To extend the heavy-ion therapy project to deep-seated tumor treatment, a horizontal beam line dedicated to this has been constructed in the cooling storage ring (CSR), which is a synchrotron connected to the HIRFL as an injector, and is now in operation. Therapeutic high-energy carbon-ion beams, extracted from the HIRFL-CSR through slow extraction techniques, have been supplied in the deep-seated tumor therapy terminal. After the beam delivery, shaping and monitoring devices installed in the therapy terminal at HIRFL-CSR were validated through therapeutic beam tests, deep-seated tumor treatment with high-energy carbon ions started in March 2009. The therapeutic techniques in terms of beam delivery system, conformal irradiation method and treatment planning used at IMP are introduced in this paper.

  4. Applying Reflective Middleware Techniques to Optimize a QoS-enabled CORBA Component Model Implementation

    Science.gov (United States)

    Wang, Nanbor; Parameswaran, Kirthika; Kircher, Michael; Schmidt, Douglas

    2003-01-01

    Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.

  5. High-level power analysis and optimization techniques

    Science.gov (United States)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  6. Applying Tiab’s direct synthesis technique to dilatant non-Newtonian/Newtonian fluids

    Directory of Open Access Journals (Sweden)

    Javier Andrés Martínez

    2011-08-01

    Full Text Available Non-Newtonian fluids, such as polymer solutions, have been used by the oil industry for many years as fracturing agents and drilling mud. These solutions, which normally include thickened water and jelled fluids, are injected into the formation to enhance oil recovery by improving sweep efficiency. It is worth noting that some heavy oils behave in a non-Newtonian manner. Non-Newtonian fluids do not have a direct proportionality between applied shear stress and shear rate, and viscosity varies with shear rate depending on whether the fluid is pseudoplastic or dilatant. Viscosity decreases as shear rate increases for the former, whilst the reverse takes place for dilatants. Mathematical models of conventional fluids thus fail when applied to non-Newtonian fluids. The pressure derivative curve is introduced in this descriptive work for a dilatant fluid and its pattern is observed. Tiab's direct synthesis (TDS) methodology was used as a tool for interpreting pressure transient data to estimate effective permeability, skin factors and the non-Newtonian bank radius. The methodology was successfully verified by application to synthetic examples. Also, compared to pseudoplastic behavior, it was found that the radial flow regime in the Newtonian zone of dilatant fluids takes longer to form, depending on both the flow behavior index and the consistency factor.
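
    TDS-style interpretation reads reservoir parameters directly off the log-log pressure-derivative plot. The sketch below computes the semilog derivative for a synthetic radial-flow drawdown; the pressure model and all numbers are illustrative only.

```python
import numpy as np

# Synthetic drawdown test: in radial flow the pressure change grows like
# ln(t), so the semilog derivative t*dP/dt flattens to a plateau, the
# feature TDS reads off directly. All numbers are illustrative.
t = np.logspace(-2, 2, 80)               # elapsed time [h]
dP = 20.0 * np.log(t + 1.0) + 5.0        # pressure change [psi]

deriv = np.gradient(dP, np.log(t))       # t*dP/dt via d(dP)/d(ln t)

# In TDS the plateau level yields effective permeability, and intersection
# times of characteristic straight-line features yield skin and, in this
# application, the non-Newtonian bank radius.
print("late-time derivative plateau: %.1f psi" % deriv[-10:].mean())
```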

  7. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  8. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  9. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Matthew W. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  10. Mass Movement Hazards in the Mediterranean; A review on applied techniques and methodologies

    Science.gov (United States)

    Ziade, R.; Abdallah, C.; Baghdadi, N.

    2012-04-01

    Emergent population and expansions of settlements and life-lines over hazardous areas in the Mediterranean region have largely increased the impact of Mass Movements (MM) both in industrialized and developing countries. This trend is expected to continue in the next decades due to increased urbanization and development, continued deforestation and increased regional precipitation in MM-prone areas due to changing climatic patterns. Consequently, and over the past few years, monitoring of MM has acquired great importance from the scientific community as well as the civilian one. This article begins with a discussion of the MM classification, and the different topographic, geologic, hydrologic and environmental impacting factors. The intrinsic (preconditioning) variables determine the susceptibility of MM and extrinsic factors (triggering) can induce the probability of MM occurrence. The evolution of slope instability studies is charted from geodetic or observational techniques, to geotechnical field-based origins to recent higher levels of data acquisition through Remote Sensing (RS) and Geographic Information System (GIS) techniques. Since MM detection and zoning is difficult in remote areas, RS and GIS have enabled regional studies to predominate over site-based ones where they provide multi-temporal images hence facilitate greatly MM monitoring. The unusual extent of the spectrum of MM makes it difficult to define a single methodology to establish MM hazard. Since the probability of occurrence of MM is one of the key components in making rational decisions for management of MM risk, scientists and engineers have developed physical parameters, equations and environmental process models that can be used as assessment tools for management, education, planning and legislative purposes. Assessment of MM is attained through various modeling approaches mainly divided into three main sections: quantitative/Heuristic (1:2.000-1:10.000), semi-quantitative/Statistical (1…

  11. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  12. New trends in applied harmonic analysis sparse representations, compressed sensing, and multifractal analysis

    CERN Document Server

    Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula

    2016-01-01

    This volume is a selection of written notes corresponding to courses taught at the CIMPA School: "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been solved through the resolution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas, and cover both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working in related topics.

  13. An Encoding Technique for Multiobjective Evolutionary Algorithms Applied to Power Distribution System Reconfiguration

    Directory of Open Access Journals (Sweden)

    J. L. Guardado

    2014-01-01

    Full Text Available Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2 and the Nondominated Sorting Genetic Algorithm II (NSGA-II. The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  14. Automatic diameter control system applied to the laser heated pedestal growth technique

    Directory of Open Access Journals (Sweden)

    Andreeta M.R.B.

    2003-01-01

    Full Text Available We describe an automatic diameter control system (ADC) for the laser heated pedestal growth technique that reduces the diameter fluctuations in oxide fibers grown from unreacted and non-sintered pedestals to less than 2% of the average fiber diameter, and diminishes the average diameter fluctuation over the entire length of the fiber to less than 1%. The ADC apparatus is based on an artificial vision system that controls the pulling speed and the height of the molten zone within a precision of 30 μm. We also show that this system can be used for periodic in situ axial doping of the fiber. Pure and Cr3+-doped LaAlO3 and pure LiNbO3 were used as model materials.
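
    The feedback principle behind such a diameter controller can be caricatured in a few lines: a noisy vision measurement of the diameter drives an integral-action correction of the pulling speed. The growth model, gain, and noise level below are invented for illustration and are not the authors' values.

```python
import numpy as np

rng = np.random.default_rng(3)
target = 500.0          # diameter set-point [um]
speed = 1.0             # pulling speed [arb. units]
diameter = 540.0        # initial, too-thick fiber
ki = 0.002              # integral gain (invented value)

for step in range(200):
    measured = diameter + rng.normal(0.0, 2.0)   # noisy vision measurement
    speed += ki * (measured - target)            # pull faster if too thick
    # Toy mass-conservation growth model: faster pulling thins the fiber.
    diameter = 540.0 / np.sqrt(speed)

print("final diameter: %.1f um (set-point %.0f um)" % (diameter, target))
```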

  15. Models of signal validation using artificial intelligence techniques applied to a nuclear reactor

    International Nuclear Information System (INIS)

    This work presents two models of signal validation in which the analytical redundancy of the monitored signals from a nuclear plant is provided by neural networks. In one model the analytical redundancy is provided by a single neural network, while in the other it is provided by several neural networks, each one working in a specific part of the entire operation region of the plant. Four clustering techniques were tested to separate the entire operation region into several specific regions. Additional information on the systems' reliability is supplied by a fuzzy inference system. The models were implemented in C language and tested with signals acquired from the Angra I nuclear power plant, from start-up to 100% of power. (author)

  16. GPU peer-to-peer techniques applied to a cluster interconnect

    CERN Document Server

    Ammendola, Roberto; Biagioni, Andrea; Bisson, Mauro; Fatica, Massimiliano; Frezza, Ottorino; Cicero, Francesca Lo; Lonardo, Alessandro; Mastrostefano, Enrico; Paolucci, Pier Stanislao; Rossetti, Davide; Simula, Francesco; Tosoratto, Laura; Vicini, Piero

    2013-01-01

    Modern GPUs support special protocols to exchange data directly across the PCI Express bus. While these protocols could be used to reduce GPU data transmission times, basically by avoiding staging to host memory, they require specific hardware features which are not available on current generation network adapters. In this paper we describe the architectural modifications required to implement peer-to-peer access to NVIDIA Fermi- and Kepler-class GPUs on an FPGA-based cluster interconnect. In addition, we discuss the current software implementation, which integrates this feature by minimally extending the RDMA programming model, as well as some issues raised while employing it in a higher-level API like MPI. Finally, the current limits of the technique are studied by analyzing the performance improvements on low-level benchmarks and on two GPU-accelerated applications, showing when and how they seem to benefit from the GPU peer-to-peer method.

  17. Acoustic emission partial discharge detection technique applied to fault diagnosis: Case studies of generator transformers

    Directory of Open Access Journals (Sweden)

    Shanker Tangella Bhavani

    2016-01-01

    Full Text Available In power transformers, locating the partial discharge (PD) source is as important as identifying it. Acoustic Emission (AE) sensing offers a good solution for both PD detection and PD source location identification. In this paper the principle of the AE technique is discussed, along with in-situ findings of the online acoustic emission signals captured from partial discharges on a number of Generator Transformers (GT). Of the two cases discussed, the first deals with Acoustic Emission Partial Discharge (AEPD) tests on two identical transformers, and the second deals with the AEPD measurement of a transformer carried out on different occasions (years). These transformers are from a hydropower station and a thermal power station in India. Tests conducted on identical transformers provide the opportunity to compare AE signal amplitudes from the two transformers. These case studies also help in assessing the efficacy of integrating Dissolved Gas Analysis (DGA) data with AEPD test results in detecting and locating the PD source.

  18. Emerging and Innovative Techniques for Arsenic Removal Applied to a Small Water Supply System

    Directory of Open Access Journals (Sweden)

    António J. Alçada

    2009-12-01

    Full Text Available The impact of arsenic on human health has led its drinking water MCL to be drastically reduced from 50 to 10 ppb. Consequently, arsenic levels in many water supply sources have become critical. This has resulted in technical and operational impacts on many drinking water treatment plants, which have required onerous upgrading to meet the new standard. This becomes a very sensitive issue in the context of water scarcity and climate change, given the expected increasing demand on groundwater sources. This work presents a case study that describes the development of low-cost techniques for efficient arsenic control in drinking water. The results obtained at the Manteigas WTP (Portugal) demonstrate the successful implementation of an effective and flexible process of reactive filtration using iron oxide. At real scale, very high removal efficiencies of over 95% were obtained.

  19. Applying the sterile insect technique to the control of insect pests

    International Nuclear Information System (INIS)

    The sterile insect technique (SIT) is basically a novel twentieth century approach to insect birth control. It is species specific and exploits the mate seeking behaviour of the insect. The basic principle is simple. Insects are mass reared in 'factories' and sexually sterilized by gamma rays from a 60Co source. The sterile insects are then released in a controlled fashion into nature. Matings between the released sterile insects and native insects produce no progeny. If enough of these matings take place, reproduction of the pest population decreases. With continued release, the pest population can be controlled and in some cases eradicated. In the light of the many important applications of the SIT worldwide and the great potential that SIT concepts hold for insect and pest control in developing countries, two special benefits should be stressed. Of greatest significance is the fact that the SIT permits suppression and eradication of insect pests in an environmentally harmless manner. It combines nuclear techniques with genetic approaches and, in effect, replaces intensive use of chemicals in pest control. Although chemicals are used sparingly at the outset in some SIT programmes to reduce the size of the pest population before releases of sterilized insects are started, the total amount of chemicals used in an SIT programme is a mere fraction of what would be used without the SIT. It is also of great importance that the SIT is not designed strictly for the eradication of pest species but can readily be used in the suppression of insect populations. In fact, the SIT is ideally suited for use in conjunction with other agricultural pest control practices such as the use of parasites and predators, attractants and cultural controls (e.g. ploughing under or destruction of crop residues) in integrated pest management programmes to achieve control at the lowest possible price and with a minimum of chemical contamination of the environment.

  20. Contrast cancellation technique applied to digital x-ray imaging using silicon strip detectors

    International Nuclear Information System (INIS)

    Dual-energy mammographic imaging experimental tests have been performed using a compact dichromatic imaging system based on a conventional x-ray tube, a mosaic crystal, and a 384-strip silicon detector equipped with full-custom electronics with single photon counting capability. For simulating mammary tissue, a three-component phantom, made of Plexiglass, polyethylene, and water, has been used. Images have been collected with three different pairs of x-ray energies: 16-32 keV, 18-36 keV, and 20-40 keV. A Monte Carlo simulation of the experiment has also been carried out using the MCNP-4C transport code. The Alvarez-Macovski algorithm has been applied both to experimental and simulated data to remove the contrast between two of the phantom materials so as to enhance the visibility of the third one.
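
    The cancellation step amounts to a weighted subtraction of log-signals at the two energies, with the weight chosen so a selected material drops out. The attenuation coefficients and thicknesses below are placeholders, not measured values.

```python
import numpy as np

# Illustrative linear attenuation coefficients [1/cm] at the low and high
# beam energies (placeholder values, not measured data).
mu_lo = {'plexiglass': 0.55, 'polyethylene': 0.45, 'water': 0.60}
mu_hi = {'plexiglass': 0.28, 'polyethylene': 0.22, 'water': 0.30}
thick = {'plexiglass': 1.0, 'polyethylene': 1.5, 'water': 0.8}   # cm

# Log-signals of a ray crossing all three materials at the two energies.
L_lo = sum(mu_lo[m] * thick[m] for m in thick)
L_hi = sum(mu_hi[m] * thick[m] for m in thick)

# Weighted subtraction S = L_lo - w * L_hi with w chosen so that the
# contribution of the selected material (here Plexiglass) cancels exactly.
w = mu_lo['plexiglass'] / mu_hi['plexiglass']
S = L_lo - w * L_hi
print("Plexiglass-cancelled signal: %.3f" % S)
# Only polyethylene and water now contribute, enhancing their contrast.
```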

  1. Research on Key Techniques for Video Surveillance System Applied to Shipping Channel Management

    Institute of Scientific and Technical Information of China (English)

    WANG Lin; ZHUANG Yan-bin; ZHENG Cheng-zeng

    2007-01-01

    A video patrol and inspection system is an important part of the government's shipping channel information management. This system is mainly applied to video information gathering and processing as a patrol is carried out. The system described in this paper can preview, edit, and add essential explanation messages to the collected video data. It then transfers these data and messages to a video server for the leaders and engineering and technical personnel to retrieve, play, chart, download or print. Each department of the government will use the system's functions according to that department's mission. The system can provide an effective means for managing the shipping enterprise. It also provides a valuable reference for the modernizing of waterborne shipping.

  2. Linear and Non-Linear Control Techniques Applied to Actively Lubricated Journal Bearings

    DEFF Research Database (Denmark)

    Nicoletti, Rodrigo; Santos, Ilmar

    2003-01-01

    The main objectives of actively lubricated bearings are the simultaneous reduction of wear and vibration between rotating and stationary machinery parts. For reducing wear and dissipating vibration energy up to certain limits, one can count on conventional hydrodynamic lubrication. For further reduction of shaft vibrations one can count on the active lubrication action, which is based on injecting pressurised oil into the bearing gap through orifices machined in the bearing sliding surface. The design and efficiency of some linear (PD, PI and PID) and non-linear controllers, applied to a tilting-pad journal bearing, are analysed and discussed. Important conclusions about the application of integral controllers, responsible for changing the rotor-bearing equilibrium position and consequently the "passive" oil film damping coefficients, are achieved. Numerical results show an effective…

  3. Data Mining Techniques: A Source for Consumer Behavior Analysis

    OpenAIRE

    Abhijit Raorane; R.V. Kulkarni

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, data mining methods have disadvantages as well as advantages; therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior, the consumer's psychological condition at the time of purchase, and how suitable data mining methods apply...

  4. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    Full Text Available In the last decades, critical systems have increasingly been developed using computers and software, even in the space sector, where the project approach is usually very conservative. In projects of rockets, satellites and their facilities, such as ground support systems, simulators, and other systems critical to the space mission, a hazard analysis must be applied. The ELICERE process was created to perform hazard analysis mainly on computer-based critical systems, in order to define or evaluate their safety and dependability requirements, strongly based on the Hazard and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or to understand the potential hazards of existing systems, improving their functions related to functional or non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. At the beginning, the process was designed to be carried out manually in a gradual way. A software tool called PRO-ELICERE has now been developed to facilitate the analysis process and store the results for reuse in other system analyses. To illustrate how ELICERE and its tool work, a small space case study was carried out, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  5. Complementary testing techniques applied to obtain the freeze-thaw resistance of concrete

    OpenAIRE

    Romero, H. L.; Enfedaque, A.; Gálvez, J. C.; Casati, M. J.

    2015-01-01

    Most of the standards that evaluate the resistance of concrete against freeze-thaw cycles (FTC) are based on the loss of weight due to scaling. Such procedures are useful but do not provide information about the microstructural deterioration of the concrete. The test procedure needs to be stopped after several FTCs for weighing the loss of material by scaling. This paper proposes the use of mercury-intrusion-porosimetry and thermogravimetric analysis for assessing the microstructural damage o...

  6. Can Artificial Neural Networks be Applied in Seismic Prediction? Preliminary Analysis Applying Radial Topology. Case: Mexico

    CERN Document Server

    Mota-Hernandez, Cinthya; Alvarado-Corona, Rafael

    2014-01-01

    Tectonic earthquakes of high magnitude can cause considerable losses in terms of human lives, economy and infrastructure, among others. According to an evaluation published by the U.S. Geological Survey, 30 earthquakes have greatly impacted Mexico from the end of the XIX century to the present one. Based upon data from the National Seismological Service, in the period between January 1, 2006 and May 1, 2013 there occurred 5,826 earthquakes whose magnitude was greater than 4.0 degrees on the Richter magnitude scale (25.54% of the total of earthquakes registered on the national territory), the Pacific Plate and the Cocos Plate being the most important ones. This document describes the development of an Artificial Neural Network (ANN) based on radial topology which seeks to generate a prediction, with an error margin lower than 20%, that can inform about the probability of a future earthquake. One of the main questions is: can artificial neural networks be applied in seismic forecast...

  7. THE RESEARCH TECHNIQUES FOR ANALYSIS OF MECHANICAL AND TRIBOLOGICAL PROPERTIES OF COATING-SUBSTRATE SYSTEMS

    OpenAIRE

    Kinga CHRONOWSKA-PRZYWARA; Marcin KOT; Sławomir ZIMOWSKI

    2014-01-01

    The article presents research techniques for the analysis of both the mechanical and tribological properties of thin coatings applied to highly loaded machine elements. In the Institute of Machine Design and Exploitation, AGH University of Science and Technology, students of the second level of Mechanical Engineering study tribology in laboratory classes. Students learn techniques for the mechanical and tribological testing of thin, hard coatings deposited by PVD and CVD technologies. The prog...

  8. Application of slip-band visualization technique to tensile analysis of laser-welded aluminum alloy

    OpenAIRE

    Muchiar, Ir.; Yoshida, S.; Widiastuti, R.; Kusnovo, A.; Takahashi, K; Sato, S.

    1996-01-01

    Recently we have developed a new optical interferometric technique capable of visualizing slip band occurring in a deforming solid-state object. In this work we applied this technique to a tensile analysis of laser-welded aluminum plate samples, and successfully revealed stress concentration that shows strong relationships with the tensile strength and the fracture mechanism. We believe that this method is a new, convenient way to analyze the deformation characteristics of welded objects and ...

  9. Analysis and calibration techniques for superconducting resonators

    Science.gov (United States)

    Cataldo, Giuseppe; Wollack, Edward J.; Barrentine, Emily M.; Brown, Ari D.; Moseley, S. Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.
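
    The first analysis approach, fitting the observed response with a physically realizable rational function, can be illustrated with a single notch-type resonance model fitted to complex transmission data by least squares. The model form and all parameter values below are generic assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import least_squares

# Generic notch-type resonance model for the complex transmission
# (an assumption, not the paper's exact formulation):
#   S21(f) = 1 - (Q/Qc) / (1 + 2j * Q * (f - f0) / f0)
def s21(p, f):
    f0, Q, Qc = p
    return 1.0 - (Q / Qc) / (1.0 + 2j * Q * (f - f0) / f0)

f0_true, Q_true, Qc_true = 5.0e9, 2.0e4, 4.0e4
f = np.linspace(f0_true - 2e6, f0_true + 2e6, 401)

rng = np.random.default_rng(4)
data = s21((f0_true, Q_true, Qc_true), f)
data += 0.002 * (rng.normal(size=f.size) + 1j * rng.normal(size=f.size))

# Fit the complex response by stacking real and imaginary residuals.
def resid(p):
    r = s21(p, f) - data
    return np.concatenate([r.real, r.imag])

fit = least_squares(resid, x0=(5.0e9 + 1e5, 1.0e4, 1.0e4), x_scale='jac')
print("fitted f0, Q, Qc:", fit.x)
```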

  10. Use of pesticides and experience of applying radioisotope techniques in a developing country

    International Nuclear Information System (INIS)

    An evaluation is made of the use of pesticides by Panamanian farmers in a tropical environment, also covering pesticide residues in plant and animal products, man and soil. In addition, experience with radioisotope techniques is described. Chemical control is common practice among farmers. Each year, 5000 to 6000 t of pesticides are used, especially in horticulture and banana cultivation. Herbicides and insecticides predominate in terms of quantity, and fungicides in terms of frequency of application. Use of the so-called persistent organo-chlorines over the last few decades has led to the presence of residues in plant and animal products in amounts less than 2.2 and 0.1 mg/kg for DDT and lindane, respectively. An average of 11 mg of DDT per kilogram of fat has been detected in the population; about 50% of the persons handling agrochemicals showed direct exposure. Taking into account local practices and tropical conditions, an evaluation is being made of widely used pesticides (maneb, paraquat and 2,4-D) labelled with 14C. The studies have yielded additional information on the behaviour and the residues of these important additives in the environment and in fruits. (author). 3 refs, 1 fig., 5 tabs

  11. Super-ensemble techniques applied to wave forecast: performance and limitations

    Directory of Open Access Journals (Sweden)

    F. Lenartz

    2010-06-01

    Full Text Available Nowadays, several operational ocean wave forecasts are available for the same region. These predictions may differ considerably, and choosing the best one is generally a difficult task. The super-ensemble approach, which consists of merging different forecasts and past observations into a single multi-model prediction system, is evaluated in this study. During the DART06 campaigns organized by the NATO Undersea Research Centre, four wave forecasting systems were simultaneously run in the Adriatic Sea, and significant wave height was measured at six stations as well as along the tracks of two remote sensors. This effort provided the necessary data set to compare the skills of various multi-model combination techniques. Our results indicate that a super-ensemble based on the Kalman filter improves the forecast skill: the bias during both the hindcast and forecast periods is reduced, and the correlation coefficient is similar to that of the best individual model. The spatial extrapolation of local results is not straightforward and requires further investigation to be properly implemented.
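
    The sketch below illustrates, with synthetic significant-wave-height data, one way a Kalman filter can drive such a super-ensemble: the filter state is the vector of model weights, updated whenever an observation arrives. The linear-combination form and the noise levels q and r are assumptions for illustration, not the study's implementation.

      # Kalman-filter super-ensemble sketch on synthetic wave-height data.
      import numpy as np

      rng = np.random.default_rng(0)
      n_models, n_obs = 4, 200
      truth = 1.5 + 0.5 * np.sin(np.linspace(0, 8, n_obs))     # "true" Hs (m)
      spread = np.array([[0.1], [0.2], [0.3], [0.4]])          # per-model error levels
      forecasts = truth[None, :] + rng.normal(0.0, spread, (n_models, n_obs))
      obs = truth + rng.normal(0.0, 0.05, n_obs)               # buoy measurements

      w = np.full(n_models, 1.0 / n_models)    # state: model weights
      P = np.eye(n_models)                     # weight covariance
      q, r = 1e-4, 0.05 ** 2                   # assumed process/observation noise

      for t in range(n_obs):
          h = forecasts[:, t]
          P += q * np.eye(n_models)            # random-walk model for the weights
          k = P @ h / (h @ P @ h + r)          # Kalman gain
          w += k * (obs[t] - h @ w)            # innovation update
          P -= np.outer(k, h) @ P

      print("learned weights:", np.round(w, 3))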

  12. Applying satellite remote sensing technique in disastrous rainfall systems around Taiwan

    Science.gov (United States)

    Liu, Gin-Rong; Chen, Kwan-Ru; Kuo, Tsung-Hua; Liu, Chian-Yi; Lin, Tang-Huang; Chen, Liang-De

    2016-05-01

    Many regions in Asia suffer disastrous rainfall year after year. The rainfall from typhoons or tropical cyclones (TCs) is one of their key water supply sources, but such TCs may also bring unexpected heavy rainfall, causing flash floods, mudslides or other disasters. Present techniques cannot stop or alter a TC's route or intensity. However, we could significantly mitigate heavy casualties and economic losses if we could detect a TC's formation earlier and estimate its rainfall amount and distribution more accurately before landfall. In light of these problems, this short article presents methods to detect a TC's formation earlier and to delineate its rainfall potential pattern more accurately in advance. For the first part, satellite-retrieved air-sea parameters are used to estimate the thermal and dynamic energy fields and their variation over the open ocean, delineating the ocean areas and cloud clusters where typhoon formation is most likely. For the second part, an improved tropical rainfall potential (TRaP) model is proposed with better assumptions than the original TRaP for TC rainfall band rotation, rainfall amount estimation, and topographic effect correction, to obtain more accurate TC rainfall distributions, especially for hilly and mountainous areas such as Taiwan.
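
    The sketch below shows the core of the TRaP idea in its simplest form: the rainfall potential at a point is the mean rain rate along the storm-motion line times the time the rain shield takes to cross that point. The numbers are illustrative; the improved model adds rainband rotation and topographic corrections not reproduced here.

      # Simplified tropical rainfall potential (TRaP) estimate; values illustrative.
      def trap_accumulation_mm(mean_rain_rate_mm_h, rain_shield_extent_km, storm_speed_km_h):
          """Rainfall potential (mm) = rain rate x crossing time (extent / speed)."""
          crossing_time_h = rain_shield_extent_km / storm_speed_km_h
          return mean_rain_rate_mm_h * crossing_time_h

      # A 300 km rain shield at 12 mm/h moving at 20 km/h yields ~180 mm.
      print(trap_accumulation_mm(12.0, 300.0, 20.0))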

  13. Applying stereotactic injection technique to study genetic effects on animal behaviors.

    Science.gov (United States)

    McSweeney, Colleen; Mao, Yingwei

    2015-05-10

    Stereotactic injection is a useful technique to deliver high titer lentiviruses to targeted brain areas in mice. Lentiviruses can either overexpress or knockdown gene expression in a relatively focused region without significant damage to the brain tissue. After recovery, the injected mouse can be tested on various behavioral tasks such as the Open Field Test (OFT) and the Forced Swim Test (FST). The OFT is designed to assess locomotion and the anxious phenotype in mice by measuring the amount of time that a mouse spends in the center of a novel open field. A more anxious mouse will spend significantly less time in the center of the novel field compared to controls. The FST assesses the anti-depressive phenotype by quantifying the amount of time that mice spend immobile when placed into a bucket of water. A mouse with an anti-depressive phenotype will spend significantly less time immobile compared to control animals. The goal of this protocol is to use the stereotactic injection of a lentivirus in conjunction with behavioral tests to assess how genetic factors modulate animal behaviors.

  14. Dosimetric properties of bio minerals applied to high-dose dosimetry using the TSEE technique

    Energy Technology Data Exchange (ETDEWEB)

    Vila, G. B.; Caldas, L. V. E., E-mail: gbvila@ipen.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)

    2014-08-15

    The study of dosimetric properties such as reproducibility, residual signal, lower detection dose, dose-response curve and fading of the thermally stimulated exoelectron emission (TSEE) signal of Brazilian bio minerals has shown that these materials have potential for use as radiation dosimeters. The reproducibility within ± 10% for oyster shell, mother-of-pearl and coral reef samples showed that the signal dispersion is small when compared with the mean value of the measurements. The study showed that the residual signal can be eliminated with a thermal treatment at 300 °C for 1 h. The lower detection dose of 9.8 Gy determined for the oyster shell samples when exposed to beta radiation and 1.6 Gy for oyster shell and mother-of-pearl samples when exposed to gamma radiation can be considered good, taking into account the high doses of this study. The materials presented linearity in the dose-response curves over some ranges, but the lack of linearity in other cases presents no problem since a good mathematical description is possible. The fading study showed that the loss of TSEE signal can be minimized if the samples are protected from interferences such as light, heat and humidity. Taking into account the useful linearity range as the main dosimetric characteristic, the tiger shell and oyster shell samples are the most suitable for high-dose dosimetry using the TSEE technique. (Author)

  15. COMPARATIVE PERFORMANCE MONITORING OF RAINFED WATERSHEDS APPLYING GIS AND RS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    ARUN W. DHAWALE

    2012-03-01

    Full Text Available Under the watershed development project of the Ministry of Rural Development, many micro watersheds have been identified for development and management. However, the Government is handicapped in obtaining data on the performance of these programmes due to the absence of watershed performance studies. Rainfed agriculture is clearly critical to agricultural performance in India. Nonetheless, it is difficult to precisely quantify the overall importance of the sector. The widely quoted statistic is that 70% of cultivated area is rainfed, implying that rainfed agriculture is more important than irrigated agriculture. In the present study two rainfed micro-watersheds, namely Kolvan valley and Darewadi, are taken as case studies for performance monitoring using GIS and RS techniques. An attempt has been made to highlight the role of GIS and RS in the estimation of runoff from both watersheds by the SCS curve number method. The methodology developed for the research shows that the knowledge extracted from the proposed approach can remove the problem of performance monitoring of micro watersheds to a great extent. A comparison of both micro watersheds, which have extreme rainfall conditions, shows that the Darewadi micro watershed has a higher overall success rate than Kolvan valley.
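
    For reference, a minimal sketch of the SCS curve number relation used for the runoff estimation follows; the curve number is an illustrative placeholder, not a value derived for Kolvan valley or Darewadi.

      # SCS curve number runoff sketch (depths in mm).
      def scs_runoff_mm(rainfall_mm, curve_number, ia_ratio=0.2):
          """Direct runoff Q from storm rainfall P via the SCS-CN relation."""
          s = 25400.0 / curve_number - 254.0    # potential maximum retention (mm)
          ia = ia_ratio * s                     # initial abstraction
          if rainfall_mm <= ia:
              return 0.0
          return (rainfall_mm - ia) ** 2 / (rainfall_mm - ia + s)

      print(scs_runoff_mm(75.0, 80))            # ~31 mm of runoff from a 75 mm storm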

  16. Data smoothing techniques applied to proton microprobe scans of teleost hard parts

    International Nuclear Information System (INIS)

    We use a proton microprobe to examine the distribution of elements in otoliths and scales of teleost (bony) fish. The elements of principal interest are calcium and strontium in otoliths and calcium and fluorine in scales. Changes in the distribution of these elements across hard structures may allow inferences about the life histories of fish. Otoliths and scales of interest are up to a centimeter in linear dimension, and to reveal the structures of interest up to 200 sampling points are required in each dimension. The time needed to accumulate high X-ray counts at each sampling point can be large, particularly for strontium. To reduce microprobe usage we use data smoothing techniques to reveal changing patterns from modest X-ray count accumulations at individual data points. In this paper we review the performance, at modest levels of X-ray count accumulation, of a selection of digital filters (moving average smoothers), running median filters, robust locally weighted regression filters and adaptive spline filters. (author)
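
    The sketch below illustrates two of the smoother families compared in the paper, a moving-average digital filter and a running-median filter, applied to a synthetic scan with Poisson counting noise; window sizes and the test profile are arbitrary choices.

      # Moving-average and running-median smoothing of a noisy microprobe scan.
      import numpy as np
      from scipy.signal import medfilt

      def moving_average(counts, window=5):
          """Boxcar smoother; an odd window keeps the scan aligned."""
          kernel = np.ones(window) / window
          return np.convolve(counts, kernel, mode="same")

      rng = np.random.default_rng(1)
      true_profile = 100 + 40 * np.exp(-np.linspace(-3, 3, 200) ** 2)
      counts = rng.poisson(true_profile)        # modest accumulations -> noisy
      smooth_ma = moving_average(counts, window=9)
      smooth_med = medfilt(counts.astype(float), kernel_size=9)   # robust to spikes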

  17. Experimental studies of active and passive flow control techniques applied in a twin air-intake.

    Science.gov (United States)

    Paul, Akshoy Ranjan; Joshi, Shrey; Jindal, Aman; Maurya, Shivam P; Jain, Anuj

    2013-01-01

    The flow control in twin air-intakes is necessary to improve the performance characteristics, since the flow traveling through curved and diffused paths becomes complex, especially after merging. The paper presents a comparison between two well-known techniques of flow control: active and passive. It presents an effective design of a vortex generator jet (VGJ) and a vane-type passive vortex generator (VG) and uses them in a twin air-intake duct in different combinations to establish their effectiveness in improving the performance characteristics. The VGJ is designed to inject flow from the side wall at pitch angles of 90 degrees and 45 degrees. Corotating (parallel) and counter-rotating (V-shape) configurations of the vane-type VG are used. It is observed that the VGJ has the potential to change the flow pattern drastically as compared to the vane-type VG. When the VGJ is directed perpendicular to the side walls of the air-intake at a pitch angle of 90 degrees, static pressure recovery is increased by 7.8% and total pressure loss is reduced by 40.7%, which is the best among all cases tested for the VGJ. For the bigger-sized VG attached to the side walls of the air-intake, static pressure recovery is increased by 5.3%, but total pressure loss is reduced by only 4.5% as compared to all other cases of VG.

  18. Correlation Techniques as Applied to Pose Estimation in Space Station Docking

    Science.gov (United States)

    Rollins, J. Michael; Juday, Richard D.; Monroe, Stanley E., Jr.

    2002-01-01

    The telerobotic assembly of space-station components has become the method of choice for the International Space Station (ISS) because it offers a safe alternative to the more hazardous option of space walks. The disadvantage of telerobotic assembly is that it does not provide for direct arbitrary views of mating interfaces for the teleoperator. Unless cameras are present very close to the interface positions, such views must be generated graphically, based on calculated pose relationships derived from images. To assist in this photogrammetric pose estimation, circular targets, or spots, of high contrast have been affixed on each connecting module at carefully surveyed positions. The appearance of a subset of spots essentially must form a constellation of specific relative positions in the incoming digital image stream in order for the docking to proceed. Spot positions are expressed in terms of their apparent centroids in an image. The precision of centroid estimation is required to be as fine as 1/20th of a pixel in some cases. This paper presents an approach to spot centroid estimation using cross correlation between spot images and synthetic spot models of precise centration. Techniques for obtaining sub-pixel accuracy and for shadow, obscuration and lighting irregularity compensation are discussed.
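
    The sketch below illustrates the correlation approach described above: the image is cross-correlated with a synthetic spot model and the correlation peak is refined to sub-pixel precision by parabolic interpolation. The Gaussian spot model and sizes are assumptions, not the flight implementation.

      # Correlation-based spot centroid with sub-pixel parabolic refinement.
      import numpy as np
      from scipy.signal import correlate2d

      def gaussian_spot(size, cx, cy, sigma=2.0):
          y, x = np.mgrid[0:size, 0:size]
          return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

      image = gaussian_spot(32, 15.3, 16.7)     # "observed" spot
      template = gaussian_spot(9, 4.0, 4.0)     # synthetic model, centered
      corr = correlate2d(image, template, mode="same")
      iy, ix = np.unravel_index(np.argmax(corr), corr.shape)

      def parabolic_offset(cm, c0, cp):
          """Sub-pixel peak offset from three samples around the maximum."""
          return 0.5 * (cm - cp) / (cm - 2 * c0 + cp)

      dx = parabolic_offset(corr[iy, ix - 1], corr[iy, ix], corr[iy, ix + 1])
      dy = parabolic_offset(corr[iy - 1, ix], corr[iy, ix], corr[iy + 1, ix])
      print(f"centroid: x = {ix + dx:.2f}, y = {iy + dy:.2f}")   # ~ (15.3, 16.7)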

  19. Experiences in applying optimization techniques to configurations for the Control of Flexible Structures (COFS) program

    Science.gov (United States)

    Walsh, Joanne L.

    1989-01-01

    Optimization procedures are developed to systematically provide closely-spaced vibration frequencies. A general purpose finite-element program for eigenvalue and sensitivity analyses is combined with formal mathematical programming techniques. Results are presented for three studies. The first study uses a simple model to obtain a design with two pairs of closely-spaced frequencies. Two formulations are developed: an objective-function-based formulation and a constraint-based formulation for the frequency spacing. It is found that conflicting goals are handled better by a constraint-based formulation. The second study uses a detailed model to obtain a design with one pair of closely-spaced frequencies while satisfying requirements on local member frequencies and manufacturing tolerances. Two formulations are developed. Both the constraint-based and the objective-function-based formulations perform reasonably well and converge to the same results. However, no feasible design solution exists which satisfies all design requirements for the choices of design variables and the upper and lower design variable values used. More design freedom is needed to achieve a fully satisfactory design. The third study is part of a redesign activity in which a detailed model is used.

  20. Applying advanced imaging techniques to a murine model of orthotopic osteosarcoma

    Directory of Open Access Journals (Sweden)

    Matthew Lawrence Broadhead

    2015-08-01

    Full Text Available Introduction: Reliable animal models are required to evaluate novel treatments for osteosarcoma. In this study, the aim was to implement advanced imaging techniques in a murine model of orthotopic osteosarcoma to improve disease modeling and the assessment of primary and metastatic disease. Materials and methods: Intra-tibial injection of luciferase-tagged OPGR80 murine osteosarcoma cells was performed in Balb/c nude mice. Treatment agent (pigment epithelium-derived factor; PEDF) was delivered to the peritoneal cavity. Primary tumors and metastases were evaluated by in vivo bioluminescent assays, micro-computed tomography, [18F]-Fluoride-PET and [18F]-FDG-PET. Results: [18F]-Fluoride-PET was more sensitive than [18F]-FDG-PET for detecting early disease. Both [18F]-Fluoride-PET and [18F]-FDG-PET showed progressive disease in the model, with 4-fold and 2-fold increases in SUV (p<0.05) by the study endpoint, respectively. In vivo bioluminescent assay showed that systemically delivered PEDF inhibited growth of primary osteosarcoma. Discussion: Application of [18F]-Fluoride-PET and [18F]-FDG-PET to an established murine model of orthotopic osteosarcoma has improved the assessment of disease. The use of targeted imaging should prove beneficial for the evaluation of new approaches to osteosarcoma therapy.

  1. Blade Displacement Measurement Technique Applied to a Full-Scale Rotor Test

    Science.gov (United States)

    Abrego, Anita I.; Olson, Lawrence E.; Romander, Ethan A.; Barrows, Danny A.; Burner, Alpheus W.

    2012-01-01

    Blade displacement measurements using multi-camera photogrammetry were acquired during the full-scale wind tunnel test of the UH-60A Airloads rotor, conducted in the National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel. The objectives were to measure the blade displacement and deformation of the four rotor blades as they rotated through the entire rotor azimuth. These measurements are expected to provide a unique dataset to aid in the development and validation of rotorcraft prediction techniques. They are used to resolve the blade shape and position, including pitch, flap, lag and elastic deformation. Photogrammetric data encompass advance ratios from 0.15 to slowed rotor simulations of 1.0, thrust coefficient to rotor solidity ratios from 0.01 to 0.13, and rotor shaft angles from -10.0 to 8.0 degrees. An overview of the blade displacement measurement methodology and system development, descriptions of image processing, uncertainty considerations, preliminary results covering static and moderate advance ratio test conditions and future considerations are presented. Comparisons of experimental and computational results for a moderate advance ratio forward flight condition show good trend agreements, but also indicate significant mean discrepancies in lag and elastic twist. Blade displacement pitch measurements agree well with both the wind tunnel commanded and measured values.

  2. A Morphing Technique Applied to Lung Motions in Radiotherapy: Preliminary Results

    Directory of Open Access Journals (Sweden)

    R. Laurent

    2010-01-01

    Full Text Available Organ motion leads to dosimetric uncertainties during a patient's treatment. Much work has been done to quantify the dosimetric effects of lung movement during radiation treatment. There is a particular need for a good description and prediction of organ motion. To describe lung motion more precisely, we have examined the possibility of using a computer technique: a morphing algorithm. Morphing is an iterative method which consists of blending one image into another image. To evaluate the use of morphing, four-dimensional computed tomography (4DCT) acquisition of a patient was performed. The lungs were automatically segmented for different phases, and morphing was performed using the end-inspiration and the end-expiration phase scans only. Intermediate morphing files were compared with 4DCT intermediate images. The results showed good agreement between morphing images and 4DCT images: fewer than 2% of the 512 by 256 voxels were wrongly classified as belonging/not belonging to a lung section. This paper presents preliminary results, and our morphing algorithm needs improvement. We can infer that morphing offers considerable advantages in terms of radiation protection of the patient during the diagnosis phase, handling of artifacts, definition of organ contours and description of organ motion.
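
    Because the abstract does not specify the morphing algorithm itself, the sketch below shows one simple stand-in for generating intermediate lung shapes between the end-inspiration and end-expiration masks: blending signed distance maps and re-thresholding. A full morphing algorithm would also warp image intensities; this is not the authors' method.

      # Shape interpolation between two lung masks via signed distance maps.
      import numpy as np
      from scipy.ndimage import distance_transform_edt

      def signed_distance(mask):
          """Positive inside the structure, negative outside."""
          return distance_transform_edt(mask) - distance_transform_edt(~mask)

      def intermediate_mask(mask_a, mask_b, t):
          """Blend the signed distance maps and re-threshold (0 <= t <= 1)."""
          d = (1.0 - t) * signed_distance(mask_a) + t * signed_distance(mask_b)
          return d > 0

      # Toy masks: two discs standing in for two respiratory phases.
      y, x = np.mgrid[0:64, 0:64]
      inspiration = (x - 32) ** 2 + (y - 32) ** 2 < 28 ** 2
      expiration = (x - 32) ** 2 + (y - 32) ** 2 < 18 ** 2
      halfway = intermediate_mask(inspiration, expiration, 0.5)  # compare with 4DCT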

  3. Modern Chemistry Techniques Applied to Metal Behavior and Chelation in Medical and Environmental Systems ? Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Sutton, M; Andresen, B; Burastero, S R; Chiarappa-Zucca, M L; Chinn, S C; Coronado, P R; Gash, A E; Perkins, J; Sawvel, A M; Szechenyi, S C

    2005-02-03

    This report details the research and findings generated over the course of a 3-year research project funded by the Lawrence Livermore National Laboratory (LLNL) Laboratory Directed Research and Development (LDRD) program. Originally tasked with studying beryllium chemistry and chelation for the treatment of Chronic Beryllium Disease and environmental remediation of beryllium-contaminated environments, this work has yielded results in beryllium and uranium solubility and speciation associated with toxicology; specific and effective chelation agents for beryllium, capable of lowering beryllium tissue burden and increasing urinary excretion in mice, and dissolution of beryllium contamination at LLNL Site 300; {sup 9}Be NMR studies not previously performed at LLNL; secondary ion mass spectrometry (SIMS) imaging of beryllium in spleen and lung tissue; and beryllium interactions with aerogel/GAC material for environmental cleanup. The results show that chelator development using modern chemical techniques, such as chemical thermodynamic modeling, was successful in identifying and utilizing tried and tested beryllium chelators for use in medical and environmental scenarios. Additionally, a study of uranium speciation in simulated biological fluids identified uranium species present in urine, gastric juice, pancreatic fluid, airway surface fluid, simulated lung fluid, bile, saliva, plasma, interstitial fluid and intracellular fluid.

  4. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication
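
    The sketch below computes the kind of wideband spectrogram such an analysis relies on, using a short analysis window for good time resolution; the sampling rate, window length and the synthetic chewing burst are assumptions for illustration.

      # Wideband spectrogram of a synthetic chewing burst.
      import numpy as np
      from scipy.signal import spectrogram

      fs = 16000                                   # sampling rate (Hz), assumed
      t = np.arange(0, 0.5, 1 / fs)
      rng = np.random.default_rng(2)
      burst = rng.normal(size=t.size) * np.exp(-80 * np.abs(t - 0.1))

      # ~3 ms window, analogous to a wideband speech spectrograph setting.
      f, times, sxx = spectrogram(burst, fs=fs, nperseg=48, noverlap=24)
      print(f"energy resolved up to {f[-1]:.0f} Hz in {len(times)} frames")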

  5. Linear and non-linear control techniques applied to actively lubricated journal bearings

    Science.gov (United States)

    Nicoletti, R.; Santos, I. F.

    2003-03-01

    The main objectives of actively lubricated bearings are the simultaneous reduction of wear and vibration between rotating and stationary machinery parts. For reducing wear and dissipating vibration energy up to certain limits, one can use conventional hydrodynamic lubrication. For further reduction of shaft vibrations one can use the active lubrication action, which is based on injecting pressurized oil into the bearing gap through orifices machined in the bearing sliding surface. The design and efficiency of some linear (PD, PI and PID) and a non-linear controller, applied to a tilting-pad journal bearing, are analysed and discussed. Important conclusions about the application of integral controllers, responsible for changing the rotor-bearing equilibrium position and consequently the "passive" oil film damping coefficients, are achieved. Numerical results show an effective vibration reduction of the unbalance response of a rigid rotor, where the PD and the non-linear P controllers show better performance for the frequency range of study (0-80 Hz). The feasibility of eliminating rotor-bearing instabilities (whirl phenomena) by using active lubrication is also investigated, illustrating clearly one of its most promising applications.
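
    For readers unfamiliar with the linear controllers compared above, a minimal discrete PID sketch follows; the gains, sampling time and the reading of the control signal as a servo-valve command are placeholders, not the tuned values from the study.

      # Minimal discrete PID controller sketch.
      class PID:
          def __init__(self, kp, ki, kd, dt):
              self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
              self.integral = 0.0
              self.prev_error = 0.0

          def update(self, error):
              """Return the control signal (e.g. a servo-valve command) for one step."""
              self.integral += error * self.dt
              derivative = (error - self.prev_error) / self.dt
              self.prev_error = error
              return self.kp * error + self.ki * self.integral + self.kd * derivative

      controller = PID(kp=2.0, ki=0.5, kd=0.01, dt=1e-3)
      u = controller.update(error=0.1)   # error = measured shaft displacement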

  6. A non-intrusive measurement technique applying CARS for concentration measurement in a gas mixing flow

    CERN Document Server

    Yamamoto, Ken; Moriya, Madoka; Kuriyama, Reiko; Sato, Yohei

    2015-01-01

    A coherent anti-Stokes Raman scattering (CARS) microscope system was built and applied to non-intrusive gas concentration measurement of a mixing flow in a millimeter-scale channel. Carbon dioxide and nitrogen were chosen as test fluids and CARS signals from the fluids were generated by adjusting the wavelengths of the Pump and the Stokes beams. The generated CARS signals, whose wavelengths differ from those of the Pump and the Stokes beams, were captured by an EM-CCD camera after filtering out the excitation beams. A calibration experiment was performed in order to confirm the applicability of the built-up CARS system by measuring the intensity of the CARS signal from known concentrations of the samples. After confirming that the measured CARS intensity was proportional to the second power of the concentration, as theoretically predicted, the CARS intensities in the gas mixing flow channel were measured. Ten different measurement points were set and concentrations of both carbon dioxide and nitrog...
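
    The sketch below illustrates the calibration logic described above: fit the measured CARS intensity against the square of known concentrations, then invert the quadratic relation for unknown mixtures. All numbers are synthetic.

      # Quadratic CARS calibration: I = a * C**2, then inversion.
      import numpy as np

      conc = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])     # known mole fractions
      intensity = 950.0 * conc ** 2 + np.random.default_rng(3).normal(0, 5, conc.size)

      a = np.linalg.lstsq(conc[:, None] ** 2, intensity[:, None], rcond=None)[0][0, 0]

      def concentration_from_cars(i_meas):
          """Invert I = a * C**2 (I is predicted to be quadratic in C)."""
          return np.sqrt(max(i_meas, 0.0) / a)

      print(concentration_from_cars(238.0))               # ~0.5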

  7. Shopping For Danger: E-commerce techniques applied to collaboration in cyber security

    Energy Technology Data Exchange (ETDEWEB)

    Bruce, Joseph R.; Fink, Glenn A.

    2012-05-24

    Collaboration among cyber security analysts is essential to a successful protection strategy on the Internet today, but it is uncommonly practiced or encouraged in operating environments. Barriers to productive collaboration often include data sensitivity, the time and effort to communicate, institutional policy, and protection of domain knowledge. We propose an ambient collaboration framework, Vulcan, designed to remove the barriers of time and effort and mitigate the others. Vulcan automates data collection, collaborative filtering, and asynchronous dissemination, eliminating the effort implied by explicit collaboration among peers. We instrumented two analytic applications and performed a mock analysis session to build a dataset and test the output of the system.

  8. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically. The effici...
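
    A minimal sketch of the Random Decrement idea follows: averaging response segments that start at a level-crossing trigger yields a free-decay-like signature from ambient vibration data, from which frequency and damping can then be fitted. The single-degree-of-freedom simulation and trigger choice are illustrative assumptions.

      # Random Decrement signature from a synthetic ambient-vibration record.
      import numpy as np

      def random_decrement(y, trigger_level, seg_len):
          """Average all segments of length seg_len starting at up-crossings."""
          idx = np.where((y[:-1] < trigger_level) & (y[1:] >= trigger_level))[0]
          idx = idx[idx + seg_len < y.size]
          return np.mean([y[i:i + seg_len] for i in idx], axis=0)

      rng = np.random.default_rng(4)
      dt, wn, zeta = 0.01, 2 * np.pi, 0.02     # lightly damped 1 Hz oscillator
      y = np.zeros(60000)
      v = 0.0
      for k in range(1, y.size):               # crude Euler simulation
          a = -2 * zeta * wn * v - wn ** 2 * y[k - 1] + rng.normal(0, 5)
          v += a * dt
          y[k] = y[k - 1] + v * dt

      rd = random_decrement(y, trigger_level=y.std(), seg_len=500)
      # rd resembles a free decay; modal parameters follow from curve fitting.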

  9. Surveillance of the nuclear instrumentation by a noise analysis technique

    International Nuclear Information System (INIS)

    The nuclear sensors used in the protection channels of a nuclear reactor have to be tested periodically. A method has been developed to estimate the state of this kind of sensor. The proposed method applies to boron ionization chambers. The principle of this technique is based on the calculation of a specific parameter named a "descriptor", using a simple signal processing technique. A modification of this parameter indicates a degradation of the static and dynamic performances of the sensor. Different applications of the technique in a nuclear power plant are given

  10. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  11. Techniques that Link Extreme Events to the Large Scale, Applied to California Heat Waves

    Science.gov (United States)

    Grotjahn, R.

    2015-12-01

    Understanding the mechanisms by which California Central Valley (CCV) summer extreme hot spells develop is very important, since these events have major impacts on the economy and human safety. Results from a series of CCV heat wave studies will be presented, emphasizing the techniques used. Key larger-scale elements are identified statistically that are also consistent with synoptic and dynamic understanding of what must be present during extreme heat. Beyond providing a clear synoptic explanation, these key elements have high predictability, in part because soil moisture has little annual variation in the heavily-irrigated CCV. In turn, the predictability naturally leads to an effective tool to assess climate model simulation of these heat waves in historical and future climate scenarios. (Does the model develop extreme heat for the correct reasons?) Further work identified that these large scale elements arise in two quite different ways: one from the southwestward expansion of a pre-existing heat wave in southwest Canada, the other formed in place by parcels traversing the North Pacific. The pre-existing heat wave explains an early result showing correlation between heat waves in Sacramento, California, and other locations along the US west coast, including distant Seattle, Washington. CCV heat waves can be preceded by unusually strong tropical Indian Ocean and Indonesian convection; this partial link may occur through an Asian subtropical jet waveguide. Another link revealed by diagnostics is a middle- and higher-latitude source of wave activity in Siberia and East Asia that also leads to the development of the CCV heat wave. This talk will address as many of these results and the tools used to obtain them as is reasonable within the available time.

  12. BiasMDP: Carrier lifetime characterization technique with applied bias voltage

    Energy Technology Data Exchange (ETDEWEB)

    Jordan, Paul M., E-mail: paul.jordan@namlab.com; Simon, Daniel K.; Dirnstorfer, Ingo [Nanoelectronic Materials Laboratory gGmbH (NaMLab), Nöthnitzer Straße 64, 01187 Dresden (Germany); Mikolajick, Thomas [Nanoelectronic Materials Laboratory gGmbH (NaMLab), Nöthnitzer Straße 64, 01187 Dresden (Germany); Technische Universität Dresden, Institut für Halbleiter- und Mikrosystemtechnik, 01062 Dresden (Germany)

    2015-02-09

    A characterization method is presented which determines fixed charge and interface defect densities in passivation layers. The method is based on a bias voltage applied to an electrode on top of the passivation layer. During a voltage sweep, the effective carrier lifetime is measured by means of microwave detected photoconductivity. When the external voltage compensates the electric field of the fixed charges, the lifetime drops to a minimum value. This minimum value correlates with the flat band voltage determined in reference impedance measurements. This correlation is measured on p-type silicon passivated by Al{sub 2}O{sub 3} and Al{sub 2}O{sub 3}/HfO{sub 2} stacks with different fixed charge densities and layer thicknesses. Negative fixed charges with densities of 3.8 × 10{sup 12 }cm{sup −2} and 0.7 × 10{sup 12 }cm{sup −2} are determined for Al{sub 2}O{sub 3} layers without and with an ultra-thin HfO{sub 2} interface, respectively. The voltage and illumination dependencies of the effective carrier lifetime are simulated with Shockley-Read-Hall surface recombination at continuous defects with parabolic capture cross section distributions for electrons and holes. The best match with the measured data is achieved with a very low interface defect density of 1 × 10{sup 10 }eV{sup −1} cm{sup −2} for the Al{sub 2}O{sub 3} sample with the HfO{sub 2} interface.

  13. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPPs), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis; it then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plans. By this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the difficulty of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for revealing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; however, their disadvantages are completeness and complexity

  14. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences in cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems-level.

  15. An evaluation of directional analysis techniques for multidirectional, partially reflected waves .1. numerical investigations

    DEFF Research Database (Denmark)

    Ilic, C; Chadwick, A; Helm-Petersen, Jacob

    2000-01-01

    Recent studies of advanced directional analysis techniques have mainly centred on incident wave fields. In the study of coastal structures, however, partially reflective wave fields are commonly present. In the near structure field, phase locked methods can be successfully applied. In the far fie...

  16. Applying Fuzzy Logic and Data Mining Techniques in Wireless Sensor Network for Determination Residential Fire Confidence

    Directory of Open Access Journals (Sweden)

    Mirjana Maksimović

    2014-09-01

    Full Text Available The main goal of soft computing technologies (fuzzy logic, neural networks, fuzzy rule-based systems, data mining techniques…) is to find and describe the structural patterns in data in order to try to explain the connections between the data and, on their basis, create predictive or descriptive models. Integration of these technologies in sensor nodes seems to be a good idea because it can lead to significant improvements in network performance, above all reduced energy consumption and an enhanced network lifetime. The purpose of this paper is to analyze different algorithms in the case of fire confidence determination in order to see which of the methods and parameter values work best for the given problem. Hence, an analysis between different classification algorithms in a case of nominal and numerical d
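
    The sketch below shows a fuzzy fire-confidence rule of the kind the paper analyzes: triangular memberships, min for rule conjunction and max for aggregation. The membership breakpoints and rules are invented for illustration, not the paper's rule base.

      # Tiny fuzzy-logic fire-confidence sketch.
      def tri(x, a, b, c):
          """Triangular membership function peaking at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def fire_confidence(temp_c, smoke_ppm):
          temp_high = tri(temp_c, 40, 70, 100)
          temp_med = tri(temp_c, 25, 45, 65)
          smoke_high = tri(smoke_ppm, 100, 300, 500)
          # Rule 1: high temperature AND high smoke -> strong evidence of fire.
          # Rule 2: medium temperature AND high smoke -> weaker evidence.
          return max(min(temp_high, smoke_high), 0.5 * min(temp_med, smoke_high))

      print(fire_confidence(72.0, 320.0))   # ~0.9 -> high fire confidence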

  17. A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems

    Science.gov (United States)

    Abtahi, Amir-Reza; Bijari, Afsane

    2016-09-01

    In this paper, a hybrid meta-heuristic algorithm based on the imperialist competitive algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The proposed hybrid algorithm inherits the advantages of the harmony creation process of the HS algorithm to improve the exploitation phase of the ICA algorithm. In addition, the proposed hybrid algorithm uses SA to balance the exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including the genetic algorithm (GA), HS, and ICA, on several well-known benchmark instances. The comprehensive experiments and statistical analysis on standard benchmark functions confirm the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising and it can be used in several real-life engineering and management problems.

  18. APPLIED PHYTO-REMEDIATION TECHNIQUES USING HALOPHYTES FOR OIL AND BRINE SPILL SCARS

    Energy Technology Data Exchange (ETDEWEB)

    M.L. Korphage; Bruce G. Langhus; Scott Campbell

    2003-03-01

    Produced salt water from historical oil and gas production was often managed with inadequate care and unfortunate consequences. In Kansas, the production practices in the 1930's and 1940's--before statewide anti-pollution laws--were such that fluids were often produced to surface impoundments where the oil would segregate from the salt water. The oil was pumped off the pits and the salt water was able to infiltrate into the subsurface soil zones and underlying bedrock. Over the years, oil producing practices were changed so that segregation of fluids was accomplished in steel tanks and salt water was isolated from the natural environment. But before that could happen, significant areas of the state were scarred by salt water. These areas are now in need of economical remediation. Remediation of salt-scarred land can be facilitated with soil amendments, land management, and selection of appropriate salt-tolerant plants. Current research on the salt scars around the old Leon Waterflood, in Butler County, Kansas, shows the relative efficiency of remediation options. Based upon these research findings, it is possible to recommend cost-efficient remediation techniques for slight, medium, and heavy salt water damage to soil. Slight salt damage includes soils with Electrical Conductivity (EC) values of 4.0 mS/cm or less. Operators can treat these soils with sufficient amounts of gypsum, install irrigation systems, and till the soil. Appropriate plants can be introduced via transplants or seeding. Medium salt damage includes soils with EC values between 4.0 and 16 mS/cm. Operators will add amendments of gypsum, till the soil, and arrange for irrigation. Some particularly salt-tolerant plants can be added, but most planting ought to be reserved until the second season of remediation. Severe salt damage includes soil with EC values in excess of 16 mS/cm. Operators will add at least part of the gypsum required, till the soil, and arrange for irrigation. The following

  19. Lipase immobilized by different techniques on various support materials applied in oil hydrolysis

    Directory of Open Access Journals (Sweden)

    VILMA MINOVSKA

    2005-04-01

    Full Text Available Batch hydrolysis of olive oil was performed by Candida rugosa lipase immobilized on Amberlite IRC-50 and Al2O3. These two supports were selected out of 16 carriers: inorganic materials (sand, silica gel, infusorial earth, Al2O3), inorganic salts (CaCO3, CaSO4), ion-exchange resins (Amberlite IRC-50 and IR-4B, Dowex 2X8), a natural resin (colophony), a natural biopolymer (sodium alginate), synthetic polymers (polypropylene, polyethylene) and zeolites. Lipase immobilization was carried out by simple adsorption, adsorption followed by cross-linking, adsorption on ion-exchange resins, combined adsorption and precipitation, pure precipitation and gel entrapment. The suitability of the supports and techniques for the immobilization of lipase was evaluated by estimating the enzyme activity, protein loading, immobilization efficiency and reusability of the immobilizates. Most of the immobilizates exhibited either a low enzyme activity or difficulties during the hydrolytic reaction. Only those prepared by ionic adsorption on Amberlite IRC-50 and by combined adsorption and precipitation on Al2O3 showed better activity, 2000 and 430 U/g support, respectively, and demonstrated satisfactory behavior when used repeatedly. The hydrolysis was studied as a function of several parameters: surfactant concentration, enzyme concentration, pH and temperature. The immobilized preparation with Amberlite IRC-50 was stable and active over the whole range of pH (4 to 9) and temperature (20 to 50 °C), demonstrating a 99% degree of hydrolysis. In repeated usage, it was stable and active, having a half-life of 16 batches, which corresponds to an operation time of 384 h. Its storage stability was remarkable too, since after 9 months it had lost only 25% of its initial activity. The immobilizate with Al2O3 was less stable and less active. At optimal environmental conditions, the degree of hydrolysis did not exceed 79%. In repeated usage, after the fourth batch, the degree of

  20. Statistical Mechanics Ideas and Techniques Applied to Selected Problems in Ecology

    Directory of Open Access Journals (Sweden)

    Hugo Fort

    2013-11-01

    Full Text Available Ecosystem dynamics provides an interesting arena for the application of a plethora of concepts and techniques from statistical mechanics. Here I review three examples, each corresponding to an important problem in ecology. First, I start with an analytical derivation of the clumpy patterns in species relative abundances (SRA) empirically observed in several ecological communities involving a high number n of species, a phenomenon which has puzzled ecologists for decades. An interesting point is that this derivation uses results obtained from a statistical mechanics model for ferromagnets. Second, going beyond the mean field approximation, I study the spatial version of a popular ecological model involving just one species representing vegetation. The goal is to address the phenomenon of catastrophic shifts--gradual cumulative variations in some control parameter that suddenly lead to an abrupt change in the system--illustrating it by means of the process of desertification of arid lands. The focus is on the aggregation processes and the effects of diffusion that, combined, lead to the formation of non-trivial spatial vegetation patterns. It is shown that different quantities--like the variance, the two-point correlation function and the patchiness--may serve as early warnings for the desertification of arid lands. Remarkably, at the onset of a desertification transition the distribution of vegetation patches exhibits the scale invariance typical of many physical systems in the vicinity of a phase transition. I comment on similarities of and differences between these catastrophic shifts and paradigmatic thermodynamic phase transitions like the liquid-vapor change of state for a fluid. Third, I analyze the case of many species interacting in space. I choose tropical forests, which are mega-diverse ecosystems that exhibit remarkable dynamics. Therefore these ecosystems represent a research paradigm both for studies of complex systems dynamics as well as to

  1. Video analysis applied to volleyball didactics to improve sport skills

    OpenAIRE

    Raiola, Gaetano; Parisi, Fabio; Giugno, Ylenia; Di Tore, Pio Alfredo

    2013-01-01

    The feedback method is increasingly used in learning new skills and improving performance. "Recent research, however, showed that the more objective and quantitative the feedback is, the greater its effect on performance". Video analysis, which is the analysis of sports performance by watching video, is used primarily to quantify athletes' performance through notational analysis. It may be useful to combine the quantitative and qualitative analysis of the single ges...

  2. Microwave digestion techniques applied to determination of boron by ICP-AES in BNCT program

    International Nuclear Information System (INIS)

    Recently, boron neutron capture therapy (BNCT) has emerged as an interesting option for the treatment of some kinds of tumors where established therapies show no success. A molecular boronated species, enriched in 10B, is administered to the subject; it localizes in malignant tissues depending on the kind of tumor and its localization. Therefore, a very important aspect of BNCT research is the detection of boron at trace or ultra-trace levels precisely and accurately. This is extremely necessary as boronated species localize not only in tumoral tissue but also in liver, kidney, spleen, skin and membranes. Thus, before testing a boronated species, it is mandatory to determine its biodistribution in a statistically meaningful population, which entails managing a great number of samples. On the other hand, it is necessary to predict exactly when to begin the irradiation and to determine the magnitude of radiation needed to obtain the desired radiological dose for a specified mean boron concentration. This involves the determination of boron in whole blood, which is related to the boron concentration in the tumor under treatment. The methodology selected for the analysis of boron in whole blood and tissues must have certain characteristics: it must not depend on the chemical form of boron, and it has to be fast and capable of determining boron accurately and precisely over a wide range of concentrations. The design and validation of experimental models involving animals in BNCT studies and the determination of boron in the blood of animals and subjects upon treatment require reliable analytical procedures to determine boron quantitatively in those biological materials. Inductively coupled plasma atomic emission spectrometry (ICP-AES) using pneumatic nebulization is one of the most promising methods for boron analysis, but the sample must be liquid and have a low solid concentration. In our case, biological tissues and blood, it is mandatory to mineralize and/or dilute samples

  3. Maximizing setup accuracy using portal images as applied to a conformal boost technique for prostatic cancer

    Energy Technology Data Exchange (ETDEWEB)

    Bijhold, J.; Lebesque, J.V.; Hart, A.A.M.; Vijlbrief, R.E. (Nederlands Kanker Inst. ' Antoni van Leeuwenhoekhuis' , Amsterdam (Netherlands))

    1992-08-01

    The design procedure for a patient setup verification protocol based upon frequent digital acquisition of portal images is demonstrated with an application to conformal prostatic boost fields. The protocol aims at the elimination of large systematic deviations in the patient setup and includes decision rules which indicate when correction of the patient setup is needed. The decision rules were derived from the results of a theoretical and quantitative analysis of patient setup variations measured in three pelvic fields (one anterior-posterior and two lateral fields) over 105 fractions for nine patients. Deviations in patient positioning, derived from one field, were quantified as two-dimensional (2-D) displacement vectors in the plane perpendicular to the beam axis by alignment of anatomical features in the portal and the simulator image. The magnitude of the overall setup variations along the anterior-posterior, superior-inferior and lateral directions varied between 2.6 and 3 mm (1 S.D.). In addition, intra-treatment variations appeared to be predictable, which was a prerequisite for the development of the decision rules. The 2-D setup deviations measured in the three fields of one fraction were strongly correlated, and a 3-D displacement vector was calculated. Utilization of this 3-D vector in a setup verification protocol may lead to an early detection of systematic setup deviations. (author). 19 refs., 5 figs., 3 tabs.

  4. Image Processing Techniques applied to Liquid Argon Time Projection Chamber Data

    Science.gov (United States)

    Esquivel, Jessica; MicroBooNE Collaboration

    2015-04-01

    Large scale Liquid Argon Time Projection Chambers (LArTPCs), like MicroBooNE, offer new ways to study neutrino cross sections and neutrino oscillations. The data from these LArTPCs are very detailed images of charged particles passing through the detector. A plethora of hit finding, cluster finding and tracking algorithms have been implemented to process data coming from MicroBooNE, but it is still possible that particle tracks that are easily visible by eye are being missed during data processing. Because the human eye can find particle tracks that data processing sometimes misses, using image processing algorithms which emulate the human eye in conjunction with the already implemented algorithms could be beneficial. In particular, edge detection algorithms could be useful due to the fact that tracks will often have well-defined deposited energy along straight lines. This talk will cover preliminary data processed with edge detection algorithms, and a discussion of the potential benefits of this approach to LArTPC data analysis. On behalf of the MicroBooNE Collaboration.
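
    The sketch below applies a Sobel gradient to a toy wire-versus-time charge image; straight ionization tracks produce strong, aligned gradient responses. It is illustrative only and is not MicroBooNE code.

      # Sobel edge detection on a toy LArTPC-like event image.
      import numpy as np
      from scipy.ndimage import convolve

      def sobel_edges(image):
          kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
          gx = convolve(image, kx)
          gy = convolve(image, kx.T)
          return np.hypot(gx, gy)              # gradient magnitude

      rng = np.random.default_rng(5)
      event = rng.normal(0, 0.2, (64, 64))     # noise in a 64x64 wire/time window
      for i in range(10, 54):
          event[i, i] += 1.0                   # simulated diagonal track

      edges = sobel_edges(event)
      track_candidates = edges > edges.mean() + 3 * edges.std()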

  5. Quality control of an important petroleum refinery equipment by applying non destructive techniques

    International Nuclear Information System (INIS)

    Some tests have been carried out in order to determine the material condition of a cap, belonging to a regenerator from a catalytic cracking plant, that was to be welded. The material employed in the construction of the cap was A-285 C ASTM steel. This regenerator has worked for about forty years. The objectives of this work were to measure the in situ hardness and to perform in situ metallographic analysis of the cap, in order to determine whether this part could be cut and then welded without deterioration of the material. If the material of the cap maintained good properties, the welding and pre- and/or post-heat treatment technologies would be prepared. After the investigations, we arrived at the following conclusions and recommendations: the studied cap steel maintains its good initial quality and can be cut and then welded, because the material has not suffered phase transformations during its time in service. Also, the levels of inclusions are low and have not affected the material. It has been recommended to begin the welding with the manual electric arc method using a cellulosic electrode (AWS E-6010, 3.25 mm diameter, 65-130 A) and then the basic electrode AWS E-7018, with a diameter of 4 mm and a current intensity between 130 and 190 A. It has also been recommended to pre-heat the parts to be welded to about 150 °C

  7. Multivariate Cross-Classification: Applying machine learning techniques to characterize abstraction in neural representations

    Directory of Open Access Journals (Sweden)

    Jonas eKaplan

    2015-03-01

    Full Text Available Here we highlight an emerging trend in the use of machine learning classifiers to test for abstraction across patterns of neural activity. When a classifier algorithm is trained on data from one cognitive context, and tested on data from another, conclusions can be drawn about the role of a given brain region in representing information that abstracts across those cognitive contexts. We call this kind of analysis Multivariate Cross-Classification (MVCC), and review several domains where it has recently made an impact. MVCC has been important in establishing correspondences among neural patterns across cognitive domains, including motor-perception matching and cross-sensory matching. It has been used to test for similarity between neural patterns evoked by perception and those generated from memory. Other work has used MVCC to investigate the similarity of representations for semantic categories across different kinds of stimulus presentation, and in the presence of different cognitive demands. We use these examples to demonstrate the power of MVCC as a tool for investigating neural abstraction and discuss some important methodological issues related to its application.
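
    The sketch below captures the MVCC logic with synthetic voxel patterns: a classifier trained on one cognitive context is scored on another, and above-chance transfer suggests an abstract, context-invariant representation. The data, classifier choice and the perception/memory labels are assumptions.

      # Train on one context, test on another (multivariate cross-classification).
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(6)
      shared = rng.normal(0, 1, (2, 50))        # one 50-voxel pattern per condition

      def make_context(n_trials, context_shift):
          labels = rng.integers(0, 2, n_trials)
          data = shared[labels] + context_shift + rng.normal(0, 1.0, (n_trials, 50))
          return data, labels

      x_percept, y_percept = make_context(200, 0.0)   # e.g. perception trials
      x_memory, y_memory = make_context(200, 0.5)     # e.g. memory retrieval trials

      clf = LogisticRegression(max_iter=1000).fit(x_percept, y_percept)
      print("cross-classification accuracy:", clf.score(x_memory, y_memory))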

  8. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  9. A Review of Temporal Aspects of Hand Gesture Analysis Applied to Discourse Analysis and Natural Conversation

    Directory of Open Access Journals (Sweden)

    Renata C. B. Madeo

    2013-08-01

    Full Text Available Lately, there has been an increasing interest in hand gesture analysis systems. Recent works have employed pattern recognition techniques and have focused on the development of systems with more natural user interfaces. These systems may use gestures to control interfaces or recognize sign language gestures, which can provide systems with multimodal interaction; or consist in multimodal tools to help psycholinguists to understand new aspects of discourse analysis and to automate laborious tasks. Gestures are characterized by several aspects, mainly by movements and sequences of postures. Since data referring to movements or sequences carry temporal information, this paper presents a literature review about temporal aspects of hand gesture analysis, focusing on applications related to natural conversation and psycholinguistic analysis, using the Systematic Literature Review methodology. In our results, we organized works according to type of analysis, methods, highlighting the use of Machine Learning techniques, and applications.

  10. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result below that obtained with conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
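
    A toy numerical sketch of the sampling idea (an invented two-unit fission matrix, not the paper's configurations): in a loosely coupled array, stratified sampling fixes each unit's share of the source sites every generation, whereas conventional multinomial sampling lets a weakly coupled unit's source fluctuate or die out by chance.

```python
# Stratified vs. conventional source sampling in a toy Monte Carlo
# power iteration for a loosely coupled two-unit array. The fission
# matrix F is hypothetical; this sketches the sampling scheme only,
# not a transport code.
import numpy as np

rng = np.random.default_rng(1)
F = np.array([[1.00, 0.02],     # F[i, j]: expected fission neutrons born
              [0.02, 0.95]])    # in unit i per source neutron in unit j

def next_generation(source_counts, stratified):
    n = source_counts.sum()
    production = F @ source_counts          # expected production per unit
    k_est = production.sum() / n            # generation k-effective estimate
    p = production / production.sum()
    if stratified:
        # Stratified: give each unit (stratum) a fixed share of the n
        # source sites, so no loosely coupled unit is lost by chance.
        counts = np.floor(p * n).astype(int)
        counts[0] += int(n - counts.sum())  # hand the remainder to unit 0
    else:
        # Conventional: multinomial sampling lets a unit's source fluctuate.
        counts = rng.multinomial(n, p)
    return counts, k_est

for stratified in (False, True):
    src, ks = np.array([500, 500]), []
    for _ in range(200):
        src, k = next_generation(src, stratified)
        ks.append(k)
    label = "stratified  " if stratified else "conventional"
    print(label, "k-eff ~", round(float(np.mean(ks[50:])), 4),
          " generation-to-generation std:", round(float(np.std(ks[50:])), 5))
```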

  11. An Analysis of the Economy Principle Applied in Cyber Language

    Institute of Scientific and Technical Information of China (English)

    肖钰敏

    2015-01-01

    With the development of network technology, cyber language, a new social dialect, has come to be widely used in our lives. The author analyzes how the economy principle is applied in cyber language from three aspects: word-formation, syntax and non-linguistic symbols. The author also collects, summarizes and analyzes relevant language material to demonstrate that the economy principle really operates in chat rooms and to explain why it is applied so widely in cyberspace.

  12. Fielding the magnetically applied pressure-shear technique on the Z accelerator (completion report for MRT 4519).

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, C. Scott; Haill, Thomas A.; Dalton, Devon Gardner; Rovang, Dean Curtis; Lamppa, Derek C.

    2013-09-01

    The recently developed Magnetically Applied Pressure-Shear (MAPS) experimental technique to measure material shear strength at high pressures on magneto-hydrodynamic (MHD) drive pulsed power platforms was fielded on August 16, 2013 on shot Z2544 utilizing hardware set A0283A. Several technical and engineering challenges were overcome in the process leading to the attempt to measure the dynamic strength of NNSA Ta at 50 GPa. The MAPS technique relies on the ability to apply an external magnetic field properly aligned and time correlated with the MHD pulse. The load design had to be modified to accommodate the external field coils and additional support was required to manage stresses from the pulsed magnets. Further, this represents the first time transverse velocity interferometry has been applied to diagnose a shot at Z. All subsystems performed well with only minor issues related to the new feed design which can be easily addressed by modifying the current pulse shape. Despite the success of each new component, the experiment failed to measure strength in the samples due to spallation failure, most likely in the diamond anvils. To address this issue, hydrocode simulations are being used to evaluate a modified design using LiF windows to minimize tension in the diamond and prevent spall. Another option to eliminate the diamond material from the experiment is also being investigated.

  13. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the techniques reviewed are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagrams, Markov models, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. A survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in other countries. The results of this survey will be applied to improve the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.
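
    For two of the surveyed techniques the arithmetic is simple enough to show directly; a tiny sketch with invented, independent component failure probabilities (not values from the report):

```python
# Fault-tree top-event probability and a series-parallel reliability
# block diagram for a toy inspection robot; probabilities are invented
# and components are assumed independent.
p_sensor, p_cpu, p_motor = 0.01, 0.005, 0.02   # per-mission failure probabilities

# Fault tree: TOP = CPU fails OR (both redundant sensors fail).
p_top = 1 - (1 - p_cpu) * (1 - p_sensor ** 2)
print("fault-tree top-event probability:", round(p_top, 6))

# Reliability block diagram: two sensors in parallel, in series with
# the CPU and the drive motor.
r_system = (1 - p_sensor ** 2) * (1 - p_cpu) * (1 - p_motor)
print("system reliability:", round(r_system, 6))
```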

  14. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    International Nuclear Information System (INIS)

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the techniques reviewed are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagrams, Markov models, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. A survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in other countries. The results of this survey will be applied to improve the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  15. An applied general equilibrium model for Dutch agribusiness policy analysis.

    NARCIS (Netherlands)

    Peerlings, J.H.M.

    1993-01-01

    The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest. The model is fairly general and could be used to analyse a great variety of agricultural policy changes.

  16. System Analysis Applying to Talent Resource Development Research

    Institute of Scientific and Technical Information of China (English)

    WANG Peng-tao; ZHENG Gang

    2001-01-01

    In talent resource development research, the central questions in talent resource forecasting and optimization concern the structure of the talent resource, the numbers required, and talent quality. The article establishes a factor reconstruction analysis forecast and a talent quality model based on system reconstruction analysis, determining the most effective factor levels in the system, following the method presented by G. J. Klir and B. Jones, and performs a dynamic analysis of a worked example.

  17. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to readers who are unfamiliar with the field and may have the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing the dis...

  18. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of a SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization, enabling it to adapt its strengths to opportunities, minimize risks and eliminate weaknesses.

  19. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of a SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization, enabling it to adapt its strengths to opportunities, minimize risks and eliminate weaknesses.

  20. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult its technique. The fouetté at 720° is one of the most difficult forms of the fouetté: its execution depends on highly refined technique throughout the performer's rotation. Performing this element requires not only good physical condition but also mastery of the correct technique by the dancer. On the basis of the corresponding kinematic theory, this study offers a qualitative analysis and quantitative assessment of fouettés at 720° performed by leading Chinese dancers. The analysis used the stereoscopic image method together with theoretical analysis.

  1. 分子生物学技术在瘤胃厌氧真菌分类多样性与定量分析研究中的应用%Modern Molecular Techniques Applied in Microbial Diversity and Quantitative Analysis of Anaerobic Fungi in the Rumen: A Review

    Institute of Scientific and Technical Information of China (English)

    沈博通; 曹阳春; 杨红建

    2011-01-01

    Anaerobic fungi play a significant role in promoting the degradation of fibrous feed in the rumen. Research advances of recent decades are reviewed concerning the biological taxonomy, classification and life cycle of rumen fungi. The rapid development and application of molecular biology methods, including rDNA sequence analysis, restriction fragment length polymorphism (RFLP) and automated ribosomal intergenic spacer analysis (ARISA), have opened a new pathway for the study of rumen fungal diversity. Owing to their high sensitivity and specificity, one or several of these techniques have been integrated into the taxonomic classification and molecular phylogenetic analysis of rumen fungi, and quantitative analysis of fungal biomass by the real-time polymerase chain reaction (PCR) technique has been successfully used to monitor anaerobic fungi in the rumen micro-ecological environment.
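
    A sketch of how real-time PCR quantification of fungal biomass typically works, assuming the common standard-curve approach (the dilution series and Ct values are invented; the review itself gives no such numbers):

```python
# Absolute quantification from a real-time PCR standard curve: Ct is
# linear in log10(template copies); the fitted line converts a sample's
# Ct into copy number. All values are illustrative.
import numpy as np

log_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])     # standard dilution series
ct_standards = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

slope, intercept = np.polyfit(log_copies, ct_standards, 1)
efficiency = 10 ** (-1 / slope) - 1                   # 1.0 means 100 % per cycle
ct_sample = 24.9                                      # rumen-sample measurement
copies = 10 ** ((ct_sample - intercept) / slope)
print(f"amplification efficiency ~ {efficiency:.2f}")
print(f"estimated target copies in sample ~ {copies:.2e}")
```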

  2. Analysis of OFDM Applied to Powerline High Speed Digital Communication

    Institute of Scientific and Technical Information of China (English)

    ZHUANG Jian; YANG Gong-xu

    2003-01-01

    The low-voltage powerline is becoming a powerful solution for home networking, building automation, and internet access as a result of its wide distribution, easy access and low maintenance. The powerline channel is very complicated because it is an open network. This article analyses the characteristics of the powerline channel, introduces the basics of OFDM (Orthogonal Frequency Division Multiplexing), and studies the application of OFDM to powerline high-speed digital communication.
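
    A minimal end-to-end OFDM sketch (numpy only; the subcarrier count, cyclic-prefix length and two-tap channel are illustrative assumptions, not parameters from the article): QPSK symbols are carried on orthogonal subcarriers via an IFFT, a cyclic prefix absorbs the multipath, and a one-tap equalizer per subcarrier recovers the bits.

```python
# One OFDM symbol over a toy multipath (powerline-like) channel.
import numpy as np

rng = np.random.default_rng(0)
N, cp = 64, 16                                # subcarriers, cyclic-prefix length

bits = rng.integers(0, 2, 2 * N)
qpsk = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

tx = np.fft.ifft(qpsk) * np.sqrt(N)           # modulate onto subcarriers
tx = np.concatenate([tx[-cp:], tx])           # prepend cyclic prefix

h = np.array([1.0, 0.4])                      # two-tap multipath channel
noise = 0.01 * (rng.normal(size=len(tx) + 1) + 1j * rng.normal(size=len(tx) + 1))
rx = np.convolve(tx, h) + noise               # channel plus receiver noise
rx = rx[cp:cp + N]                            # drop prefix, keep one symbol

Y = np.fft.fft(rx) / np.sqrt(N)
H = np.fft.fft(h, N)                          # channel frequency response
eq = Y / H                                    # one-tap per-subcarrier equalizer

bits_hat = np.empty(2 * N, dtype=int)
bits_hat[0::2] = (eq.real < 0).astype(int)    # invert the QPSK mapping
bits_hat[1::2] = (eq.imag < 0).astype(int)
print("bit errors:", int(np.sum(bits_hat != bits)))
```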

  3. An applied general equilibrium model for Dutch agribusiness policy analysis.

    OpenAIRE

    Peerlings, J.H.M.

    1993-01-01

    The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest. The model is fairly general and could be used to analyse a great variety of agricultural policy changes. However, generality requires that the model should be adapted and extended for special research questions. This...

  4. Multisectorial models applied to the environment: an analysis for catalonia

    OpenAIRE

    Pié Dols, Laia

    2010-01-01

    The objective of this doctoral thesis is to apply the different multisectorial models available to analyse the impact that the introduction of policies designed to reduce emissions of greenhouse gases and save energy would have on the Catalan economy, while at the same time improving the environmental competitiveness of both individual companies and the economy as a whole. For the purposes of this thesis I have analysed the six greenhouse gases that are regulated by the K...

  5. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis with fuzzy concepts to manage the risks in a software project and to mitigate risk through software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...
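
    The abstract does not spell out the fuzzy regression machinery; one standard realisation is Tanaka-style fuzzy linear regression, sketched below under that assumption (data invented): fit triangular fuzzy coefficients (centre, spread) whose intervals cover every observation, minimising total spread with a linear program.

```python
# Tanaka-style fuzzy linear regression via a linear program (scipy).
import numpy as np
from scipy.optimize import linprog

X = np.array([[1.0, 2.0], [1.0, 3.0], [1.0, 4.0], [1.0, 5.0]])  # intercept + factor
y = np.array([3.1, 4.2, 4.8, 6.1])       # e.g. observed project risk outcome
n, m = X.shape
h = 0.0                                   # required inclusion level in [0, 1)

# Variables z = [a (centres), c (spreads)]; minimise total weighted spread.
cost = np.concatenate([np.zeros(m), np.abs(X).sum(axis=0)])
A_ub = np.vstack([np.hstack([-X, -(1 - h) * np.abs(X)]),   # X a + (1-h)|X| c >= y
                  np.hstack([X, -(1 - h) * np.abs(X)])])   # X a - (1-h)|X| c <= y
b_ub = np.concatenate([-y, y])
bounds = [(None, None)] * m + [(0, None)] * m              # spreads non-negative
res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)

a, c = res.x[:m], res.x[m:]
print("fuzzy coefficients: centres", a.round(3), "spreads", c.round(3))
```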

  6. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  7. Different spectrophotometric methods applied for the analysis of binary mixture of flucloxacillin and amoxicillin: A comparative study

    Science.gov (United States)

    Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed

    2016-05-01

    Three different spectrophotometric methods were applied for the quantitative analysis of flucloxacillin and amoxicillin in their binary mixture, namely, ratio subtraction, absorbance subtraction and amplitude modulation. A comparative study was carried out, listing the advantages and the disadvantages of each method. All the methods were validated according to the ICH guidelines, and the accuracy, precision and repeatability obtained were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of flucloxacillin and amoxicillin in their binary mixtures.
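
    Of the three methods, ratio subtraction is easy to demonstrate numerically. The sketch below uses synthetic Gaussian "spectra" rather than real flucloxacillin/amoxicillin data: dividing the mixture by the spectrum of the component whose band extends further gives a constant plateau where only that component absorbs; subtracting the plateau and multiplying back recovers the other component's spectrum.

```python
# Ratio-subtraction sketch on synthetic spectra (numpy only).
import numpy as np

wl = np.linspace(220, 320, 501)                    # wavelength grid, nm
gauss = lambda centre, width: np.exp(-((wl - centre) / width) ** 2)

spec_x = gauss(280, 25)        # component X: band extends to long wavelengths
spec_y = gauss(240, 10)        # component Y: absorbs only at short wavelengths
mixture = 0.6 * spec_x + 0.4 * spec_y              # concentrations 0.6 and 0.4

ratio = mixture / spec_x                           # = 0.6 + 0.4 * spec_y/spec_x
plateau = ratio[wl > 310].mean()                   # region where Y is silent
recovered_y = (ratio - plateau) * spec_x           # should equal 0.4 * spec_y

print("estimated X contribution:", round(float(plateau), 3))
print("max error in recovered Y spectrum:",
      round(float(np.abs(recovered_y - 0.4 * spec_y).max()), 5))
```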

  8. Applying Galois compliance for data analysis in information systems

    Directory of Open Access Journals (Sweden)

    Kozlov Sergey

    2016-03-01

    Full Text Available The article deals with data analysis in information systems. The author discusses the possibility of using Galois compliance to identify characteristics of the information system structure, and reveals the specifics of applying Galois compliance to the analysis of information system content with the use of invariants from graph theory. Aspects of introducing the mathematical apparatus of Galois compliance into research on the interrelations between elements of an adaptive training information system for individual testing are analyzed.
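
    Reading "Galois compliance" as the Galois connection underlying formal concept analysis (an assumption on our part; the objects and attributes below are invented), the two derivation maps and their closure look like this:

```python
# Galois connection of a small formal context (objects x attributes).
context = {
    "test_item_1": {"adaptive", "timed"},
    "test_item_2": {"adaptive", "open_ended"},
    "test_item_3": {"timed"},
}
all_attributes = set().union(*context.values())

def common_attributes(objs):
    """Derivation A -> A': attributes shared by all objects in A."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(all_attributes)

def common_objects(attrs):
    """Derivation B -> B': objects having all attributes in B."""
    return {o for o, a in context.items() if attrs <= a}

# The pair of maps forms a Galois connection; (A'', A') is a formal concept.
for objs in ({"test_item_1"}, {"test_item_1", "test_item_2"}):
    attrs = common_attributes(objs)
    closure = common_objects(attrs)
    print(sorted(objs), "->", sorted(attrs), "| closure:", sorted(closure))
```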

  9. Applied network security monitoring collection, detection, and analysis

    CERN Document Server

    Sanders, Chris

    2013-01-01

    Applied Network Security Monitoring is the essential guide to becoming an NSM analyst from the ground up. This book takes a fundamental approach to NSM, complete with dozens of real-world examples that teach you the key concepts of NSM. Network security monitoring is based on the principle that prevention eventually fails. In the current threat landscape, no matter how much you try, motivated attackers will eventually find their way into your network. At that point, it is your ability to detect and respond to that intrusion that can be the difference between a small incident and a major di

  10. Applying a nonlinear, pitch-catch, ultrasonic technique for the detection of kissing bonds in friction stir welds.

    Science.gov (United States)

    Delrue, Steven; Tabatabaeipour, Morteza; Hettler, Jan; Van Den Abeele, Koen

    2016-05-01

    Friction stir welding (FSW) is a promising technology for the joining of aluminum alloys and other metallic admixtures that are hard to weld by conventional fusion welding. Although FSW generally provides better fatigue properties than traditional fusion welding methods, fatigue properties are still significantly lower than for the base material. Apart from voids, kissing bonds for instance, in the form of closed cracks propagating along the interface of the stirred and heat affected zone, are inherent features of the weld and can be considered as one of the main causes of a reduced fatigue life of FSW in comparison to the base material. The main problem with kissing bond defects in FSW, is that they currently are very difficult to detect using existing NDT methods. Besides, in most cases, the defects are not directly accessible from the exposed surface. Therefore, new techniques capable of detecting small kissing bond flaws need to be introduced. In the present paper, a novel and practical approach is introduced based on a nonlinear, single-sided, ultrasonic technique. The proposed inspection technique uses two single element transducers, with the first transducer transmitting an ultrasonic signal that focuses the ultrasonic waves at the bottom side of the sample where cracks are most likely to occur. The large amount of energy at the focus activates the kissing bond, resulting in the generation of nonlinear features in the wave propagation. These nonlinear features are then captured by the second transducer operating in pitch-catch mode, and are analyzed, using pulse inversion, to reveal the presence of a defect. The performance of the proposed nonlinear, pitch-catch technique, is first illustrated using a numerical study of an aluminum sample containing simple, vertically oriented, incipient cracks. Later, the proposed technique is also applied experimentally on a real-life friction stir welded butt joint containing a kissing bond flaw. PMID:26921559
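
    The pulse-inversion step at the heart of the detection scheme is easy to illustrate numerically. In this toy model (invented response and parameters, not the paper's FSW data), a linear response cancels when the echoes of a pulse and its inverted copy are summed, while even-order nonlinearity of the kind a kissing bond produces survives:

```python
# Pulse-inversion sketch: sum the responses to a pulse and its negated
# copy; only even-order (contact-type) nonlinearity remains.
import numpy as np

fs, f0 = 50e6, 2e6                       # sample rate, centre frequency (assumed)
t = np.arange(0.0, 10e-6, 1.0 / fs)
pulse = np.sin(2 * np.pi * f0 * t) * np.exp(-((t - 2e-6) / 0.8e-6) ** 2)

def response(x, beta):
    """Toy material response: linear term plus quadratic distortion beta*x^2."""
    return x + beta * x ** 2

for beta, label in ((0.0, "intact weld   "), (0.3, "kissing bond  ")):
    residual = response(pulse, beta) + response(-pulse, beta)   # pulse inversion
    print(label, "residual energy:", round(float(np.sum(residual ** 2)), 6))
```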

  11. Applying a nonlinear, pitch-catch, ultrasonic technique for the detection of kissing bonds in friction stir welds.

    Science.gov (United States)

    Delrue, Steven; Tabatabaeipour, Morteza; Hettler, Jan; Van Den Abeele, Koen

    2016-05-01

    Friction stir welding (FSW) is a promising technology for the joining of aluminum alloys and other metallic admixtures that are hard to weld by conventional fusion welding. Although FSW generally provides better fatigue properties than traditional fusion welding methods, fatigue properties are still significantly lower than for the base material. Apart from voids, kissing bonds for instance, in the form of closed cracks propagating along the interface of the stirred and heat affected zone, are inherent features of the weld and can be considered as one of the main causes of a reduced fatigue life of FSW in comparison to the base material. The main problem with kissing bond defects in FSW, is that they currently are very difficult to detect using existing NDT methods. Besides, in most cases, the defects are not directly accessible from the exposed surface. Therefore, new techniques capable of detecting small kissing bond flaws need to be introduced. In the present paper, a novel and practical approach is introduced based on a nonlinear, single-sided, ultrasonic technique. The proposed inspection technique uses two single element transducers, with the first transducer transmitting an ultrasonic signal that focuses the ultrasonic waves at the bottom side of the sample where cracks are most likely to occur. The large amount of energy at the focus activates the kissing bond, resulting in the generation of nonlinear features in the wave propagation. These nonlinear features are then captured by the second transducer operating in pitch-catch mode, and are analyzed, using pulse inversion, to reveal the presence of a defect. The performance of the proposed nonlinear, pitch-catch technique, is first illustrated using a numerical study of an aluminum sample containing simple, vertically oriented, incipient cracks. Later, the proposed technique is also applied experimentally on a real-life friction stir welded butt joint containing a kissing bond flaw.

  12. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, data mining methods have disadvantages as well as advantages, so it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to examine how suitable data mining methods can improve on conventional methods. Moreover, in an experiment, association rule mining is employed to extract rules for trusted customers using sales data from the supermarket industry.
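
    A toy association-rule pass (transactions and thresholds invented for the sketch) showing the support and confidence measures such an analysis relies on:

```python
# Support/confidence association rules over toy supermarket transactions.
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"butter", "milk"},
    {"bread", "butter"},
    {"bread", "butter", "milk"},
]
min_support, min_confidence = 0.4, 0.7
n = len(transactions)

def support(itemset):
    return sum(itemset <= t for t in transactions) / n

items = sorted(set().union(*transactions))
for size in (2, 3):
    for itemset in map(frozenset, combinations(items, size)):
        if support(itemset) < min_support:
            continue                      # Apriori-style pruning by support
        for k in range(1, size):
            for lhs in map(frozenset, combinations(sorted(itemset), k)):
                conf = support(itemset) / support(lhs)
                if conf >= min_confidence:
                    print(f"{set(lhs)} -> {set(itemset - lhs)} "
                          f"(support={support(itemset):.2f}, conf={conf:.2f})")
```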

  13. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
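
    A sketch of the normal information diffusion idea under stated assumptions (sample values, grid and diffusion coefficient are all invented; the paper's Shanghai data are not reproduced): each crisp observation spreads its unit of information over a grid of monitoring points with a Gaussian kernel, yielding a smooth risk estimate from a small sample.

```python
# Normal information diffusion for small-sample risk estimation (numpy only).
import numpy as np

obs = np.array([3.0, 4.5, 5.0, 7.0])      # e.g. observed fire-loss magnitudes
u = np.linspace(0.0, 10.0, 21)            # discrete monitoring points
h = 1.2                                   # diffusion coefficient (assumed)

# Diffuse each observation so it contributes one unit of information in total.
f = np.exp(-((obs[:, None] - u[None, :]) ** 2) / (2 * h ** 2))
f /= f.sum(axis=1, keepdims=True)

q = f.sum(axis=0)                         # information gathered at each point
p = q / q.sum()                           # estimated probability distribution
print("P(loss >= 6) ~", round(float(p[u >= 6.0].sum()), 3))
```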

  14. DATA MINING TECHNIQUES: A SOURCE FOR CONSUMER BEHAVIOR ANALYSIS

    Directory of Open Access Journals (Sweden)

    Abhijit Raorane

    2011-09-01

    Full Text Available Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, data mining methods have disadvantages as well as advantages, so it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to examine how suitable data mining methods can improve on conventional methods. Moreover, in an experiment, association rule mining is employed to extract rules for trusted customers using sales data from the supermarket industry.

  15. Lutz's spontaneous sedimentation technique and the paleoparasitological analysis of sambaqui (shell mound) sediments

    Directory of Open Access Journals (Sweden)

    Morgana Camacho

    2013-04-01

    Full Text Available Parasite findings in sambaquis (shell mounds) are scarce. Although 121 shell mound samples had previously been analysed in our laboratory, we only recently obtained the first positive results. In the sambaqui of Guapi, Rio de Janeiro, Brazil, paleoparasitological analysis was performed on sediment samples collected from various archaeological layers, including the superficial layer as a control. Eggs of Acanthocephala, Ascaridoidea and Heterakoidea were found in the archaeological layers. We applied various techniques and concluded that Lutz's spontaneous sedimentation technique is effective for concentrating parasite eggs in sambaqui soil for microscopic analysis.

  16. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on, and a brief review of, other techniques developed around the world for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are discussed in brief. Simple case s...

  17. Applying real options analysis to assess cleaner energy development strategies

    International Nuclear Information System (INIS)

    The energy industry, which accounts for the largest portion of CO2 emissions, is facing the issue of compliance with national clean energy policy. The methodology for evaluating energy mix policy is crucial because of the lead times inherent in investment in power generation facilities and the uncertainty of future electricity demand. In this paper, a modified binomial model based on sequential compound options, which can account for the lead time and the uncertainty as a whole, is established, and a numerical example evaluating the optional strategies and the strategic value of a cleaner energy policy is presented. It is found that the optimal decision at some nodes in the binomial tree is path dependent, which differs from the standard sequential compound option model with a lead-time or time-lag concept. The proposed modified binomial sequential compound real options model can be generalized and extensively applied to general decision problems that involve the long lead times of many government policies as well as capital-intensive investments. - Highlights: → Introducing a flexible strategic management approach for government policy making. → Developing a modified binomial real options model based on sequential compound options. → Proposing an innovative model for managing long-lead-time policy. → Applying the model to evaluate options for various cleaner energy strategy scenarios.
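
    As background for the paper's sequential compound model, the basic binomial real-options building block is easy to sketch. The figures below (project value, volatility, cost, rates) are invented for illustration; the paper's lead-time and compound-option extensions are not reproduced:

```python
# Binomial lattice for an American-style option to invest (defer) in a
# cleaner-energy project; all figures are invented for illustration.
import numpy as np

V0, K = 100.0, 95.0            # PV of project cash flows; investment cost
r, sigma, T, n = 0.04, 0.25, 3.0, 3
dt = T / n
u = np.exp(sigma * np.sqrt(dt))
d = 1.0 / u
p = (np.exp(r * dt) - d) / (u - d)        # risk-neutral up probability

# Terminal project values (highest first) and exercise payoffs.
V = V0 * u ** np.arange(n, -1, -1) * d ** np.arange(0, n + 1)
C = np.maximum(V - K, 0.0)

# Roll back: at each node, invest now or keep the option alive.
for _ in range(n):
    V = V[:-1] / u                        # project values one step earlier
    cont = np.exp(-r * dt) * (p * C[:-1] + (1 - p) * C[1:])
    C = np.maximum(V - K, cont)

print("static NPV of investing today:", V0 - K)
print("value with the option to defer:", round(float(C[0]), 2))
```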

  18. The colour analysis method applied to homogeneous rocks

    Science.gov (United States)

    Halász, Amadé; Halmai, Ákos

    2015-12-01

    Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.
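
    A sketch of the downcore colour-profile idea under stated assumptions (the "scan" is synthetic; a real study would load the core image): average each scanline to one RGB triple and track a redness index with depth, the kind of signal that can then be compared with natural gamma logs.

```python
# Downcore colour profile from a (synthetic) core scan image.
import numpy as np

rng = np.random.default_rng(0)
depth_px, width_px = 400, 60
base = np.array([140.0, 80.0, 60.0])          # brownish-red RGB
cycle = 25 * np.sin(np.linspace(0, 6 * np.pi, depth_px))[:, None, None]
img = (base + cycle * np.array([1.0, -0.3, -0.3])
       + rng.normal(0, 5, (depth_px, width_px, 3)))   # fake scanned core

profile = img.mean(axis=1)                    # one mean RGB triple per depth pixel
redness = profile[:, 0] / profile.sum(axis=1) # red-chromaticity index downcore
print(f"redness range along core: {redness.min():.3f} - {redness.max():.3f}")
```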

  19. Dynamic analysis of large structures by modal synthesis techniques.

    Science.gov (United States)

    Hurty, W. C.; Hart, G. C.; Collins, J. D.

    1971-01-01

    Several criteria that may be used to evaluate the merits of some of the existing techniques for the dynamic analysis of large structures which involve division into substructures or components are examined. These techniques make use of component displacement modes to synthesize global systems of generalized coordinates and, for that reason, they have come to be known as modal synthesis or component mode methods. Two techniques have been found to be particularly useful: the modal synthesis method with fixed attachment modes, and the modal synthesis method with free attachment modes. These two methods are treated in detail, and general flow charts are presented for guidance in computer programming.
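
    A compact numerical sketch of fixed-interface modal synthesis (the Craig-Bampton variant, chosen here for brevity; the six-DOF spring-mass chain and the number of kept modes are invented): each substructure is reduced to a few fixed-interface modes plus a static constraint mode, and the reduced components are assembled at the shared interface DOF.

```python
# Fixed-interface component mode synthesis (Craig-Bampton) on a toy
# 6-DOF spring-mass chain split into two substructures. Requires scipy.
import numpy as np
from scipy.linalg import eigh

def craig_bampton(M, K, b, n_modes):
    """Reduce (M, K) to n_modes fixed-interface modes + boundary DOFs b."""
    n = M.shape[0]
    i = [d for d in range(n) if d not in b]
    Kii, Kib, Mii = K[np.ix_(i, i)], K[np.ix_(i, b)], M[np.ix_(i, i)]
    _, Phi = eigh(Kii, Mii)
    Phi = Phi[:, :n_modes]                     # kept fixed-interface modes
    Psi = -np.linalg.solve(Kii, Kib)           # static constraint modes
    T = np.zeros((n, n_modes + len(b)))
    T[np.ix_(i, range(n_modes))] = Phi
    T[np.ix_(i, range(n_modes, T.shape[1]))] = Psi
    T[np.ix_(b, range(n_modes, T.shape[1]))] = np.eye(len(b))
    return T.T @ M @ T, T.T @ K @ T

m = k = 1.0
# Substructure A: ground-1-2-3; local DOF 2 is the interface (carries its mass).
KA = k * np.array([[2., -1, 0], [-1, 2, -1], [0, -1, 1]])
MA = m * np.eye(3)
# Substructure B: interface-4-5-6; local DOF 0 is the (here massless) interface.
KB = k * np.array([[1., -1, 0, 0], [-1, 2, -1, 0], [0, -1, 2, -1], [0, 0, -1, 1]])
MB = m * np.diag([0., 1, 1, 1])

MrA, KrA = craig_bampton(MA, KA, [2], 2)
MrB, KrB = craig_bampton(MB, KB, [0], 2)

# Assemble in coordinates [eta_A (2), eta_B (2), u_interface].
N = 5
Mg, Kg = np.zeros((N, N)), np.zeros((N, N))
sA, sB = [0, 1, 4], [2, 3, 4]
Mg[np.ix_(sA, sA)] += MrA; Kg[np.ix_(sA, sA)] += KrA
Mg[np.ix_(sB, sB)] += MrB; Kg[np.ix_(sB, sB)] += KrB

# Reference: the unreduced 6-DOF chain, fixed at the left end.
Kf = k * (2 * np.eye(6) - np.eye(6, k=1) - np.eye(6, k=-1)); Kf[5, 5] = k
Mf = m * np.eye(6)
print("reduced model frequencies:", np.sqrt(eigh(Kg, Mg, eigvals_only=True))[:3].round(4))
print("full model frequencies:   ", np.sqrt(eigh(Kf, Mf, eigvals_only=True))[:3].round(4))
```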

  20. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia)]; Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)]

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of such dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
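
    The dating arithmetic itself is a one-liner; a tiny sketch with invented numbers and simple quadrature error propagation (not values from the paper):

```python
# Luminescence age = accumulated (equivalent) dose / annual dose rate.
de, de_err = 45.0, 2.5    # equivalent dose, Gy
dr, dr_err = 3.2, 0.2     # annual dose rate, Gy/ka (from elemental analyses)

age = de / dr             # in ka
age_err = age * ((de_err / de) ** 2 + (dr_err / dr) ** 2) ** 0.5
print(f"age = {age:.1f} +/- {age_err:.1f} ka")
```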