WorldWideScience

Sample records for analysis techniques applied

  1. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability, because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  2. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Due to the ubiquitous nature of cyberspace and the abuse of anonymity within it, it is difficult to trace criminal identities in cybercrime investigations. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG-seeded GA-based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author-level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed features and techniques are evaluated on a real-world dataset encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), higher performance is achieved by IGAE, with a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, IGAE has been shown to be more scalable in terms of the number of authors than author-group-level methods.
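
    A minimal runnable sketch of the character n-gram idea described above, using scikit-learn; the LinearSVC stands in for the paper's SVM baseline, not the authors' IGAE ensemble, and the two-review dataset is an invented placeholder.

```python
# Character n-gram "writeprint" baseline, assuming scikit-learn is available.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Hypothetical data: one review per entry, labeled by author.
reviews = ["Great product, arrived on time and works well.",
           "Terrible battery life, would not buy again."]
authors = ["author_01", "author_02"]

model = make_pipeline(
    # Variable-length character n-grams (here 2-4) capture sub-word style.
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),
    LinearSVC(),  # baseline classifier; IGAE would add IG+GA feature selection
)
model.fit(reviews, authors)
print(model.predict(["Arrived late and the battery died within a week."]))
```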

  3. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis plays in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  4. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  5. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Mário Jorge Rodrigues Pereira da

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...
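
    The per-pixel calibration step can be sketched as a small fitting problem: for every pixel, fit a curve relating recorded intensity to known dye density, then invert it on experiment frames. Everything below (array shapes, a quadratic fit, the synthetic calibration stack) is an assumption, not the authors' procedure.

```python
import numpy as np

# calib_frames: stack of images at known uniform dye densities, shape (k, H, W)
def fit_pixel_calibration(calib_frames, densities, deg=2):
    k, h, w = calib_frames.shape
    intens = calib_frames.reshape(k, -1)                  # (k, H*W)
    # One polynomial fit per pixel: density = p(intensity)
    coeffs = np.stack([np.polyfit(intens[:, i], densities, deg)
                       for i in range(h * w)], axis=1)
    return coeffs.reshape(deg + 1, h, w)

def intensity_to_density(frame, coeffs):
    deg = coeffs.shape[0] - 1
    out = np.zeros_like(frame, dtype=float)
    for p in range(deg + 1):                              # polyfit: high power first
        out += coeffs[p] * frame ** (deg - p)
    return out

rng = np.random.default_rng(0)
rho = np.linspace(0.0, 30.0, 5)                           # known densities, kg/m^3
calib = 0.2 + 0.02 * rho[:, None, None] + rng.normal(0, 0.01, (5, 4, 4))
c = fit_pixel_calibration(calib, rho)
print(intensity_to_density(calib[2], c).round(1))         # ~rho[2] everywhere
```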

  6. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  7. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  8. Applied ALARA techniques

    Energy Technology Data Exchange (ETDEWEB)

    Waggoner, L.O.

    1998-02-05

    The presentation focuses on some of the time-proven and new technologies being used to accomplish radiological work. These techniques can be applied at nuclear facilities to reduce radiation doses and protect the environment. The last reactor plants and processing facilities were shut down and Hanford was given a new mission to put the facilities in a safe condition, decontaminate them, and prepare them for decommissioning. The skills that were necessary to operate these facilities were different from the skills needed today to clean up Hanford. Workers were not familiar with many of the tools, equipment, and materials needed to accomplish the new mission, which includes cleanup of contaminated areas in and around all the facilities, recovery of reactor fuel from spent fuel pools, and the removal of millions of gallons of highly radioactive waste from 177 underground tanks. In addition, this work has to be done with a reduced number of workers and a smaller budget. At Hanford, facilities contain a myriad of radioactive isotopes located inside plant systems, underground tanks, and the soil. As cleanup work at Hanford began, it became obvious early on that in order to get workers to apply ALARA and use new tools and equipment to accomplish the radiological work, it was necessary to plan the work in advance and get radiological control and/or ALARA committee personnel involved early in the planning process. Emphasis was placed on applying ALARA techniques to reduce dose, limit contamination spread, and minimize the amount of radioactive waste generated. Progress on the cleanup has been steady, and Hanford workers have learned to use different types of engineered controls and ALARA techniques to perform radiological work. The purpose of this presentation is to share the lessons learned on how Hanford is accomplishing radiological work.

  9. Wavelets, Curvelets and Multiresolution Analysis Techniques Applied to Implosion Symmetry Characterization of ICF Targets

    CERN Document Server

    Afeyan, Bedros; Starck, Jean Luc; Cuneo, Michael

    2012-01-01

    We introduce wavelets, curvelets and multiresolution analysis techniques to assess the symmetry of X-ray driven imploding shells in ICF targets. After denoising images produced by X-ray backlighting, we determine the Shell Thickness Averaged Radius (STAR) of maximum density, r*(N, θ), where N is the percentage of the shell thickness over which to average. The non-uniformities of r*(N, θ) are quantified by a Legendre polynomial decomposition in angle, θ. Undecimated wavelet decompositions outperform decimated ones in denoising, and both are surpassed by the curvelet transform. In each case, hard thresholding based on noise modeling is used. We have also applied combined wavelet and curvelet filter techniques with variational minimization as a way to select the significant coefficients. Gains are minimal over curvelets alone in the images we have analyzed.
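
    A hedged sketch of the hard-thresholding wavelet denoising step, using PyWavelets; the curvelet stage and the paper's noise model are omitted, and the universal threshold is a standard stand-in assumption.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_hard_denoise(img, wavelet="db4", levels=3):
    coeffs = pywt.wavedec2(img, wavelet, level=levels)
    # Estimate noise sigma from the finest diagonal detail band (MAD estimator).
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    # Universal threshold sigma * sqrt(2 log n); a stand-in for noise modeling.
    t = sigma * np.sqrt(2 * np.log(img.size))
    new_coeffs = [coeffs[0]] + [
        tuple(pywt.threshold(c, t, mode="hard") for c in band)
        for band in coeffs[1:]
    ]
    return pywt.waverec2(new_coeffs, wavelet)

noisy = np.random.default_rng(1).normal(0.0, 1.0, (64, 64))
print(wavelet_hard_denoise(noisy).shape)
```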

  10. A comparative assessment of texture analysis techniques applied to bone tool use-wear

    Science.gov (United States)

    Watson, Adam S.; Gleason, Matthew A.

    2016-06-01

    The study of bone tools, a specific class of artifacts often essential to perishable craft production, provides insight into industries otherwise largely invisible archaeologically. Building on recent breakthroughs in the analysis of microwear, this research applies confocal laser scanning microscopy and texture analysis techniques drawn from the field of surface metrology to identify use-wear patterns on experimental and archaeological bone artifacts. Our approach utilizes both conventional parameters and multi-scale geometric characterizations of the areas of worn surfaces to identify statistical similarities as a function of scale. The introduction of this quantitative approach to the study of microtopography holds significant potential for advancement in use-wear studies by reducing inter-observer variability and identifying new parameters useful in the detection of differential wear-patterns.

  11. Spatial analysis techniques applied to uranium prospecting in Chihuahua State, Mexico

    Science.gov (United States)

    Hinojosa de la Garza, Octavio R.; Montero Cabrera, María Elena; Sanín, Luz H.; Reyes Cortés, Manuel; Martínez Meyer, Enrique

    2014-07-01

    To estimate the distribution of uranium minerals in Chihuahua, the advanced statistical model "Maximum Entropy Method" (MaxEnt) was applied. A distinguishing feature of this method is that it can fit complex models even with small datasets, as is the case for the locations of uranium ores in the State of Chihuahua. For georeferencing uranium ores, a database from the United States Geological Survey and a workgroup of experts in Mexico was used. The main contribution of this paper is the proposal of maximum entropy techniques to obtain the mineral's potential distribution. The model used 24 environmental layers, including topography, gravimetry, climate (WorldClim), soil properties, and others, to project the uranium distribution across the study area. For validation of the locations predicted by the model, comparisons were made with other research by the Mexican Geological Survey, with direct exploration of specific areas, and through interviews with former exploration workers of the company "Uranio de Mexico". Results: new uranium areas predicted by the model were validated, and some relationship was found between the model predictions and geological faults. Conclusions: modeling by spatial analysis provides additional information to the energy and mineral resources sectors.
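
    MaxEnt itself is dedicated software, but a commonly used stand-in for presence-only modeling is regularized logistic regression on presence versus random background points. The sketch below uses hypothetical environmental layers and invented site data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_layers = 24                                      # e.g. topography, climate...
presence = rng.normal(1.0, 1.0, (60, n_layers))    # env. values at known ore sites
background = rng.normal(0.0, 1.0, (1000, n_layers))  # random background points

X = np.vstack([presence, background])
y = np.r_[np.ones(len(presence)), np.zeros(len(background))]

clf = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)
# Relative suitability for new locations (a grid of environmental vectors).
grid = rng.normal(0.0, 1.0, (5, n_layers))
print(clf.predict_proba(grid)[:, 1].round(3))
```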

  12. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    2010-01-01

    Basic text for graduate and advanced undergraduate students deals with the search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other chapters are devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, and more.

  13. Downscaling Statistical Model Techniques for Climate Change Analysis Applied to the Amazon Region

    Directory of Open Access Journals (Sweden)

    David Mendes

    2014-01-01

    The Amazon is an area covered predominantly by dense tropical rainforest with relatively small inclusions of several other types of vegetation. In recent decades, scientific research has suggested a strong link between the health of the Amazon and the integrity of the global climate: tropical forests and woodlands (e.g., savannas) exchange vast amounts of water and energy with the atmosphere and are thought to be important in controlling local and regional climates. Considering the importance of the Amazon biome to global climate change impacts, the role of protected areas in the conservation of biodiversity, and the state of the art of downscaling techniques based on ANNs, this work calibrates and runs a downscaling model based on an Artificial Neural Network (ANN), applied to the Amazon region in order to obtain regional and local predicted climate data (e.g., precipitation). The ANN output shows good agreement with observations in the cities of Belém and Manaus, with correlations of approximately 88.9% and 91.3% respectively, and a good fit of the spatial distribution, especially after the correction process.
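
    A minimal sketch of the ANN downscaling idea: learn a mapping from coarse-grid model predictors to local precipitation. The predictors, network size, and synthetic data are assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = rng.normal(size=(2000, 8))    # e.g. GCM humidity, winds, pressure fields
# Synthetic station rainfall loosely driven by the predictors
y = np.maximum(0.0, X @ rng.normal(size=8) + rng.normal(size=2000))

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X[:1500], y[:1500])
# Correlation between downscaled and "observed" local precipitation
print(np.corrcoef(model.predict(X[1500:]), y[1500:])[0, 1].round(3))
```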

  14. A new technique for fractal analysis applied to human, intracerebrally recorded, ictal electroencephalographic signals.

    Science.gov (United States)

    Bullmore, E; Brammer, M; Alarcon, G; Binnie, C

    1992-11-09

    Application of a new method of fractal analysis to human, intracerebrally recorded, ictal electroencephalographic (EEG) signals is reported. 'Frameshift-Richardson' (FR) analysis involves estimation of the fractal dimension (1 < FD < 2) of successive segments of EEG data; it is suggested that this technique offers significant operational advantages over the use of algorithms for FD estimation requiring preliminary reconstruction of EEG data in phase space. FR analysis was found to reduce substantially the volume of EEG data, without loss of diagnostically important information concerning onset, propagation and evolution of ictal EEG discharges. Arrhythmic EEG events were correlated with relatively increased FD; rhythmic EEG events with relatively decreased FD. It is proposed that development of this method may lead to: (i) enhanced definition and localisation of initial ictal changes in the EEG presumed due to multi-unit activity; and (ii) synoptic visualisation of long periods of EEG data.
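
    The Frameshift-Richardson estimator itself is not reproduced here; the sketch below uses Higuchi's classic windowed estimator, a different but standard method that likewise yields 1 < FD < 2 for EEG-like traces.

```python
import numpy as np

def higuchi_fd(x, kmax=8):
    """Higuchi fractal dimension of a 1-D signal segment."""
    n = len(x)
    lk = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            if len(idx) < 2:
                continue
            # Normalized curve length of the subsampled series
            l = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(l)
        lk.append(np.mean(lengths))
    # Slope of log L(k) vs log(1/k) estimates FD
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(lk), 1)
    return slope

rng = np.random.default_rng(4)
print(higuchi_fd(np.cumsum(rng.normal(size=1024))))  # ~1.5 for a random walk
```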

  15. Multivariate calibration techniques applied to NIRA (near infrared reflectance analysis) and FTIR (Fourier transform infrared) data

    Science.gov (United States)

    Long, C. L.

    1991-02-01

    Multivariate calibration techniques can reduce the time required for routine testing and can provide new methods of analysis. Multivariate calibration is commonly used with near infrared reflectance analysis (NIRA) and Fourier transform infrared (FTIR) spectroscopy. Two feasibility studies were performed to determine the capability of NIRA, using multivariate calibration techniques, to perform analyses on the types of samples that are routinely analyzed at this laboratory. The first study performed included a variety of samples and indicated that NIRA would be well-suited to perform analyses on selected materials properties such as water content and hydroxyl number on polyol samples, epoxy content on epoxy resins, water content of desiccants, and the amine values of various amine cure agents. A second study was performed to assess the capability of NIRA to perform quantitative analysis of hydroxyl numbers and water contents of hydroxyl-containing materials. Hydroxyl number and water content were selected for determination because these tests are frequently run on polyol materials and the hydroxyl number determination is time consuming. This study pointed out the necessity of obtaining calibration standards identical to the samples being analyzed for each type of polyol or other material being analyzed. Multivariate calibration techniques are frequently used with FTIR data to determine the composition of a large variety of complex mixtures. A literature search indicated many applications of multivariate calibration to FTIR data. Areas identified where quantitation by FTIR would provide a new capability are quantitation of components in epoxy and silicone resins, polychlorinated biphenyls (PCBs) in oils, and additives to polymers.
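
    A typical multivariate calibration of this kind can be sketched with partial least squares, relating spectra to a measured property such as hydroxyl number; the synthetic spectra below are placeholders, and PLS is one common choice rather than the report's specific method.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
n_samples, n_wavelengths = 40, 200
spectra = rng.normal(size=(n_samples, n_wavelengths))
# Hypothetical property driven by a few spectral bands plus noise
hydroxyl_number = (spectra[:, 50] * 3.0 + spectra[:, 120]
                   + rng.normal(scale=0.1, size=n_samples))

# Calibrate on 30 standards, predict the remaining 10
pls = PLSRegression(n_components=3).fit(spectra[:30], hydroxyl_number[:30])
pred = pls.predict(spectra[30:]).ravel()
print(np.corrcoef(pred, hydroxyl_number[30:])[0, 1].round(3))
```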

  16. A Comparative Analysis of the 'Green' Techniques Applied for Polyphenols Extraction from Bioresources.

    Science.gov (United States)

    Talmaciu, Adina Iulia; Volf, Irina; Popa, Valentin I

    2015-11-01

    Of all the valuable biomass extractives, polyphenols are a widespread group of secondary metabolites found in all plants, representing the most desirable phytochemicals due to their potential to be used as additives in the food industry, cosmetics, medicine, and other fields. At present, there is increased interest in recovering them from plants of the spontaneous flora, cultivated plants, and wastes resulting from agriculture and the food industry. That is why many efforts have been made to provide highly sensitive, efficient, and eco-friendly methods for the extraction of polyphenols, in line with green chemistry and sustainable development concepts. Many extraction procedures are known, each with advantages and disadvantages. For these reasons, the aim of this article is to provide a comparative analysis of the technical and economic aspects of the most innovative extraction techniques studied recently: microwave-assisted extraction (MAE), supercritical fluid extraction (SFE), and ultrasound-assisted extraction (UAE).

  17. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association. Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo…

  18. Hyphenated GC-FTIR and GC-MS techniques applied in the analysis of bioactive compounds

    Science.gov (United States)

    Gosav, Steluta; Paduraru, Nicoleta; Praisler, Mirela

    2014-08-01

    The drugs of abuse, which affect human nature and cause numerous crimes, have become a serious problem throughout the world. There are hundreds of amphetamine analogues on the black market. They consist of various alterations of the basic amphetamine molecular structure, which are not yet included in the lists of forbidden compounds although they retain or only slightly modify the hallucinogenic effects of their parent compound. It is this wide variety that makes their identification quite a challenge. A number of analytical procedures for the identification of amphetamines and their analogues have recently been reported. We present the profile of the main hallucinogenic amphetamines obtained with the hyphenated techniques recommended for the identification of illicit amphetamines, i.e. gas chromatography combined with mass spectrometry (GC-MS) and gas chromatography coupled with Fourier transform infrared spectrometry (GC-FTIR). The infrared spectra of the analyzed hallucinogenic amphetamines present some absorption bands (1490 cm⁻¹, 1440 cm⁻¹, 1245 cm⁻¹, 1050 cm⁻¹ and 940 cm⁻¹) that are very stable in position and shape, while their intensity depends on the side-chain substitution. The characteristic ionic fragment of the studied hallucinogenic compounds is the 3,4-methylenedioxybenzyl cation (m/e = 135), which has a small relative abundance (less than 20%). The complementarity of the above-mentioned techniques for the identification of hallucinogenic compounds is discussed.

  19. Digital image processing techniques applied to pressure analysis and morphological features extraction in footprints.

    Science.gov (United States)

    Buchelly, F. J.; Mayorca, D.; Ballarín, V.; Pastore, J.

    2016-04-01

    This paper shows the correlation between foot morphology and pressure distribution on the footplant by means of morphological parameter analysis and pressure calculation. Footprint images were acquired using an optical pedobarograph and then processed to obtain binary masks and grayscale intensity images. Morphological descriptors were obtained from the binary images, and the Hernandez Corvo (HC) index was automatically calculated to determine the type of foot. Pressure distributions were obtained from the grayscale images by establishing a correspondence between light intensity in the footprints and pressure. The pressure analysis consisted of finding the maximum pressure, the mean pressure, and the ratio between them, which determines the uniformity of the distribution. Finally, a high correlation was found between this ratio and the type of foot determined by the HC index.
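
    The pressure-analysis step can be sketched directly: treat gray level as a pressure proxy inside the footprint mask and compute the maximum/mean ratio described above. The linear intensity-to-pressure calibration constant is an assumption.

```python
import numpy as np

def pressure_stats(gray, mask, kpa_per_level=0.5):
    # Hypothetical linear calibration from gray level to pressure (kPa)
    pressure = gray[mask] * kpa_per_level
    p_max, p_mean = pressure.max(), pressure.mean()
    return p_max, p_mean, p_max / p_mean   # ratio near 1 means uniform loading

rng = np.random.default_rng(6)
img = rng.integers(0, 255, size=(100, 60))   # stand-in pedobarograph image
footprint = img > 80                          # crude binary mask
print(pressure_stats(img, footprint))
```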

  20. Morphological analysis of the flippers in the Franciscana dolphin, Pontoporia blainvillei, applying X-ray technique.

    Science.gov (United States)

    Del Castillo, Daniela Laura; Panebianco, María Victoria; Negri, María Fernanda; Cappozzo, Humberto Luis

    2014-07-01

    Pectoral flippers of cetaceans function to provide stability and maneuverability during locomotion. Directional asymmetry (DA) is a common feature among odontocete cetaceans, as is sexual dimorphism (SD). For the first time, DA, allometry, physical maturity, and SD of the flipper skeleton of Pontoporia blainvillei were analyzed by X-ray. The number of carpals, metacarpals and phalanges, and morphometric characters from the humerus, radius, ulna, and digit two were studied in franciscana dolphins from Buenos Aires, Argentina. The number of visible epiphyses and their degree of fusion at the proximal and distal ends of the humerus, radius, and ulna were also analyzed. The flipper skeleton was symmetrical, showing a negative allometric trend, with similar growth patterns in both sexes with the exception of the width of the radius (P ≤ 0.01). SD was found in the number of phalanges of digit two (P ≤ 0.01) and in ulna and digit two lengths: females showed a relatively longer ulna and a relatively shorter digit two, and the opposite occurred in males (P ≤ 0.01). The epiphyseal fusion pattern proved to be a tool for determining the dolphins' age; franciscana dolphins with a mature flipper were at least four years old. This study indicates that the flippers of franciscana dolphins are symmetrical; both sexes show a negative allometric trend; SD is observed in the radius, ulna, and digit two; and the flipper skeleton allows the age class of the dolphins to be determined.

  1. Analysis and Design of International Emission Trading Markets Applying System Dynamics Techniques

    Science.gov (United States)

    Hu, Bo; Pickl, Stefan

    2010-11-01

    The design and analysis of international emission trading markets is an important current challenge. Time-discrete models are needed to understand and optimize these procedures. We give an introduction to this scientific area and present current modeling approaches. Furthermore, we develop a model which is embedded in a holistic problem solution. Measures for energy efficiency are characterized. The economic time-discrete "cap-and-trade" mechanism is influenced by various underlying anticipatory effects. With a systematic dynamic approach these effects can be examined. First numerical results show that fair international emissions trading can only be conducted with the use of protective export duties. Furthermore, a comparatively high price which evokes emission reduction inevitably has an inhibiting effect on economic growth according to our model. As has always been expected, it is not easy to find a balance between economic growth and emission reduction. Our System Dynamics model simulation suggests that substantial changes must take place before international emissions trading markets can contribute to global GHG emissions mitigation.
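
    A toy time-discrete loop in the spirit of the "cap-and-trade" dynamics described; all coefficients and feedback forms are illustrative assumptions, not the authors' System Dynamics model.

```python
# Declining cap, price responding to scarcity, abatement and growth
# responding to price. Every coefficient below is an invented placeholder.
cap, emissions, price = 100.0, 110.0, 10.0
for year in range(10):
    shortage = max(emissions - cap, 0.0)
    price *= 1.0 + 0.02 * shortage          # scarcity pushes the price up
    emissions *= 1.0 - 0.005 * price        # abatement response to price
    growth = 0.03 - 0.001 * price           # high prices inhibit growth
    emissions *= 1.0 + max(growth, -0.02)
    cap *= 0.98                             # cap declines 2% per period
    print(f"year {year}: cap={cap:.1f} emissions={emissions:.1f} "
          f"price={price:.2f}")
```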

  2. A comparison of quantitative reconstruction techniques for PIXE-tomography analysis applied to biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Beasley, D.G., E-mail: dgbeasley@ctn.ist.utl.pt [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Alves, L.C. [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Barberet, Ph.; Bourret, S.; Devès, G.; Gordillo, N.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Le Trequesser, Q. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Institut de Chimie de la Matière Condensée de Bordeaux (ICMCB, UPR9048) CNRS, Université de Bordeaux, 87 avenue du Dr. A. Schweitzer, Pessac F-33608 (France); Marques, A.C. [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Silva, R.C. da [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal)

    2014-07-15

    The tomographic reconstruction of biological specimens requires robust algorithms, able to deal with low density contrast and low element concentrations. At the IST/ITN microprobe facility new GPU-accelerated reconstruction software, JPIXET, has been developed, which can significantly increase the speed of quantitative reconstruction of Proton Induced X-ray Emission Tomography (PIXE-T) data. It has a user-friendly graphical user interface for pre-processing, data analysis and reconstruction of PIXE-T and Scanning Transmission Ion Microscopy Tomography (STIM-T). The reconstruction of PIXE-T data is performed using either an algorithm based on a GPU-accelerated version of the Maximum Likelihood Expectation Maximisation (MLEM) method or a GPU-accelerated version of the Discrete Image Space Reconstruction Algorithm (DISRA) (Sakellariou (2001) [2]). The original DISRA, its accelerated version, and the MLEM algorithm, were compared for the reconstruction of a biological sample of Caenorhabditis elegans – a small worm. This sample was analysed at the microbeam line of the AIFIRA facility of CENBG, Bordeaux. A qualitative PIXE-T reconstruction was obtained using the CENBG software package TomoRebuild (Habchi et al. (2013) [6]). The effects of pre-processing and experimental conditions on the elemental concentrations are discussed.
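
    A minimal MLEM update, the reconstruction algorithm named above; the dense random system matrix is a placeholder, whereas real PIXE-T codes build it from ray tracing and X-ray attenuation.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Maximum Likelihood Expectation Maximisation for y ~ Poisson(A x)."""
    x = np.ones(A.shape[1])                  # flat initial image
    sens = A.T @ np.ones(A.shape[0])         # sensitivity (column sums)
    for _ in range(n_iter):
        proj = A @ x
        ratio = y / np.maximum(proj, 1e-12)  # measured / estimated counts
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

rng = np.random.default_rng(7)
A = rng.uniform(size=(120, 64))              # 120 measurements, 64 pixels
truth = rng.uniform(size=64)
y = rng.poisson(A @ truth * 50) / 50.0       # noisy projection data
print(np.round(mlem(A, y)[:8], 3))
```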

  3. Analysis of Arbitrary Reflector Antennas Applying the Geometrical Theory of Diffraction Together with the Master Points Technique

    Directory of Open Access Journals (Sweden)

    María Jesús Algar

    2013-01-01

    An efficient approach for the analysis of arbitrarily fed, surface-conformed reflector antennas is presented. The near field at a large number of sampling points in the aperture of the reflector is obtained by applying the Geometrical Theory of Diffraction (GTD). A new technique named Master Points has been developed to reduce the complexity of the ray-tracing computations. The combination of GTD and Master Points reduces the time requirements of this kind of analysis. To validate the new approach, several reflectors and the effects on the radiation pattern caused by shifting the feed and introducing different obstacles have been considered, concerning both simple and complex geometries. The results of these analyses have been compared with Method of Moments (MoM) results.

  4. Multivariate class modeling techniques applied to multielement analysis for the verification of the geographical origin of chili pepper.

    Science.gov (United States)

    Naccarato, Attilio; Furia, Emilia; Sindona, Giovanni; Tagarelli, Antonio

    2016-09-01

    Four class-modeling techniques (soft independent modeling of class analogy (SIMCA), unequal dispersed classes (UNEQ), potential functions (PF), and multivariate range modeling (MRM)) were applied to multielement distributions to build chemometric models able to authenticate chili pepper samples grown in Calabria with respect to those grown outside Calabria. The multivariate techniques were applied considering both all the variables (32 elements: Al, As, Ba, Ca, Cd, Ce, Co, Cr, Cs, Cu, Dy, Fe, Ga, La, Li, Mg, Mn, Na, Nd, Ni, Pb, Pr, Rb, Sc, Se, Sr, Tl, Tm, V, Y, Yb, Zn) and variables selected by means of stepwise linear discriminant analysis (S-LDA). In the first case, satisfactory and comparable results in terms of cross-validation (CV) efficiency are obtained with SIMCA and MRM (82.3% and 83.2% respectively), whereas MRM performs better than SIMCA in terms of forced model efficiency (96.5%). The selection of variables by S-LDA permitted the building of models characterized, in general, by higher efficiency. MRM again provided the best results for CV efficiency (87.7%, with an effective balance of sensitivity and specificity) as well as forced model efficiency (96.5%).
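
    One class-modeling idea can be sketched SIMCA-style: fit a PCA model to the target class and accept test samples whose reconstruction residual stays under a threshold set on the training data. The element data and the 95% cut-off are invented; the paper's implementations differ in detail.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
calabrian = rng.normal(0.0, 1.0, (80, 32))   # 32 element concentrations
outside = rng.normal(0.8, 1.2, (40, 32))     # samples grown elsewhere

pca = PCA(n_components=5).fit(calabrian)     # class model for "Calabrian"

def residual(X):
    recon = pca.inverse_transform(pca.transform(X))
    return ((X - recon) ** 2).sum(axis=1)    # Q-type reconstruction residual

threshold = np.quantile(residual(calabrian), 0.95)  # 95% acceptance region
print("in-class accepted:", (residual(calabrian) <= threshold).mean())
print("out-of-class rejected:", (residual(outside) > threshold).mean())
```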

  5. Computational optimization techniques applied to microgrids planning

    DEFF Research Database (Denmark)

    Gamarra, Carlos; Guerrero, Josep M.

    2015-01-01

    appear along the planning process. In this context, the technical literature on optimization techniques applied to microgrid planning has been reviewed, and guidelines for innovative planning methodologies focused on economic feasibility can be defined. Finally, some trending techniques and new…

  6. Judging complex movement performances for excellence: a principal components analysis-based technique applied to competitive diving.

    Science.gov (United States)

    Young, Cole; Reinkensmeyer, David J

    2014-08-01

    Athletes rely on subjective assessment of complex movements from coaches and judges to improve their motor skills. In some sports, such as diving, snowboard half pipe, gymnastics, and figure skating, subjective scoring forms the basis for competition. It is currently unclear whether this scoring process can be mathematically modeled; doing so could provide insight into what motor skill is. Principal components analysis (PCA) has been proposed as a motion analysis method for identifying fundamental units of coordination. We used PCA to analyze movement quality of dives taken from USA Diving's 2009 World Team Selection Camp, first identifying eigenpostures associated with dives, and then using the eigenpostures and their temporal weighting coefficients, as well as elements commonly assumed to affect scoring - gross body path, splash area, and board tip motion - to identify eigendives. Within this eigendive space we predicted actual judges' scores using linear regression. This technique rated dives with accuracy comparable to the human judges. The temporal weighting of the eigenpostures, body center path, splash area, and board tip motion affected the score, but not the eigenpostures themselves. These results illustrate that (1) subjective scoring in a competitive diving event can be mathematically modeled; (2) the elements commonly assumed to affect dive scoring actually do affect scoring; and (3) skill in elite diving is more associated with the gross body path and the effect of the movement on the board and water than with the units of coordination that PCA extracts, which might reflect the high level of technique these divers had achieved. We also illustrate how eigendives can be used to produce dive animations that an observer can distort continuously from poor to excellent, which is a novel approach to performance visualization.
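
    A sketch of the eigenposture pipeline as described: PCA over posture vectors yields eigenpostures, and their temporal weights, together with extra cues, feed a linear regression predicting judges' scores. All shapes and data are invented stand-ins.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(9)
n_dives, n_frames, n_joints = 60, 100, 12
postures = rng.normal(size=(n_dives * n_frames, n_joints))  # joint angles

pca = PCA(n_components=4).fit(postures)            # "eigenpostures"
weights = pca.transform(postures).reshape(n_dives, n_frames, 4)
features = weights.reshape(n_dives, -1)            # temporal weighting coeffs
extras = rng.normal(size=(n_dives, 3))             # body path, splash, board tip
X = np.hstack([features, extras])
scores = rng.uniform(4, 9, n_dives)                # hypothetical judges' scores

reg = LinearRegression().fit(X[:45], scores[:45])  # fit, then score new dives
print(reg.predict(X[45:50]).round(2))
```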

  7. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the

  8. Trends in non-stationary signal processing techniques applied to vibration analysis of wind turbine drive train - A contemporary survey

    Science.gov (United States)

    Uma Maheswari, R.; Umamaheswari, R.

    2017-02-01

    Condition Monitoring Systems (CMS) provide substantial economic benefits and enable prognostic maintenance for wind turbine-generator failure prevention. Vibration monitoring and analysis is a powerful tool in drive train CMS, enabling the early detection of impending failure or damage. In variable speed drives such as wind turbine-generator drive trains, the acquired vibration signal is non-stationary and non-linear. Traditional stationary signal processing techniques are inefficient for diagnosing machine faults under time-varying conditions. Current research in CMS for drive trains focuses on developing and improving non-linear, non-stationary feature extraction and fault classification algorithms to improve fault detection/prediction sensitivity and selectivity, thereby reducing misdetection and false alarm rates. In the literature, stationary signal processing algorithms employed in vibration analysis have been reviewed at great length. In this paper, an attempt is made to review recent research advances in non-linear, non-stationary signal processing algorithms particularly suited to variable speed wind turbines.

  9. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  10. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...

  11. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis; namely Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated to communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  12. Analysis of the Galvanostatic Intermittent Titration Technique (GITT) as applied to a lithium-ion porous electrode

    Science.gov (United States)

    Dees, Dennis W.; Kawauchi, Shigehiro; Abraham, Daniel P.; Prakash, Jai

    Galvanostatic Intermittent Titration Technique (GITT) experiments were conducted to determine the lithium diffusion coefficient of LiNi0.8Co0.15Al0.05O2, used as the active material in a lithium-ion battery porous composite positive electrode. An electrochemical model, based on concentrated solution porous electrode theory, was developed to analyze the GITT experimental results and compare them to the original GITT analytical theory. The GITT experimental studies on the oxide active material were conducted between 3.5 and 4.5 V vs. lithium, with the maximum lithium diffusion coefficient value being 10⁻¹⁰ cm² s⁻¹ at 3.85 V. The lithium diffusion coefficient values obtained from this study agree favorably with the values obtained from an earlier electrochemical impedance spectroscopy study.
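
    The classical GITT relation of Weppner and Huggins, which analyses like this one build on (the paper's porous-electrode model refines it), can be evaluated directly; the numbers below are placeholders, not the paper's data.

```python
import math

def gitt_diffusivity(tau, m_B, V_M, M_B, S, dE_s, dE_t):
    """D = (4 / (pi*tau)) * (m_B*V_M / (M_B*S))**2 * (dE_s/dE_t)**2,
    valid for pulse time tau << L**2 / D (Weppner & Huggins)."""
    return (4.0 / (math.pi * tau)) * (m_B * V_M / (M_B * S)) ** 2 \
        * (dE_s / dE_t) ** 2

# Hypothetical 10-minute current pulse on an oxide electrode
D = gitt_diffusivity(tau=600.0,    # s, pulse duration
                     m_B=0.01,     # g, active material mass
                     V_M=20.0,     # cm^3/mol, molar volume
                     M_B=96.0,     # g/mol, molar mass
                     S=1.0,        # cm^2, electrochemical area
                     dE_s=0.005,   # V, steady-state voltage change
                     dE_t=0.020)   # V, transient voltage change
print(f"D ≈ {D:.2e} cm^2/s")       # order 1e-10 with these placeholders
```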

  13. Ion-selective electrodes for potentiometric determination of ranitidine hydrochloride, applying batch and flow injection analysis techniques.

    Science.gov (United States)

    Issa, Yousry M; Badawy, Sayed S; Mutair, Ali A

    2005-12-01

    New ranitidine hydrochloride (RaCl)-selective electrodes of the conventional polymer membrane type are described. They are based on the incorporation of a ranitidine-tetraphenylborate (Ra-TPB) ion pair or a ranitidine-phosphotungstate (Ra-PT) ion associate in a poly(vinyl chloride) (PVC) membrane plasticized with dioctylphthalate (DOP) or dibutylphthalate (DBP). The electrodes are fully characterized in terms of membrane composition, solution temperature, and pH. The sensors showed fast and stable responses. Nernstian response was found over the concentration ranges of 2.0 × 10⁻⁵ M to 1.0 × 10⁻² M and 1.0 × 10⁻⁵ M to 1.0 × 10⁻² M for the Ra-TPB electrode, and 1.03 × 10⁻⁵ M to 1.00 × 10⁻² M and 1.0 × 10⁻⁵ M to 1.0 × 10⁻² M for the Ra-PT electrode, in batch and FIA systems respectively. The electrodes exhibit good selectivity for RaCl with respect to a large number of common ions, sugars, amino acids, and components other than ranitidine hydrochloride in the investigated mixed drugs. The electrodes have been applied to the potentiometric determination of RaCl in pure solutions and in pharmaceutical preparations under batch and flow injection conditions, with lower detection limits of 1.26 × 10⁻⁵ M and 5.62 × 10⁻⁶ M at 25 ± 1 °C. Average recoveries of 100.91% and 100.42% with relative standard deviations of 0.72% and 0.53% have been achieved.

  14. Applying DEA Technique to Library Evaluation in Academic Research Libraries.

    Science.gov (United States)

    Shim, Wonsik

    2003-01-01

    This study applied an analytical technique called Data Envelopment Analysis (DEA) to calculate the relative technical efficiency of 95 academic research libraries, all members of the Association of Research Libraries. DEA, with the proper model of library inputs and outputs, can reveal best practices in the peer groups, as well as the technical…

  15. Influence of elemental concentration in soil on vegetables applying analytical nuclear techniques: k₀-instrumental neutron activation analysis and radiometry

    Energy Technology Data Exchange (ETDEWEB)

    Menezes, Maria Angela de B.C. [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil). Servico de Reator e Irradiacao]. E-mail: menezes@cdtn.br; Mingote, Raquel Maia [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil). Servico de Quimica e Radioquimica; Silva, Lucilene Guerra e; Pedrosa, Lorena Gomes [Minas Gerais Univ., Belo Horizonte, MG (Brazil). Faculdade de Farmacia

    2005-07-01

    Samples from two vegetable gardens were analysed with the aim of determining their elemental concentrations. The vegetables selected for study are grown by local people for their own use and are present in the daily meal. One vegetable garden studied is close to a mining activity in a region inserted in the Iron Quadrangle (Quadrilatero Ferrifero), located in the Brazilian state of Minas Gerais. This region is considered one of the richest mineral-bearing regions in the world. The other vegetable garden studied is far from this region and without any mining activity; it was studied as a comparison site. This assessment was carried out to evaluate the elemental concentrations in soil and vegetables, matrixes connected with the food chain, applying k₀-Instrumental Neutron Activation Analysis (k₀-INAA) at the Laboratory for Neutron Activation Analysis. However, this work reports only the results for thorium, uranium and rare earths obtained in samples collected during the dry season, focusing on the influence of these elements on the vegetables' elemental composition. Results of natural radioactivity, determined by gross alpha and gross beta measurements, are also reported. This study is related to the BRA 11920 project, entitled 'Iron Quadrangle, Brazil: assessment of health impact caused by mining pollutants through the food chain applying nuclear and related techniques', one of the research projects coordinated by the IAEA (Vienna, Austria). (author)

  16. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  17. Applied time series analysis

    CERN Document Server

    Woodward, Wayne A; Elliott, Alan C

    2011-01-01

    ""There is scarcely a standard technique that the reader will find left out … this book is highly recommended for those requiring a ready introduction to applicable methods in time series and serves as a useful resource for pedagogical purposes.""-International Statistical Review (2014), 82""Current time series theory for practice is well summarized in this book.""-Emmanuel Parzen, Texas A&M University""What an extraordinary range of topics covered, all very insightfully. I like [the authors'] innovations very much, such as the AR factor table.""-David Findley, U.S. Census Bureau (retired)""…

  18. Assessment of Coastal and Urban Flooding Hazards Applying Extreme Value Analysis and Multivariate Statistical Techniques: A Case Study in Elwood, Australia

    Science.gov (United States)

    Guimarães Nobre, Gabriela; Arnbjerg-Nielsen, Karsten; Rosbjerg, Dan; Madsen, Henrik

    2016-04-01

    Traditionally, flood risk assessment studies have been carried out from a univariate frequency analysis perspective. However, statistical dependence between hydrological variables, such as extreme rainfall and extreme sea surge, is plausible, since both variables are to some extent driven by common meteorological conditions. Aiming to overcome this limitation, multivariate statistical techniques have the potential to combine different sources of flooding in the investigation. The aim of this study was to apply a range of statistical methodologies for analyzing combined extreme hydrological variables that can lead to coastal and urban flooding. The study area is the Elwood Catchment, a highly urbanized catchment located in the city of Port Phillip, Melbourne, Australia. The first part of the investigation dealt with the marginal extreme value distributions. Two approaches to extracting extreme value series were applied (Annual Maximum and Partial Duration Series), and different probability distribution functions were fitted to the observed samples. Results obtained using the Generalized Pareto distribution demonstrate the ability of the Pareto family to model the extreme events. Advancing into multivariate extreme value analysis, an investigation regarding the asymptotic properties of extremal dependence was first carried out. As a weak positive asymptotic dependence between the bivariate extreme pairs was found, the conditional method proposed by Heffernan and Tawn (2004) was chosen. This approach is suitable for modeling bivariate extreme values which are relatively unlikely to occur together. The results show that the probability of an extreme sea surge occurring during a one-hour-intensity extreme precipitation event (or vice versa) can be twice as great as would occur when assuming independent events. Therefore, presuming independence between these two variables would result in severe underestimation of the flooding risk in the study area.
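
    The Partial Duration Series / Generalized Pareto step can be sketched with scipy: keep exceedances over a high threshold, fit a GPD, and read off a return level. The synthetic surge series and the 98% threshold are assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
surge = rng.gumbel(loc=0.3, scale=0.15, size=20 * 365)  # daily sea surge maxima, m

u = np.quantile(surge, 0.98)                  # threshold for the PDS
exceed = surge[surge > u] - u
shape, loc, scale = stats.genpareto.fit(exceed, floc=0.0)

# 100-year return level from the peaks-over-threshold model
rate = len(exceed) / 20.0                     # mean exceedances per year
ret = u + stats.genpareto.ppf(1 - 1 / (100 * rate), shape, loc=0, scale=scale)
print(f"xi={shape:.2f}, sigma={scale:.3f}, 100-yr surge ≈ {ret:.2f} m")
```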

  19. Nuclear radioactive techniques applied to materials research

    CERN Document Server

    Correia, João Guilherme; Wahl, Ulrich

    2011-01-01

    In this paper we review materials characterization techniques using radioactive isotopes at the ISOLDE/CERN facility. At ISOLDE intense beams of chemically clean radioactive isotopes are provided by selective ion-sources and high-resolution isotope separators, which are coupled on-line with particle accelerators. There, new experiments are performed by an increasing number of materials researchers, which use nuclear spectroscopic techniques such as Mössbauer, Perturbed Angular Correlations (PAC), beta-NMR and Emission Channeling with short-lived isotopes not available elsewhere. Additionally, diffusion studies and traditionally non-radioactive techniques as Deep Level Transient Spectroscopy, Hall effect and Photoluminescence measurements are performed on radioactive doped samples, providing in this way the element signature upon correlation of the time dependence of the signal with the isotope transmutation half-life. Current developments, applications and perspectives of using radioactive ion beams and tech...

  20. Applying Mixed Methods Techniques in Strategic Planning

    Science.gov (United States)

    Voorhees, Richard A.

    2008-01-01

    In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…

  1. Digital Speckle Technique Applied to Flow Visualization

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Digital speckle technique uses a laser, a CCD camera, and digital processing to generate interference fringes at the television framing rate. Its most obvious advantage is that neither darkroom facilities nor photographic wet chemical processing is required. In addition, it can be used in harsh engineering environments. This paper discusses the strengths and weaknesses of three digital speckle methodologies. (1) Digital speckle pattern interferometry (DSPI) uses an optical polarization phase shifter for visualization and measurement of the density field in a flow field. (2) Digital shearing speckle interferometry (DSSI) utilizes speckle-shearing interferometry in addition to optical polarization phase shifting. (3) Digital speckle photography (DSP) with computer reconstruction. The discussion describes the concepts, the principles and the experimental arrangements with some experimental results. The investigation shows that these three digital speckle techniques provide an excellent method for visualizing flow fields and for measuring density distributions in fluid mechanics and thermal flows.

  2. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  3. Correspondence Analysis applied to psychological research

    OpenAIRE

    Laura Doey; Jessica Kurta

    2011-01-01

    Correspondence analysis is an exploratory data technique used to analyze categorical data (Benzecri, 1992). It is used in many areas such as marketing and ecology. Correspondence analysis has been used less often in psychological research, although it can be suitably applied. This article discusses the benefits of using correspondence analysis in psychological research and provides a tutorial on how to perform correspondence analysis using the Statistical Package for the Social Sciences (SPSS).

  4. Correspondence Analysis applied to psychological research

    Directory of Open Access Journals (Sweden)

    Laura Doey

    2011-04-01

    Correspondence analysis is an exploratory data technique used to analyze categorical data (Benzecri, 1992). It is used in many areas such as marketing and ecology. Correspondence analysis has been used less often in psychological research, although it can be suitably applied. This article discusses the benefits of using correspondence analysis in psychological research and provides a tutorial on how to perform correspondence analysis using the Statistical Package for the Social Sciences (SPSS).
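
    The computation behind correspondence analysis is a singular value decomposition of the standardized residuals of a contingency table, as sketched below with an invented table; SPSS wraps the same algebra.

```python
import numpy as np

N = np.array([[20, 5, 15],     # rows: groups, cols: categorical responses
              [10, 25, 5],
              [5, 10, 30]], dtype=float)
P = N / N.sum()
r, c = P.sum(axis=1), P.sum(axis=0)
# Standardized residuals S = Dr^-1/2 (P - r c^T) Dc^-1/2
S = (P - np.outer(r, c)) / np.sqrt(np.outer(r, c))
U, sv, Vt = np.linalg.svd(S, full_matrices=False)

row_coords = U * sv / np.sqrt(r)[:, None]   # principal row coordinates
inertia = sv ** 2                           # variance per dimension
print("inertia explained:", (inertia / inertia.sum()).round(3))
print("row coordinates:\n", row_coords[:, :2].round(3))
```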

  5. Applying Cooperative Techniques in Teaching Problem Solving

    Directory of Open Access Journals (Sweden)

    Krisztina Barczi

    2013-12-01

    Teaching how to solve problems – from solving simple equations to solving difficult competition tasks – has been one of the greatest challenges for mathematics education for many years. Trying to find an effective method is an important educational task. Among others, the question arises as to whether a method in which students help each other might be useful. The present article describes part of an experiment that was designed to determine the effects of cooperative teaching techniques on the development of problem-solving skills.

  6. Technique applied in electrical power distribution for Satellite Launch Vehicle

    Directory of Open Access Journals (Sweden)

    João Maurício Rosário

    2010-09-01

    The Satellite Launch Vehicle electrical network, which is currently being developed in Brazil, is sub-divided for analysis into the following parts: Service Electrical Network, Controlling Electrical Network, Safety Electrical Network and Telemetry Electrical Network. During the pre-launching and launching phases, these electrical networks are associated electrically and mechanically to the structure of the vehicle. In order to succeed in the integration of these electrical networks, it is necessary to employ electrical power distribution techniques appropriate to Launch Vehicle systems. This work presents the most important techniques to be considered in the characterization of the electrical power supply applied to Launch Vehicle systems. Such techniques are primarily designed to ensure that the electrical networks, when submitted to a single-phase fault to ground, can keep supplying power to the loads.

  7. Autoadjustable sutures and modified seldinger technique applied to laparoscopic jejunostomy.

    Science.gov (United States)

    Pili, Diego; Ciotola, Franco; Riganti, Juan Martín; Badaloni, Adolfo; Nieponice, Alejandro

    2015-02-01

    This is a simple technique to be applied to those patients requiring an alternative feeding method. This technique has been successfully applied to 25 patients suffering from esophageal carcinoma. The procedure involves laparoscopic approach, suture of the selected intestinal loop to the abdominal wall and jejunostomy using Seldinger technique and autoadjustable sutures. No morbidity or mortality was reported.

  8. PIXE technique applied to Almeida Junior materials

    Energy Technology Data Exchange (ETDEWEB)

    Pascholati, Paulo R.; Rizzutto, Marcia A.; Neves, Graziela; Tabacniks, Manfredo H.; Moleiro, Guilherme F.; Dias, Flavia A. [Universidade de Sao Paulo (USP), SP (Brazil). Inst. de Fisica]. E-mails: paschola@if.usp.br; rizzutto@if.usp.br; graziela@if.usp.br; tabacniks@if.usp.br; guimol@if.usp.br; fladias@if.usp.br; Mendonca, Valeria de [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Pinacoteca do Estado de Sao Paulo, Sao Paulo, SP (Brazil); E-mail: vmendonca@pinacoteca.org.br

    2007-07-01

    The Institute of Physics of the University of Sao Paulo, in collaboration with the Pinacoteca do Estado of Sao Paulo, has a project to develop a data bank with information about the elementary composition of pigments of paintings and materials of its collection, for future use in conservation and restoration as well as in authentication. The project is beginning with the materials (palette, paint box and paint tubes) belonging to the painter Almeida Junior. Twenty-three colored spots on the palette were chosen for analysis, together with the paint tubes present in the paint box. Analysis of the PIXE (Particle Induced X-ray Emission) spectra showed that the red colors are dominated by Hg and S, suggesting vermilion, and that the white ones consist mainly of Pb (lead white). The analyzed tubes of the same colors confirm the pigment elements found on the palette. (author)

  9. Data Mining Techniques Applied to Hydrogen Lactose Breath Test

    Science.gov (United States)

    Nepomuceno-Chamorro, Isabel; Pontes-Balanza, Beatriz; Hernández-Mendoza, Yoedusvany; Rodríguez-Herrera, Alfonso

    2017-01-01

    In this work, we present the results of applying data mining techniques to hydrogen breath test data. Disposal of H2 gas is of utmost relevance to maintain efficient microbial fermentation processes. Objectives: Analyze a set of data of hydrogen breath tests by use of data mining tools and identify new patterns of H2 production. Methods: k-means clustering was applied as the data mining technique to a dataset of hydrogen breath tests from 2571 patients. Results: Six different patterns were extracted upon analysis of the hydrogen breath test data. We have also shown the relevance of each of the samples taken throughout the test. Conclusions: Analysis of the hydrogen breath test data sets using data mining techniques has identified new patterns of hydrogen generation upon lactose absorption. We can see the potential of applying data mining techniques to clinical data sets. These results offer promising data for future research on the relations between gut-microbiota-produced hydrogen and its link to clinical symptoms. PMID:28125620
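
    As a rough illustration of the clustering step described above, the sketch below applies k-means with six clusters to synthetic breath test curves. The data, the sampling grid, and the hard-coded k = 6 are invented stand-ins, not the study's dataset or protocol.

        # Hedged sketch: k-means clustering of hydrogen breath test curves.
        # Each row of `tests` holds one patient's H2 readings (ppm) at fixed
        # sampling times; the synthetic data below only mimics the setting.
        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        tests = rng.gamma(shape=2.0, scale=10.0, size=(2571, 9))

        model = KMeans(n_clusters=6, n_init=10, random_state=0).fit(tests)
        print(model.cluster_centers_.round(1))   # six candidate H2 patterns
        print(np.bincount(model.labels_))        # patients per pattern

    On real data, each cluster center can be read as one candidate pattern of H2 production; the number of clusters would come from a model selection step rather than being fixed in advance.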

  10. Conversation Analysis in Applied Linguistics

    DEFF Research Database (Denmark)

    Kasper, Gabriele; Wagner, Johannes

    2014-01-01

    For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned with understanding fundamental issues of talk in action and of intersubjectivity in human conduct. The field has expanded its scope from the analysis of talk—often phone calls—towards an integration of language with other semiotic resources for embodied action, including space and objects. Much of this expansion has been driven by applied work. After laying out CA's standard practices of data treatment and analysis, this article takes up the role of comparison as a fundamental analytical strategy and reviews recent developments into cross-linguistic and cross-cultural directions. The remaining article focuses...

  11. Applied analysis and differential equations

    CERN Document Server

    Cârj, Ovidiu

    2007-01-01

    This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics of recent interest, including the most exciting developments: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.

  12. Analytical techniques applied to study cultural heritage objects

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N., E-mail: rizzutto@if.usp.br [Universidade de Sao Paulo (USP), SP (Brazil). Instituto de Fisica

    2015-07-01

    The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded its studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process and the manufacture of objects. The imaging analysis, mainly used to examine and document artistic and cultural heritage objects, is performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. To further expand the possibilities of analysis, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy that can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and new information on objects from different University of Sao Paulo museums. To improve the arsenal of cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external beam setup.

  13. Tensometry technique for X-ray diffraction in applied analysis of welding; Tensometria por tecnica de difracao de raios X aplicada na analise de soldagens

    Energy Technology Data Exchange (ETDEWEB)

    Turibus, S.N.; Caldas, F.C.M.; Miranda, D.M.; Monine, V.I.; Assis, J.T., E-mail: snturibus@iprj.uerj.b [Universidade do Estado do Rio de Janeiro (IPRJ/UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico

    2010-07-01

    This paper presents the analysis of residual stress introduced by the welding process. As stress in a material can induce damage, it is necessary to have a method to identify this residual stress state. The non-destructive X-ray diffraction technique was used to analyze two plates of A36 steel joined by metal inert gas (MIG) welding. The stress measurements were made by the sin²ψ method in the weld region of the steel plates, including analysis of longitudinal and transverse residual stresses in the fusion zone, the heat affected zone (HAZ) and the base metal. To determine the stress distribution along the depth of the welded material, superficial layers were removed by electropolishing. (author)

  14. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper is a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  15. A New Experimental Technique for Applying Impulse Tension Loading

    OpenAIRE

    Fan, Z. S.; Yu, H. P.; Su, H; Zhang, X.; Li, C. F.

    2016-01-01

    This paper deals with a new experimental technique for applying impulse tension loads. Briefly, the technique is based on the use of pulsed-magnetic-driven tension loading. Electromagnetic forming (EMF) can be quite effective in increasing the forming limits of metal sheets, such as aluminium and magnesium alloys. Yet why the forming limit is increased is still an open question. One reason for this is the difficulty of letting forming proceed monotonically under a single influence: ...

  16. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems, regarded as alternatives. Expert Choice software was used in comparing the importance of the criteria against the main objective, whereas a structured proforma was used in quantifying the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and then multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning; the results of this study indicate that it can also be used to prioritize building systems for maintenance planning.
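
    The aggregation step described above, defect severity scores multiplied by criteria weights and summed into one risk value per system, can be sketched in a few lines. The weights and scores below are invented for illustration, not the paper's elicited values.

        # Hedged sketch of weighted multi-criteria scoring; all numbers invented.
        criteria_weights = {"PC": 0.40, "EA": 0.25, "EO": 0.20, "MC": 0.15}

        systems = {
            "electrical": {"PC": 0.30, "EA": 0.35, "EO": 0.40, "MC": 0.25},
            "ceiling":    {"PC": 0.15, "EA": 0.10, "EO": 0.12, "MC": 0.20},
        }

        risk = {name: sum(criteria_weights[c] * scores[c] for c in criteria_weights)
                for name, scores in systems.items()}
        for name, value in sorted(risk.items(), key=lambda kv: -kv[1]):
            print(f"{name}: {value:.3f}")   # higher risk value -> maintain first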

  17. Conversation Analysis and Applied Linguistics.

    Science.gov (United States)

    Schegloff, Emanuel A.; Koshik, Irene; Jacoby, Sally; Olsher, David

    2002-01-01

    Offers bibliographical guidance on several major areas of conversation-analytic work--turn-taking, repair, and word selection--and indicates past or potential points of contact with applied linguistics. Also discusses areas of applied linguistic work. (Author/VWL)

  18. Sneak analysis applied to process systems

    Science.gov (United States)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid-1970s as a means of identifying such conditions in electric circuits, where it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  19. Sensor Data Qualification Technique Applied to Gas Turbine Engines

    Science.gov (United States)

    Csank, Jeffrey T.; Simon, Donald L.

    2013-01-01

    This paper applies a previously developed sensor data qualification technique to a commercial aircraft engine simulation known as the Commercial Modular Aero-Propulsion System Simulation 40,000 (C-MAPSS40k). The sensor data qualification technique is designed to detect, isolate, and accommodate faulty sensor measurements. It features sensor networks, which group various sensors together and relies on an empirically derived analytical model to relate the sensor measurements. Relationships between all member sensors of the network are analyzed to detect and isolate any faulty sensor within the network.
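
    A minimal sketch of the network idea follows, under the simplifying assumption (mine, not necessarily the paper's) that each sensor can be regressed linearly on the other members of its network; the sensor whose newest reading deviates most from its peer-based prediction is the fault candidate.

        # Hedged sketch: residual-based fault isolation in a sensor network.
        import numpy as np

        def fault_scores(X):
            """X: samples x sensors, last row = newest reading. Returns the
            z-score of each sensor's newest reading against its prediction
            from the other network members (fit on the earlier samples)."""
            z = np.zeros(X.shape[1])
            for j in range(X.shape[1]):
                others = np.delete(X, j, axis=1)
                coef, *_ = np.linalg.lstsq(others[:-1], X[:-1, j], rcond=None)
                resid = X[:-1, j] - others[:-1] @ coef
                z[j] = abs(X[-1, j] - others[-1] @ coef) / (resid.std() + 1e-12)
            return z

        rng = np.random.default_rng(3)
        common = rng.standard_normal((300, 1))             # shared physical state
        X = common + 0.05 * rng.standard_normal((300, 5))  # five redundant sensors
        X[-1, 2] += 1.0                                    # inject fault in sensor 2
        print(np.argmax(fault_scores(X)))                  # -> 2: isolate, accommodate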

  20. Applying recursive numerical integration techniques for solving high dimensional integrals

    CERN Document Server

    Ammon, Andreas; Hartung, Tobias; Jansen, Karl; Leövey, Hernan; Volmer, Julia

    2016-01-01

    The error scaling for Markov-Chain Monte Carlo techniques (MCMC) with $N$ samples behaves like $1/\sqrt{N}$. This scaling often makes it very time-intensive to reduce the error of computed observables, in particular for applications in lattice QCD. It is therefore highly desirable to have alternative methods at hand which show an improved error scaling. One candidate for such an alternative integration technique is the method of recursive numerical integration (RNI). The basic idea of this method is to use an efficient low-dimensional quadrature rule (usually of Gaussian type) and apply it iteratively to integrate over high-dimensional observables and Boltzmann weights. We present the application of such an algorithm to the topological rotor and the anharmonic oscillator and compare the error scaling to MCMC results. In particular, we demonstrate that the RNI technique shows an error scaling in the number of integration points $m$ that is at least exponential.
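
    The iteration at the heart of RNI can be sketched as follows: a one-dimensional Gauss-Legendre rule is applied recursively, one dimension at a time, so the error decays with the number of points per dimension rather than as $1/\sqrt{N}$. The integrand is a toy stand-in, not the topological rotor or anharmonic oscillator of the paper.

        # Hedged sketch of recursive numerical integration over [-1, 1]^dim.
        import numpy as np

        nodes, weights = np.polynomial.legendre.leggauss(8)  # m = 8 per dimension

        def integrate(f, dim, x=()):
            if dim == 0:
                return f(np.array(x))
            return sum(w * integrate(f, dim - 1, x + (t,))
                       for t, w in zip(nodes, weights))

        f = lambda x: np.exp(-np.sum(x**2))   # toy "Boltzmann weight"
        print(integrate(f, dim=4))            # 8**4 deterministic evaluations

    The cost grows as m^dim, so the approach pays off when a modest m per dimension suffices, which is the regime the recursive formulation exploits.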

  1. Volcanic Monitoring Techniques Applied to Controlled Fragmentation Experiments

    Science.gov (United States)

    Kueppers, U.; Alatorre-Ibarguengoitia, M. A.; Hort, M. K.; Kremers, S.; Meier, K.; Scharff, L.; Scheu, B.; Taddeucci, J.; Dingwell, D. B.

    2010-12-01

    ejection and that the evaluated results were mostly in good agreement. We will discuss the technical difficulties encountered, e.g. the temporal synchronisation of the different techniques. Furthermore, the internal data management of the DR at present prevents continuous recording, and only a limited number of snapshots is stored. Nonetheless, in at least three experiments the onset of particle ejection was measured by all the different techniques, which gave coherent results of up to 100 m/s. This is a very encouraging result and of paramount importance, as it proves the applicability of these independent methods to volcano monitoring. Each method by itself may enhance our understanding of the pressurisation state of a volcano, an essential factor in ballistic hazard evaluation and eruption energy estimation. Technical adaptations of the DR will overcome the encountered problems and allow a more refined data analysis during the next campaign.

  2. Evaluation via multivariate techniques of scale factor variability in the rietveld method applied to quantitative phase analysis with X ray powder diffraction

    Directory of Open Access Journals (Sweden)

    Terezinha Ferreira de Oliveira

    2006-12-01

    Full Text Available The present work uses multivariate statistical analysis as a way of establishing the main sources of error in Quantitative Phase Analysis (QPA) using the Rietveld method. The quantitative determination of crystalline phases using X-ray powder diffraction is a complex measurement process whose results are influenced by several factors. Ternary mixtures of Al2O3, MgO and NiO were prepared under controlled conditions and the diffraction patterns were obtained using the Bragg-Brentano geometric arrangement. It was possible to establish four sources of critical variation: the experimental absorption and the scale factor of NiO, which is the phase with the greatest linear absorption coefficient of the ternary mixture; the instrumental characteristics, represented by mechanical errors of the goniometer and sample displacement; the other two phases (Al2O3 and MgO); and the temperature and relative humidity of the air in the laboratory. These error sources can severely impair QPA with the Rietveld method; it is therefore necessary to control them during the measurement procedure.

  3. Applying Supervised Opinion Mining Techniques on Online User Reviews

    Directory of Open Access Journals (Sweden)

    Ion SMEUREANU

    2012-01-01

    Full Text Available In recent years, the spectacular development of web technologies has led to an enormous quantity of user-generated information in online systems. This large amount of information on web platforms makes them viable for use as data sources in applications based on opinion mining and sentiment analysis. The paper proposes an algorithm for detecting sentiments in movie user reviews, based on a naive Bayes classifier. We analyze the opinion mining domain, the techniques used in sentiment analysis and their applicability. We implemented the proposed algorithm, tested its performance, and suggest directions of development.
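
    A minimal classifier in the spirit of the proposed algorithm can be assembled from standard components; the four training reviews below are invented, and the paper's actual corpus and feature design are not reproduced.

        # Hedged sketch: naive Bayes sentiment classification of movie reviews.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        reviews = ["great acting and a moving story",
                   "dull plot, terrible pacing",
                   "wonderful, funny, highly recommended",
                   "boring and predictable"]
        labels = ["pos", "neg", "pos", "neg"]

        clf = make_pipeline(CountVectorizer(), MultinomialNB()).fit(reviews, labels)
        print(clf.predict(["a wonderful story", "predictable and dull"]))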

  4. Bioremediation techniques applied to aqueous media contaminated with mercury.

    Science.gov (United States)

    Velásquez-Riaño, Möritz; Benavides-Otaya, Holman D

    2016-12-01

    In recent years, the environmental and human health impacts of mercury contamination have driven the search for alternative, eco-efficient techniques different from the traditional physicochemical methods for treating this metal. One of these alternative processes is bioremediation. A comprehensive analysis of the different variables that can affect this process is presented. It focuses on determining the effectiveness of different techniques of bioremediation, with a specific consideration of three variables: the removal percentage, time needed for bioremediation and initial concentration of mercury to be treated in an aqueous medium.

  5. Manifold learning techniques and model reduction applied to dissipative PDEs

    CERN Document Server

    Sonday, Benjamin E; Gear, C William; Kevrekidis, Ioannis G

    2010-01-01

    We link nonlinear manifold learning techniques for data analysis/compression with model reduction techniques for evolution equations with time scale separation. In particular, we demonstrate a "nonlinear extension" of the POD-Galerkin approach to obtaining reduced dynamic models of dissipative evolution equations. The approach is illustrated through a reaction-diffusion PDE, and the performance of different simulators on the full and the reduced models is compared. We also discuss the relation of this nonlinear extension with the so-called "nonlinear Galerkin" methods developed in the context of Approximate Inertial Manifolds.
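
    For orientation, the linear method that this "nonlinear extension" builds on, POD-Galerkin, starts from a singular value decomposition of snapshot data; a hedged sketch on synthetic snapshots follows (the data layout and the 99% energy cutoff are illustrative choices, not the paper's setup).

        # Hedged sketch: proper orthogonal decomposition (POD) via the SVD.
        import numpy as np

        rng = np.random.default_rng(1)
        # rows: time snapshots of a discretized PDE field (assumed layout)
        snapshots = rng.standard_normal((200, 50)) @ rng.standard_normal((50, 400))

        U, s, Vt = np.linalg.svd(snapshots - snapshots.mean(0), full_matrices=False)
        k = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.99)) + 1
        basis = Vt[:k]                  # k POD modes capture 99% of the energy
        reduced = snapshots @ basis.T   # projection coordinates for a reduced model
        print(k, reduced.shape)

    The manifold-learning extension replaces this flat k-dimensional subspace with a curved low-dimensional manifold learned from the same snapshots.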

  6. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    Full Text Available What is the usefulness of critical appraisal of literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  7. Essentials of applied dynamic analysis

    CERN Document Server

    Jia, Junbo

    2014-01-01

    This book presents up-to-date knowledge of dynamic analysis in the engineering world. To facilitate understanding of the topics by readers with various backgrounds, general principles are linked to their applications from different angles. Special topics of interest, such as statistics of motions and loading, damping modeling and measurement, nonlinear dynamics, fatigue assessment, vibration and buckling under axial loading, structural health monitoring, human body vibrations, and vehicle-structure interactions, are also presented. The target readers include industry professionals in civil, marine and mechanical engineering, as well as researchers and students in this area.

  8. Signal processing techniques applied to a small circular seismic array

    Science.gov (United States)

    Mosher, C. C.

    1980-03-01

    The travel time method (TTM) for locating earthquakes and the wavefront curvature method (WCM), which determines distance to an event by measuring the curvature of the wavefront, can be combined in a procedure referred to as Apparent Velocity Mapping (AVM). Apparent velocities for mine blasts and local earthquakes computed by the WCM are inverted for a velocity structure. The velocity structure is then used in the TTM to relocate events. Model studies indicate that AVM can adequately resolve the velocity structure for the case of a linear velocity-depth gradient. Surface waves from mine blasts recorded by the Central Minnesota Seismic Array were analyzed using a modification of the multiple filter analysis (MFA) technique to determine group arrival times at several stations of the array. The advantages of array MFA are that the source location need not be known, lateral refraction can be detected and removed, and multiple arrivals can be separated. A modeling procedure that can be used with array MFA is described.
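
    The core of MFA can be sketched as a bank of narrow Gaussian band-pass filters, with the envelope maximum of each band giving the group arrival time at that frequency; the toy trace and the 0.05 Hz filter width below are invented for illustration.

        # Hedged sketch of multiple filter analysis on a toy dispersed wave.
        import numpy as np
        from scipy.signal import hilbert

        fs = 100.0
        t = np.arange(0.0, 60.0, 1.0 / fs)
        trace = (np.sin(2 * np.pi * (0.2 + 0.005 * t) * t)
                 * np.exp(-((t - 30) / 10) ** 2))

        spec = np.fft.rfft(trace)
        freqs = np.fft.rfftfreq(trace.size, d=1.0 / fs)
        for fc in (0.2, 0.3, 0.4):                    # center frequencies, Hz
            band = np.fft.irfft(spec * np.exp(-0.5 * ((freqs - fc) / 0.05) ** 2),
                                n=trace.size)
            envelope = np.abs(hilbert(band))
            print(fc, t[np.argmax(envelope)])         # group arrival per band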

  9. Free Radical Imaging Techniques Applied to Hydrocarbon Flames Diagnosis

    Institute of Scientific and Technical Information of China (English)

    A. Caldeira-Pires

    2001-01-01

    This paper evaluates the utilization of free radical chemiluminescence imaging and tomographic reconstruction techniques to obtain advanced information on reacting flows. Two different laboratory flow configurations were analyzed, including unconfined non-premixed jet flame measurements to evaluate flame fuel/air mixing patterns at the burner port of a typical glass-furnace burner. The second case characterized the reaction zone of premixed flames within gas turbine combustion chambers, based on a laboratory-scale model of a lean premixed prevaporized (LPP) combustion chamber. The analysis shows that advanced imaging diagnosis can provide new information on the characterization of flame mixing and reacting phenomena. The utilization of local C2 and CH chemiluminescence can provide useful information on the quality of the combustion process, which can be used to improve the design of practical combustors.

  10. Modern Techniques and Technologies Applied to Training and Performance Monitoring.

    Science.gov (United States)

    Sands, William A; Kavanaugh, Ashley A; Murray, Steven R; McNeal, Jeni R; Jemni, Monèm

    2016-12-05

    Athlete preparation and performance continue to increase in complexity and cost. Modern coaches are shifting from reliance on personal memory, experience, and opinion to evidence from collected training load data. Training load monitoring may hold vital information for developing monitoring systems that follow the training process with such precision that both performance prediction and day-to-day management of training become an adjunct to preparation and performance. Time series data collection and analysis in sport are still in their infancy, with considerable effort being applied in "big-data" analytics, in models of the appropriate variables to monitor, and in methods for doing so. Training monitoring has already garnered important applications but lacks a theoretical framework from which to develop further. As such, we propose a framework involving the following: analyses of individuals, trend analyses, rules-based analysis, and statistical process control.
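
    Of the four framework elements, statistical process control translates most directly into code; below is a hedged sketch of an individuals control chart that flags a training-load value outside the baseline mean plus or minus three sigma (the loads and the seven-day baseline are invented).

        # Hedged sketch: control chart on a daily training-load series.
        import numpy as np

        load = np.array([410, 395, 420, 405, 398, 415, 402, 530, 408, 399])
        baseline = load[:7]                      # assumed in-control window
        center, sigma = baseline.mean(), baseline.std(ddof=1)

        for day, x in enumerate(load, start=1):
            flag = "OUT" if abs(x - center) > 3 * sigma else "ok"
            print(f"day {day:2d}: {x:5.0f}  {flag}")   # day 8 (530) is flagged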

  11. The Metallic Element Analysis Technique Applied to Alcoholic Products Quality Analysis and Identification

    Institute of Scientific and Technical Information of China (English)

    陈梦琪

    2015-01-01

    Various metallic elements are present in alcoholic beverages. Heavy-metal detection techniques are simple, rapid, accurate and cost-effective, and are therefore widely applied in the physicochemical analysis and quality identification of alcoholic products.

  12. Concept analysis of culture applied to nursing.

    Science.gov (United States)

    Marzilli, Colleen

    2014-01-01

    Culture is an important concept, especially when applied to nursing. A concept analysis of culture is essential to understanding the meaning of the word. This article applies Rodgers' (2000) concept analysis template and provides a definition of the word culture as it applies to nursing practice. This article supplies examples of the concept of culture to aid the reader in understanding its application to nursing and includes a case study demonstrating components of culture that must be respected and included when providing health care.

  13. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray. A robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. [Figure 2: Transfer of a solid with a spatula.] Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  14. Semantic Data And Visualization Techniques Applied To Geologic Field Mapping

    Science.gov (United States)

    Houser, P. I. Q.; Royo-Leon, M.; Munoz, R.; Estrada, E.; Villanueva-Rosales, N.; Pennington, D. D.

    2015-12-01

    Geologic field mapping involves the use of technology before, during, and after visiting a site. Geologists utilize hardware such as Global Positioning Systems (GPS) connected to mobile computing platforms such as tablets, with software such as ESRI's ArcPad, to produce maps and figures for a final analysis and report. Handwritten field notes contain important information and drawings or sketches of specific areas within the field study. Our goal is to collect and geo-tag final and raw field data into a cyber-infrastructure environment with an ontology that allows for large data processing, visualization, sharing, and searching, aiding in connecting field research with prior research in the same area and/or aiding experiment replication. Online searches of a specific field area return results such as weather data from NOAA and QuakeML seismic data from USGS. These results can then be saved to a field mobile device and searched while in the field where there is no Internet connection. To accomplish this we created the GeoField ontology service using the Web Ontology Language (OWL) and Protégé software. Advanced queries on the dataset can be made using reasoning capabilities that go beyond a standard database service. These improvements include the automated discovery of data relevant to a specific field site and visualization techniques aimed at enhancing analysis and collaboration while in the field by draping data over mobile views of the site using augmented reality. A case study is being performed at the University of Texas at El Paso's Indio Mountains Research Station located near Van Horn, Texas, an active multi-disciplinary field study site. The user can interactively move the camera around the study site and view their data digitally. Geologists can check their data against the site in real time and improve collaboration, as both parties have the same interactive view of the data.

  15. Digital prototyping technique applied for redesigning plastic products

    Science.gov (United States)

    Pop, A.; Andrei, A.

    2015-11-01

    After products are on the market for some time, they often need to be redesigned to meet new market requirements. New products are generally derived from similar but outdated products. Redesigning a product is an important part of the production and development process. The purpose of this paper is to show that using modern technology like Digital Prototyping in industry is an effective way to produce new products. The paper demonstrates the effectiveness of the concept of Digital Prototyping both in reducing the design time of a new product and in reducing the costs required for this step. The results show that using Digital Prototyping techniques to design a new product from an existing mould available on the market offers a significant reduction in manufacturing time and cost. The ability to simulate and test a new product with modern CAD-CAM programs in all aspects of production (design of the 3D model, simulation of the structural resistance, analysis of the injection process and beautification) offers a helpful tool for engineers. The whole process can be carried out by one skilled engineer very quickly and effectively.

  16. Remote sensing techniques applied to seismic vulnerability assessment

    Science.gov (United States)

    Juan Arranz, Jose; Torres, Yolanda; Hahgi, Azade; Gaspar-Escribano, Jorge

    2016-04-01

    Advances in remote sensing and photogrammetry techniques have increased the degree of accuracy and resolution in the record of the earth's surface. This has expanded the range of possible applications of these data. In this research, we have used these data to document the construction characteristics of the urban environment of Lorca, Spain. An exposure database has been created with the gathered information to be used in seismic vulnerability assessment. To this end, we have used data from photogrammetric flights at different periods, using both orthorectified images in the visible and infrared spectrum. Furthermore, the analysis is completed using LiDAR data. From the combination of these data, it has been possible to delineate the building footprints and characterize the constructions with attributes such as the approximate date of construction, area, type of roof and even building materials. To carry out the calculation, we have developed different algorithms to compare images from different times, segment images, classify LiDAR data, and use the infrared data in order to remove vegetation or to compute roof surfaces with height value, tilt and spectral fingerprint. In addition, the accuracy of our results has been validated with ground truth data. Keywords: LiDAR, remote sensing, seismic vulnerability, Lorca

  17. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  18. Applying WCET Analysis at Architectural Level

    OpenAIRE

    Gilles, Olivier; Hugues, Jérôme

    2008-01-01

    Real-time embedded systems must enforce strict timing constraints. In this context, achieving a precise Worst-Case Execution Time (WCET) estimate is a prerequisite for applying scheduling analysis and verifying system viability. WCET analysis is usually a complex and time-consuming activity. It becomes increasingly complex when one also considers code generation strategies from high-level models. In this paper, we present an experiment made on the coupling of the WCET analysis tool Bound-T and our AADL to code ...

  19. Triangulation of Data Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Lauri, M

    2011-10-01

    Full Text Available In psychology, as in other disciplines, the concepts of validity and reliability are considered essential to give an accurate interpretation of results. While in quantitative research the idea is well established, in qualitative research validity and reliability take on a different dimension. Researchers like Miles and Huberman (1994) and Silverman (2000, 2001) have shown how these issues are addressed in qualitative research. In this paper I am proposing that the same corpus of data, in this case the transcripts of focus group discussions, can be analysed using more than one data analysis technique. I refer to this idea as 'triangulation of data analysis techniques' and argue that such triangulation increases the reliability of the results. If the results obtained through a particular data analysis technique, for example thematic analysis, are congruent with the results obtained by analysing the same transcripts using a different technique, for example correspondence analysis, it is reasonable to argue that the analysis and interpretation of the data is valid.

  20. [Basics of PCR and related techniques applied in veterinary parasitology].

    Science.gov (United States)

    Ben Abderrazak, S

    2004-01-01

    Through this overall review of the basics of PCR (Polymerase Chain Reaction) techniques, we attempt to introduce the main applications used in veterinary parasitology. A major problem restricting the application possibilities of molecular biology techniques is of a quantitative nature. Amplification techniques represent a real revolution, for they make possible the production of tens, even hundreds of nanograms of sequences when starting from very small quantities. The PCR technique has dramatically transformed the strategies used so far in molecular biology and, subsequently, research and medical diagnosis.

  1. Applying Frequency Map Analysis to the Australian Synchrotron Storage Ring

    CERN Document Server

    Tan, Yaw-Ren E; Le Blanc, Gregory Scott

    2005-01-01

    The technique of frequency map analysis has been applied to study the transverse dynamic aperture of the Australian Synchrotron Storage Ring. The results have been used to set the strengths of sextupoles to optimise the dynamic aperture. The effects of the allowed harmonics in the quadrupoles and dipole edge effects are discussed.
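
    The elementary operation behind frequency map analysis is estimating a betatron tune from turn-by-turn tracking data, typically via the peak of a windowed FFT; the sketch below uses an invented tune value, and production FMA refines the estimate with interpolated schemes such as NAFF.

        # Hedged sketch: tune estimate from turn-by-turn data via windowed FFT.
        import numpy as np

        turns, true_tune = 1024, 0.294
        x = np.cos(2 * np.pi * true_tune * np.arange(turns))  # tracked positions

        spectrum = np.abs(np.fft.rfft(x * np.hanning(turns)))
        tune = np.argmax(spectrum) / turns
        print(f"estimated tune: {tune:.4f}")

    Repeating the estimate for the two halves of the tracking data and mapping the tune drift over a grid of initial amplitudes yields the frequency (diffusion) map used to judge the dynamic aperture.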

  2. Difficulties applying recent blind source separation techniques to EEG and MEG

    CERN Document Server

    Knuth, Kevin H

    2015-01-01

    High temporal resolution measurements of human brain activity can be performed by recording the electric potentials on the scalp surface (electroencephalography, EEG), or by recording the magnetic fields near the surface of the head (magnetoencephalography, MEG). The analysis of the data is problematic due to the fact that multiple neural generators may be simultaneously active and the potentials and magnetic fields from these sources are superimposed on the detectors. It is highly desirable to un-mix the data into signals representing the behaviors of the original individual generators. This general problem is called blind source separation and several recent techniques utilizing maximum entropy, minimum mutual information, and maximum likelihood estimation have been applied. These techniques have had much success in separating signals such as natural sounds or speech, but appear to be ineffective when applied to EEG or MEG signals. Many of these techniques implicitly assume that the source distributions hav...
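
    For concreteness, one family of the techniques discussed (ICA-style separation, here via FastICA) is sketched below on toy mixtures; as the record argues, real EEG/MEG data violate the underlying assumptions, so this is illustrative only.

        # Hedged sketch: blind source separation of two toy sources with FastICA.
        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 1, 2000)
        sources = np.c_[np.sin(2 * np.pi * 7 * t),
                        np.sign(np.sin(2 * np.pi * 3 * t))]
        mixing = np.array([[1.0, 0.5], [0.4, 1.2]])   # unknown in practice
        observed = sources @ mixing.T                  # what the detectors record

        recovered = FastICA(n_components=2, random_state=0).fit_transform(observed)
        c = np.corrcoef(recovered.T, sources.T)[:2, 2:]
        print(np.round(np.abs(c), 2))   # near-permutation matrix: sources recovered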

  3. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

  4. Influence of a laser profile in impedance mismatch techniques applied to carbon EOS measurement

    Institute of Scientific and Technical Information of China (English)

    A. Aliverdiev; D. Batani; R. Dezulian

    2013-01-01

    We present a recent numerical analysis of the impedance mismatch technique applied to carbon equation-of-state measurements. We consider high-power laser pulses with a Gaussian temporal profile of different durations. We show that for the laser intensity (≈10^14 W/cm^2) and the target design considered in this paper, the laser pulses need a rise time of less than 150 ps.

  5. Adaptive Meshing Technique Applied to an Orthopaedic Finite Element Contact Problem

    OpenAIRE

    Roarty, Colleen M; Grosland, Nicole M.

    2004-01-01

    Finite element methods have been applied extensively and with much success in the analysis of orthopaedic implants [6,7,12,13,15]. Recently a growing interest has developed, in the orthopaedic biomechanics community, in how numerical models can be constructed for the optimal solution of problems in contact mechanics. New developments in this area are of paramount importance in the design of improved implants for orthopaedic surgery. Finite element and other computational techniques are widely ap...

  6. Azimuthally Varying Noise Reduction Techniques Applied to Supersonic Jets

    Science.gov (United States)

    Heeb, Nicholas S.

    An experimental investigation into the effect of azimuthal variation of chevrons and fluidically enhanced chevrons applied to supersonic jets is presented. Flow field measurements of streamwise and cross-stream particle imaging velocimetry were employed to determine the causes of noise reduction, which was demonstrated through acoustic measurements. Results were obtained in the over- and under-expanded regimes, and at the design condition, though emphasis was placed on the overexpanded regime due to its practical application. Surveys of chevron geometry, number, and arrangement were undertaken in an effort to reduce noise and/or the incurred performance penalties. Penetration was found to be positively correlated with noise reduction in the overexpanded regime, and negatively correlated in underexpanded operation, due to increased effective penetration and a high-frequency penalty, respectively. The effect of arrangement indicated that the beveled configuration achieved optimal abatement in the ideally and underexpanded regimes due to superior BSAN reduction. The symmetric configuration achieved optimal overexpanded noise reduction due to LSS suppression from improved vortex persistence. Increases in chevron number generally improved reduction of all noise components for lower penetration configurations. Higher penetration configurations reached saturation in the four-chevron range, with the potential to introduce secondary shock structures and generate additional noise with higher numbers. Alternation of penetration generated limited benefit, with slight reduction of the high-frequency penalty caused by increased shock spacing. The combination of alternating penetration with beveled and clustered configurations achieved noise reduction comparable to the standard counterparts. Analysis of the entire data set indicated initial improvements with projected area that saturated after a given level and either plateaued or degraded with additional increases. Optimal reductions

  7. Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  8. Applied surface analysis in magnetic storage technology

    Science.gov (United States)

    Windeln, Johannes; Bram, Christian; Eckes, Heinz-Ludwig; Hammel, Dirk; Huth, Johanna; Marien, Jan; Röhl, Holger; Schug, Christoph; Wahl, Michael; Wienss, Andreas

    2001-07-01

    This paper gives a synopsis of today's challenges and requirements for a surface analysis and materials science laboratory with a special focus on magnetic recording technology. The critical magnetic recording components, i.e. the protective carbon overcoat (COC), the disk layer structure, the read/write head including the giant-magnetoresistive (GMR) sensor, are described and options for their characterization with specific surface and structure analysis techniques are given. For COC investigations, applications of Raman spectroscopy to the structural analysis and determination of thickness, hydrogen and nitrogen content are discussed. Hardness measurements by atomic force microscopy (AFM) scratching techniques are presented. Surface adsorption phenomena on disk substrates or finished disks are characterized by contact angle analysis or so-called piezo-electric mass adsorption systems (PEMAS), also known as quartz crystal microbalance (QCM). A quickly growing field of applications is listed for various X-ray analysis techniques, such as disk magnetic layer texture analysis via X-ray diffraction, compositional characterization via X-ray fluorescence, and compositional analysis with high lateral resolution via electron microprobe analysis. X-ray reflectometry (XRR) has become a standard method for the absolute measurement of individual layer thicknesses contained in multi-layer stacks and thus is the successor of ellipsometry for this application. Due to the ongoing reduction of critical feature sizes, the analytical challenges in terms of lateral resolution, sensitivity limits and dedicated nano-preparation have been consistently growing and can only be met by state-of-the-art Auger electron spectrometers (AES), transmission electron microscopy (TEM) analysis, time-of-flight secondary ion mass spectroscopy (ToF-SIMS) characterization, focused ion beam (FIB) sectioning and TEM lamella preparation via FIB. The depth profiling of GMR sensor full stacks was significantly

  9. Diagnostic techniques applied in geostatistics for agricultural data analysis

    Directory of Open Access Journals (Sweden)

    Joelmir André Borssoi

    2009-12-01

    Full Text Available The structural modeling of spatial dependence, using a geostatistical approach, is an indispensable tool to determine the parameters that define this structure, which are applied in the interpolation of values at unsampled points by kriging techniques. However, the estimation of these parameters can be greatly affected by the presence of atypical observations in the sampled data. The purpose of this study was to use diagnostic techniques in Gaussian spatial linear models in geostatistics to evaluate the sensitivity of maximum likelihood and restricted maximum likelihood estimators to small perturbations in the data. For this purpose, studies with simulated and experimental data were conducted. Results with simulated data showed that the diagnostic techniques were efficient in identifying the perturbation in the data. The results with real data indicated that atypical values among the sampled data may have a strong influence on thematic maps, thus changing the spatial dependence structure. The application of diagnostic techniques should be part of any geostatistical analysis, to ensure a better quality of the information from thematic maps.

  10. Filter back-projection technique applied to Abel inversion

    Institute of Scientific and Technical Information of China (English)

    Jiang Shao-En; Liu Zhong-Li; et al.

    1997-01-01

    The inverse Abel transform is applicable to optically thin plasma with cylindrical symmetry, which is often encountered in plasma physics and inertial (or magnetic) confinement fusion. The filter back-projection technique is modified, and then a new method of inverse Abel transform is presented.
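
    The classical (unfiltered) inverse Abel transform that such methods start from can be sketched numerically; the Gaussian test profile and the crude handling of the square-root singularity below are illustrative assumptions, not the paper's modified back-projection.

        # Hedged sketch: forward and inverse Abel transform on a test profile.
        import numpy as np

        def trapz(y, x):                     # local helper, version-independent
            return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

        n = 400
        r = np.linspace(0.0, 1.0, n, endpoint=False)
        f_true = np.exp(-8.0 * r**2)         # assumed radial emissivity

        # Projection: I(y) = 2 * integral_y^R f(r) r / sqrt(r^2 - y^2) dr
        I = np.array([trapz(2 * f_true[i+1:] * r[i+1:]
                            / np.sqrt(r[i+1:]**2 - y**2), r[i+1:])
                      for i, y in enumerate(r)])

        # Inversion: f(r) = -(1/pi) * integral_r^R I'(y) / sqrt(y^2 - r^2) dy
        dI = np.gradient(I, r)
        f_rec = np.array([-trapz(dI[i+1:] / np.sqrt(r[i+1:]**2 - y**2),
                                 r[i+1:]) / np.pi for i, y in enumerate(r)])
        print(abs(f_rec[:n // 2] - f_true[:n // 2]).max())  # rough agreement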

  11. Magnetic Solid Phase Extraction Applied to Food Analysis

    Directory of Open Access Journals (Sweden)

    Israel S. Ibarra

    2015-01-01

    Full Text Available Magnetic solid phase extraction has been used as a pretreatment technique for the analysis of several compounds because of its advantages compared with classic methods. This methodology is based on the use of magnetic solids as adsorbents for the preconcentration of different analytes from complex matrices. Magnetic solid phase extraction minimizes the use of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique as applied in food analysis.

  12. Recent Developments in Computational Techniques for Applied Hydrodynamics.

    Science.gov (United States)

    1979-12-07

    numerical solutions of the KdV equation. More significantly, solitons are seen in nature and in solutions of "exact" hyperbolic systems. [Tappert, F., "Numerical Solutions of the KdV Equation and Its Generalizations by the Split-Step Fourier Method," Lect. Appl. Math. 15, AMS (1975).] Recently developed techniques for numerical solution of fluid equations are reviewed.
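
    The split-step idea cited above alternates an exact linear dispersion step in Fourier space with a nonlinear step in physical space; below is a hedged sketch for the KdV equation u_t + 6uu_x + u_xxx = 0, with step sizes and a one-soliton initial condition chosen for illustration (no de-aliasing, so this is not production quality).

        # Hedged sketch: split-step Fourier integration of the KdV equation.
        import numpy as np

        N, L, dt = 256, 50.0, 1e-4
        x = np.linspace(-L / 2, L / 2, N, endpoint=False)
        k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
        c = 4.0
        u = 0.5 * c / np.cosh(0.5 * np.sqrt(c) * x) ** 2   # soliton, speed c

        for _ in range(20000):                             # advance to t = 2
            u = np.real(np.fft.ifft(np.fft.fft(u) * np.exp(1j * k**3 * dt)))
            ux = np.real(np.fft.ifft(1j * k * np.fft.fft(u)))
            u = u - 6.0 * u * ux * dt                      # explicit nonlinear step
        print(x[np.argmax(u)])   # peak near x = c*t = 8: the soliton kept its shape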

  13. Technology Assessment of Dust Suppression Techniques Applied During Structural Demolition

    Energy Technology Data Exchange (ETDEWEB)

    Boudreaux, J.F.; Ebadian, M.A.; Williams, P.T.; Dua, S.K.

    1998-10-20

    Hanford, Fernald, Savannah River, and other sites are currently reviewing technologies that can be implemented to demolish buildings in a cost-effective manner. In order to demolish a structure properly and, at the same time, minimize the amount of dust generated from a given technology, an evaluation must be conducted to choose the most appropriate dust suppression technology given site-specific conditions. Thus, the purpose of this research, which was carried out at the Hemispheric Center for Environmental Technology (HCET) at Florida International University, was to conduct an experimental study of dust aerosol abatement (dust suppression) methods as applied to nuclear D and D. This experimental study targeted the problem of dust suppression during the demolition of nuclear facilities. The resulting data were employed to assist in the development of mathematical correlations that can be applied to predict dust generation during structural demolition.

  14. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    Science.gov (United States)

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis of the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis of the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.

  15. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that, in essence, come down to the method by which features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  16. Unconventional Coding Technique Applied to Multi-Level Polarization Modulation

    Science.gov (United States)

    Rutigliano, G. G.; Betti, S.; Perrone, P.

    2016-05-01

    A new technique is proposed to improve information confidentiality in optical-fiber communications without bandwidth consumption. A pseudorandom vectorial sequence was generated by a dynamic system algorithm and used to codify a multi-level polarization modulation based on the Stokes vector. Optical-fiber birefringence, usually considered as a disturbance, was exploited to obfuscate the signal transmission. At the receiver end, the same pseudorandom sequence was generated and used to decode the multi-level polarization modulated signal. The proposed scheme, working at the physical layer, provides strong information security without introducing complex processing and thus latency.

  17. Applied Data Analysis in Energy Monitoring System

    Directory of Open Access Journals (Sweden)

    Kychkin A.V.

    2016-08-01

    Full Text Available The organization of a software and hardware system is presented, using as an example the energy monitoring of multi-sectional lighting and climate control / conditioning equipment. The system's key feature is applied analysis of office energy data, which allows the localized work mode of each type of hardware to be recognized. It is based on the general energy consumption profile, followed by evaluation of energy consumption and workload. The applied data analysis includes a primary data processing block, a smoothing filter, a time stamp identification block, clustering and classification blocks, a state change detection block, and a statistical data calculation block. The energy consumed in a time slot and the slot time stamp are taken as the main parameters for work mode classification. Experimental results of the applied analysis of energy data, using HIL and the OpenJEVis visualization system over a chosen time period, are provided. Energy consumption, workload calculation and identification of eight different states have been carried out for two lighting sections and one climate control / conditioning emulating system from the integral energy consumption profile. The research was supported by university internal grant №2016/PI-2 «Methodology development of monitoring and heat flow utilization as low potential company energy sources».

  18. Compressed Sensing Techniques Applied to Ultrasonic Imaging of Cargo Containers.

    Science.gov (United States)

    Álvarez López, Yuri; Martínez Lorenzo, José Ángel

    2017-01-15

    One of the key issues in the fight against the smuggling of goods has been the development of scanners for cargo inspection. X-ray-based radiographic scanners are the most developed sensing modality. However, they are costly and use bulky sources that emit hazardous, ionizing radiation. Aiming to improve the probability of threat detection, an ultrasonic-based technique has been proposed that is capable of detecting the footprint of metallic containers or compartments concealed within the metallic structure of the inspected cargo. The system consists of an array of acoustic transceivers attached to the metallic structure under inspection, creating a guided acoustic Lamb wave. Reflections due to discontinuities are detected in the images provided by an imaging algorithm. Taking into consideration that the majority of those images are sparse, this contribution analyzes the application of Compressed Sensing (CS) techniques in order to reduce the number of measurements needed, thus achieving faster scanning without compromising the detection capabilities of the system. A parametric study of the image quality, as a function of the samples needed in the spatial and frequency domains, is presented, as well as the dependence on the sampling pattern. For this purpose, realistic cargo inspection scenarios have been simulated.
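
    The record does not state which reconstruction algorithm is used; the sketch below illustrates the general CS idea with ISTA (iterative soft-thresholding), recovering a sparse vector from far fewer random measurements than unknowns. All sizes and the regularization weight are arbitrary choices, not the paper's parameters.

```python
# Minimal compressed-sensing sketch (not the paper's algorithm): recover a
# sparse vector x from m << n random linear measurements y = A @ x using
# ISTA (iterative soft-thresholding).
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 400, 100, 8               # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz const of gradient
x = np.zeros(n)
for _ in range(500):
    grad = A.T @ (A @ x - y)             # gradient of the data-fit term
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)  # soft threshold

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```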

  1. Filling-Based Techniques Applied to Object Projection Feature Estimation

    CERN Document Server

    Quesada, Luis

    2012-01-01

    3D motion tracking is a critical task in many computer vision applications. Unsupervised markerless 3D motion tracking systems determine the most relevant object on the screen and then track it by continuously estimating its projection features (center and area) from the edge image and a point inside the relevant object projection (namely, the inner point), until the tracking fails. Existing object projection feature estimation techniques are based on ray-casting from the inner point. These techniques present three main drawbacks: when the inner point is surrounded by edges, rays may not reach other relevant areas; as a consequence of that issue, the estimated features may vary greatly depending on the position of the inner point relative to the object projection; and finally, increasing the number of rays being cast and the ray-casting iterations (which would make the results more accurate and stable) increases the processing time to the point that the tracking cannot be performed on the fly. In this paper, we anal...

  2. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
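
    As one hedged reading of "reducing performance analysis to well-known mathematical techniques", the sketch below fits a least-squares scaling model of runtime against configuration features; the features, timings, and model form are invented for illustration and are not the LANL formulation.

```python
# Fit a simple predictive model of runtime from configuration features with
# ordinary least squares. Feature matrix and timings are synthetic.
import numpy as np

# Columns: OpenMP threads, MPI ranks, optimization level (0-3)
X = np.array([[1, 4, 0], [2, 4, 2], [4, 8, 2], [8, 8, 3], [16, 16, 3]], float)
runtime = np.array([120.0, 70.0, 41.0, 25.0, 16.0])   # seconds (made up)

# Fit log(runtime) ~ log-features + intercept, a common scaling model.
F = np.column_stack([np.log(X[:, 0]), np.log(X[:, 1]), X[:, 2], np.ones(len(X))])
coef, *_ = np.linalg.lstsq(F, np.log(runtime), rcond=None)

new = np.array([np.log(32), np.log(16), 3, 1.0])      # hypothetical config
print("predicted runtime:", np.exp(new @ coef), "s")
```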

  3. Surgical treatment of scoliosis: a review of techniques currently applied

    Directory of Open Access Journals (Sweden)

    Maruyama Toru

    2008-04-01

    Full Text Available Abstract In this review, basic knowledge and recent innovations in the surgical treatment of scoliosis are described. Surgical treatment for scoliosis is indicated, in general, for curves exceeding 45 or 50 degrees by the Cobb method, on the grounds that: (1) curves larger than 50 degrees progress even after skeletal maturity; (2) curves of greater magnitude cause loss of pulmonary function, and much larger curves cause respiratory failure; and (3) the larger the curve progresses, the more difficult it is to treat with surgery. Posterior fusion with instrumentation has been the standard surgical treatment for scoliosis. In modern instrumentation systems, more anchors are used to connect the rod and the spine, resulting in better correction and less frequent implant failures. Segmental pedicle screw constructs, or hybrid constructs using pedicle screws, hooks, and wires, are the trend of today. Anterior instrumentation surgery had been a treatment of choice for thoracolumbar and lumbar scoliosis because better correction can be obtained with shorter fusion levels. Recently, the superiority of anterior surgery for thoracolumbar and lumbar scoliosis has been lost. Initial enthusiasm for anterior instrumentation of the thoracic curve using the video-assisted thoracoscopic surgery technique has faded. Various attempts are being made with fusionless surgery. To control growth, epiphysiodesis on the convex side of the deformity, with or without instrumentation, is a technique to provide gradual progressive correction and to arrest the deterioration of the curves. To avoid fusion in skeletally immature children with spinal cord injury or myelodysplasia, vertebral wedge osteotomies are performed for the treatment of progressive paralytic scoliosis. For right thoracic curves with idiopathic scoliosis, multiple vertebral wedge osteotomies without fusion are performed. To provide correction and maintain it during the growing years while allowing spinal growth for

  4. Active Learning Techniques Applied to an Interdisciplinary Mineral Resources Course.

    Science.gov (United States)

    Aird, H. M.

    2015-12-01

    An interdisciplinary active learning course entitled 'Mineral Resources and the Environment' was introduced at the University of Puget Sound. Various formative assessment and active learning techniques that have been effective in other courses were adapted and implemented to improve student learning, increase retention and broaden knowledge and understanding of the course material. This was an elective course targeted towards upper-level undergraduate geology and environmental majors. The course provided an introduction to the mineral resources industry, discussing geological, environmental, societal and economic aspects, legislation and the processes involved in exploration, extraction, processing, reclamation/remediation and recycling of products. Lectures and associated weekly labs were linked in subject matter; relevant readings from the recent scientific literature were assigned and discussed in the second lecture of the week. Peer-based learning was facilitated through weekly reading assignments with peer-led discussions and through group research projects, in addition to in-class exercises such as debates. Writing and research skills were developed through student groups designing, carrying out and reporting on their own semester-long research projects on the lasting effects of the historical Ruston Smelter on the biology and water systems of Tacoma. The writing of their mini grant proposals and final project reports was carried out in stages to allow for feedback before the deadline. Speakers from industry were invited to share their specialist knowledge as guest lecturers, and students were encouraged to interact with them, with a view to employment opportunities. Formative assessment techniques included jigsaw exercises, gallery walks, placemat surveys, think-pair-share and take-home point summaries. Summative assessment included discussion leadership, exams, homework, group projects, in-class exercises, field trips, and pre-discussion reading exercises

  5. Object detection techniques applied on mobile robot semantic navigation.

    Science.gov (United States)

    Astua, Carlos; Barber, Ramon; Crespo, Jonathan; Jardon, Alberto

    2014-04-11

    The future of robotics predicts that robots will integrate themselves more every day with human beings and their environments. To achieve this integration, robots need to acquire information about the environment and its objects. There is a great need for algorithms that provide robots with this sort of skill, from locating the objects needed to accomplish a task up to treating those objects as information about the environment. This paper presents a way to provide mobile robots with the ability to detect objects for semantic navigation, using current trends in robotics and techniques that can be exported to other platforms. Two methods to detect objects are proposed, contour detection and a descriptor-based technique, and both are combined to overcome their respective limitations. Finally, the code is tested on a real robot to prove its accuracy and efficiency.
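
    A minimal sketch of the two detection routes named in the abstract (contour detection and a descriptor-based technique), written with OpenCV on a synthetic image; running the two routes side by side, rather than the authors' combination logic, is our simplification.

```python
# Two object-detection routes: contours from an edge map, and ORB descriptor
# matching against a stored template. Test image is synthetic. OpenCV 4.x.
import cv2
import numpy as np

img = np.zeros((240, 320), np.uint8)
cv2.rectangle(img, (80, 60), (200, 160), 255, -1)      # stand-in "object"

# Route 1: contour detection on an edge map.
edges = cv2.Canny(img, 50, 150)
contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 100]

# Route 2: descriptor-based matching against a template crop.
template = img[50:170, 70:210]
orb = cv2.ORB_create()
kp1, des1 = orb.detectAndCompute(template, None)
kp2, des2 = orb.detectAndCompute(img, None)
matches = []
if des1 is not None and des2 is not None:              # ORB may find no keypoints
    bf = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = bf.match(des1, des2)

print("contour boxes:", boxes, "| descriptor matches:", len(matches))
```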

  7. Signal Processing Techniques Applied in RFI Mitigation of Radio Astronomy

    Directory of Open Access Journals (Sweden)

    Sixiu Wang

    2012-08-01

    Full Text Available Radio broadcast and telecommunications are present at different power levels everywhere on Earth. Radio Frequency Interference (RFI) substantially limits the sensitivity of existing radio telescopes in several frequency bands and may prove to be an even greater obstacle for the next generation of telescopes (or arrays) to overcome. A variety of RFI detection and mitigation techniques have been developed in recent years. This study describes various signal processing methods for RFI mitigation in radio astronomy, chooses time-frequency domain cancellation to eliminate certain interference and effectively improve the signal-to-noise ratio in pulsar observations, and finally presents RFI mitigation research and implementations in Chinese radio astronomy.
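
    A minimal sketch of time-frequency domain excision, the method the study selects: compute a spectrogram and zero cells whose power exceeds a robust threshold. The signal, the injected RFI tone, and the threshold factor are illustrative assumptions.

```python
# Flag and zero RFI-contaminated time-frequency cells in a spectrogram.
# The broadband "astronomy" noise and the RFI tone are synthetic.
import numpy as np
from scipy.signal import spectrogram

fs = 4096.0
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(1)
x = rng.standard_normal(t.size)                 # broadband noise background
x[t.size // 2:] += 5 * np.sin(2 * np.pi * 700 * t[t.size // 2:])  # RFI tone

f, tt, S = spectrogram(x, fs=fs, nperseg=256)
med = np.median(S)
mad = np.median(np.abs(S - med))                # robust spread estimate
mask = S > med + 6 * mad                        # outlier threshold
S_clean = np.where(mask, 0.0, S)                # excise flagged cells

print(f"flagged {mask.mean() * 100:.2f}% of time-frequency cells")
```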

  8. Applying Business Process Modeling Techniques: Case Study

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2010-12-01

    Full Text Available Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations were implemented in practice in recent decades. The most significant of these notations include ARIS, Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of research is discussed. The following section presents selected case study results. The paper is concluded with a summary.

  9. Quantitative Portfolio Optimization Techniques Applied to the Brazilian Stock Market

    Directory of Open Access Journals (Sweden)

    André Alves Portela Santos

    2012-09-01

    Full Text Available In this paper we assess the out-of-sample performance of two alternative quantitative portfolio optimization techniques (mean-variance and minimum variance optimization) and compare their performance with respect to a naive 1/N (or equally-weighted) portfolio and also to the market portfolio given by the Ibovespa. We focus on short selling-constrained portfolios and consider alternative estimators for the covariance matrices: the sample covariance matrix, RiskMetrics, and three covariance estimators proposed by Ledoit and Wolf (2003), Ledoit and Wolf (2004a) and Ledoit and Wolf (2004b). Taking into account alternative portfolio re-balancing frequencies, we compute out-of-sample performance statistics which indicate that the quantitative approaches delivered improved results in terms of lower portfolio volatility and better risk-adjusted returns. Moreover, the use of more sophisticated estimators for the covariance matrix generated optimal portfolios with lower turnover over time.
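
    For the minimum-variance case, the optimal fully-invested weights have the closed form w = Σ⁻¹1 / (1ᵀΣ⁻¹1). The sketch below computes them with a Ledoit-Wolf shrinkage covariance from scikit-learn on simulated returns; unlike the paper, it does not impose the short-selling constraint.

```python
# Minimum-variance portfolio weights with a Ledoit-Wolf shrinkage covariance.
# Returns are simulated, not Brazilian market data.
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.default_rng(2)
returns = rng.multivariate_normal(
    mean=[0.001] * 5,
    cov=0.0004 * (np.eye(5) + 0.3 * (np.ones((5, 5)) - np.eye(5))),
    size=500,
)
sigma = LedoitWolf().fit(returns).covariance_   # shrunk covariance estimate

ones = np.ones(sigma.shape[0])
inv = np.linalg.solve(sigma, ones)
w = inv / (ones @ inv)          # argmin w'Σw subject to sum(w) = 1
print("min-variance weights:", np.round(w, 3))
```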

  10. Optical Trapping Techniques Applied to the Study of Cell Membranes

    Science.gov (United States)

    Morss, Andrew J.

    Optical tweezers allow for manipulating micron-sized objects using pN level optical forces. In this work, we use an optical trapping setup to aid in three separate experiments, all related to the physics of the cellular membrane. In the first experiment, in conjunction with Brian Henslee, we use optical tweezers to allow for precise positioning and control of cells in suspension to evaluate the cell size dependence of electroporation. Theory predicts that all cells porate at a transmembrane potential V_TM of roughly 1 V. The Schwann equation predicts that the transmembrane potential depends linearly on the cell radius r, thus predicting that cells should porate at threshold electric fields that go as 1/r. The threshold field required to induce poration is determined by applying a low voltage pulse to the cell and then applying additional pulses of greater and greater magnitude, checking for poration at each step using propidium iodide dye. We find that, contrary to expectations, cells do not porate at a constant value of the transmembrane potential but at a constant value of the electric field, which we find to be 692 V/cm for K562 cells. Delivering precise dosages of nanoparticles into cells is of importance for assessing toxicity of nanoparticles or for genetic research. In the second experiment, we conduct nano-electroporation, a novel method of applying precise doses of transfection agents to cells, by using optical tweezers in conjunction with a confocal microscope to manipulate cells into contact with 100 nm wide nanochannels. This work was done in collaboration with Pouyan Boukany of Dr. Lee's group. The small cross-sectional area of these nanochannels means that the electric field within them is extremely large, 60 MV/m, which allows them to electrophoretically drive transfection agents into the cell. We find that nano-electroporation results in excellent dose control (to within 10% in our experiments) compared to bulk electroporation. We also find that

  11. Dust tracking techniques applied to the STARDUST facility: First results

    Energy Technology Data Exchange (ETDEWEB)

    Malizia, A., E-mail: malizia@ing.uniroma2.it [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Camplani, M. [Grupo de Tratamiento de Imágenes, E.T.S.I de Telecomunicación, Universidad Politécnica de Madrid (Spain); Gelfusa, M. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Lupelli, I. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); EURATOM/CCFE Association, Culham Science Centre, Abingdon (United Kingdom); Richetta, M.; Antonelli, L.; Conetta, F.; Scarpellini, D.; Carestia, M.; Peluso, E.; Bellecci, C. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy); Salgado, L. [Grupo de Tratamiento de Imágenes, E.T.S.I de Telecomunicación, Universidad Politécnica de Madrid (Spain); Video Processing and Understanding Laboratory, Universidad Autónoma de Madrid (Spain); Gaudio, P. [Associazione EURATOM-ENEA, Department of Industrial Engineering, University of Rome “Tor Vergata”, Via del Politecnico 1, 00133 Rome (Italy)

    2014-10-15

    Highlights: •Use of an experimental facility, STARDUST, to analyze the dust resuspension problem inside the tokamak in case of a loss of vacuum accident. •PIV technique implementation to track the dust during a LOVA reproduction inside STARDUST. •Data imaging techniques to analyze the dust velocity field: first results and data discussion. -- Abstract: An important issue related to future nuclear fusion reactors fueled with deuterium and tritium is the creation of large amounts of dust due to several mechanisms (disruptions, ELMs and VDEs). The dust size expected in nuclear fusion experiments (such as ITER) is on the order of microns (between 0.1 and 1000 μm). Almost the total amount of this dust remains in the vacuum vessel (VV). This radiological dust can re-suspend in case of LOVA (loss of vacuum accident), and these phenomena can cause explosions and serious damage to the health of the operators and to the integrity of the device. The authors have developed a facility, STARDUST, in order to reproduce thermofluid-dynamic conditions comparable to those expected inside the VV of the next generation of experiments such as ITER in case of LOVA. The dust used inside the STARDUST facility presents particle sizes and physical characteristics comparable with those created inside the VV of nuclear fusion experiments. In this facility an experimental campaign has been conducted with the purpose of tracking the dust re-suspended at low pressurization rates (comparable to those expected in case of LOVA in ITER and suggested by the General Safety and Security Report ITER-GSSR) using a fast camera with a frame rate from 1000 to 10,000 images per second. The velocity fields of the mobilized dust are derived from the imaging of a two-dimensional slice of the flow illuminated by an optically adapted laser beam. The aim of this work is to demonstrate the possibility of dust tracking by means of image processing, with the objective of determining the velocity field values
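
    A minimal sketch of the PIV principle used for the dust tracking: the displacement of one interrogation window between two frames is taken as the peak of their FFT-based cross-correlation. The frames are synthetic, and the single-window treatment is a simplification of a full PIV pipeline, which repeats this over a grid of windows.

```python
# Estimate a displacement vector for one PIV interrogation window via FFT
# cross-correlation of two frames. Frames are synthetic.
import numpy as np

rng = np.random.default_rng(3)
frame1 = rng.random((64, 64))
dy, dx = 3, 5                                   # true particle displacement
frame2 = np.roll(np.roll(frame1, dy, axis=0), dx, axis=1)

f1 = np.fft.fft2(frame1 - frame1.mean())
f2 = np.fft.fft2(frame2 - frame2.mean())
corr = np.fft.ifft2(f1.conj() * f2).real        # circular cross-correlation

peak = np.unravel_index(np.argmax(corr), corr.shape)
# Map peak indices to signed shifts (wrap-around for negative displacements).
shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
print("estimated (dy, dx):", tuple(shift))      # expect (3, 5)
```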

  12. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  13. Time-resolved infrared spectroscopic techniques as applied to Channelrhodopsin

    Directory of Open Access Journals (Sweden)

    Eglof Ritter

    2015-07-01

    Full Text Available Among optogenetic tools, channelrhodopsins, the light-gated ion channels of the plasma membrane from green algae, play the most important role. Properties like channel selectivity, timing parameters or color can be influenced by the exchange of selected amino acids. Although widely used, in the field of neuroscience for example, little is still known about their photocycles and the mechanism of ion channel gating and conductance. One of the preferred methods for these studies is infrared spectroscopy, since it allows observation of proteins and their function at a molecular level and in a near-native environment. The absorption of a photon in channelrhodopsin leads to retinal isomerization within femtoseconds, the conductive states are reached on the microsecond time scale, and the return to the fully dark-adapted state may take more than minutes. To be able to cover all these time regimes, a range of different spectroscopic approaches is necessary. This mini-review focuses on time-resolved applications of the infrared technique to study channelrhodopsins and other light-triggered proteins. We will discuss the approaches with respect to their suitability for the investigation of channelrhodopsin and related proteins.

  14. Acoustic Emission Technique Applied in Textiles Mechanical Characterization

    Directory of Open Access Journals (Sweden)

    Rios-Soberanis Carlos Rolando

    2017-01-01

    Full Text Available The common textile architectures/geometries are woven, braided, knitted, stitch-bonded, and Z-pinned. Fibres in textile form exhibit good out-of-plane properties and good fatigue and impact resistance; additionally, they have better dimensional stability and conformability. Besides the nature of the textile, the architecture plays a great role in the mechanical behaviour and mechanisms of damage in textiles; therefore, damage mechanisms and the mechanical performance of textiles in structural applications have been a major concern. Mechanical damage occurs to a large extent during the service lifetime; consequently, it is vital to understand the material's mechanical behaviour by identifying its mechanisms of failure, such as onset of damage, crack generation and propagation. In this work, textiles of different architectures were used to manufacture epoxy-based composites in order to study failure events under tensile load using the acoustic emission technique, which is a powerful characterization tool due to the link between AE data and fracture mechanics, a relation that makes it very useful from the engineering point of view.

  15. Cleaning techniques for applied-B ion diodes

    Energy Technology Data Exchange (ETDEWEB)

    Cuneo, M.E.; Menge, P.R.; Hanson, D.L. [and others]

    1995-09-01

    Measurements and theoretical considerations indicate that the lithium-fluoride (LiF) lithium ion source operates by electron-assisted field-desorption and provides a pure lithium beam for 10--20 ns. Evidence from both the SABRE (1 TW) and PBFA-II (20 TW) accelerators indicates that the lithium beam is replaced by a beam of protons and carbon resulting from electron thermal desorption of hydrocarbon surface and bulk contamination, with subsequent avalanche ionization. The appearance of contaminant ions in the beam is accompanied by rapid impedance collapse, possibly resulting from loss of magnetic insulation in the rapidly expanding and ionizing neutral layer. Electrode surface and source substrate cleaning techniques are being developed on the SABRE accelerator to reduce beam contamination, plasma formation, and impedance collapse. We have increased the lithium current density by a factor of 3 and the lithium energy by a factor of 5 through a combination of in-situ surface and substrate coatings, impermeable substrate coatings, and field profile modifications.

  16. Sputtering as a Technique for Applying Tribological Coatings

    Science.gov (United States)

    Ramalingam, S.

    1984-01-01

    Friction and wear-induced mechanical failures may be controlled, to extend the life of tribological components, through the interposition of selected solid materials between contacting surfaces. Thin solid films of soft and hard materials are appropriate for lowering friction and enhancing the wear resistance of precision tribo-elements. The tribological characteristics of thin hard coats deposited on a variety of ferrous and non-ferrous substrates were tested. The thin hard coats used were titanium nitride films deposited by reactive magnetron sputtering of metallic titanium. High contact stress, low speed tests showed wear rate reductions of one or more orders of magnitude, even with films a few micrometers in thickness. Low contact stress, high speed tests carried out under rather severe test conditions showed that thin films of TiN afforded significant friction reduction and wear protection. Thin hard coats were shown to improve the friction and wear performance of rolling contacts. Satisfactory film-to-substrate adhesion strengths can be obtained with reactive magnetron sputtering. X-ray diffraction and microhardness tests were employed to assess the effectiveness of the sputtering technique.

  17. Applying traditional signal processing techniques to social media exploitation for situational understanding

    Science.gov (United States)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
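
    A minimal sketch of applying one traditional signal processing step, a moving-average filter followed by threshold detection, to a social-sensor time series; the per-minute post counts and the injected burst are simulated, and this is not the Apollo toolkit.

```python
# Treat a social feed as a noisy sensor: smooth the per-minute post count
# and flag bursts that exceed a robust baseline threshold.
import numpy as np

rng = np.random.default_rng(4)
counts = rng.poisson(5, 240).astype(float)      # posts per minute, 4 hours
counts[120:135] += 40                           # injected "event" burst

kernel = np.ones(9) / 9                         # 9-minute moving average
smooth = np.convolve(counts, kernel, mode="same")

baseline = np.median(smooth)
mad = np.median(np.abs(smooth - baseline))      # robust spread estimate
event_minutes = np.where(smooth > baseline + 8 * mad)[0]
if event_minutes.size:
    print("event detected around minutes:",
          event_minutes.min(), "-", event_minutes.max())
```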

  18. Applying data mining techniques to improve diagnosis in neonatal jaundice

    Directory of Open Access Journals (Sweden)

    Ferreira Duarte

    2012-12-01

    Full Text Available Abstract Background Hyperbilirubinemia is emerging as an increasingly common problem in newborns due to decreasing hospital length of stay after birth. Jaundice is the most common disease of the newborn and, although benign in most cases, it can lead to severe neurological consequences if poorly evaluated. In different areas of medicine, data mining has contributed to improving the results obtained with other methodologies. Hence, the aim of this study was to improve the diagnosis of neonatal jaundice with the application of data mining techniques. Methods This study followed the phases of the Cross Industry Standard Process for Data Mining model as its methodology. This observational study was performed at the Obstetrics Department of a central hospital (Centro Hospitalar Tâmega e Sousa – EPE), from February to March of 2011. A total of 227 healthy newborn infants with 35 or more weeks of gestation were enrolled in the study. Over 70 variables were collected and analyzed. Also, transcutaneous bilirubin levels were measured from birth to hospital discharge, with maximum time intervals of 8 hours between measurements, using a noninvasive bilirubinometer. Different attribute subsets were used to train and test classification models using algorithms included in the Weka data mining software, such as decision trees (J48) and neural networks (multilayer perceptron). The accuracy results were compared with the traditional methods for prediction of hyperbilirubinemia. Results The application of different classification algorithms to the collected data allowed predicting subsequent hyperbilirubinemia with high accuracy. In particular, at 24 hours of life of the newborns, the accuracy for the prediction of hyperbilirubinemia was 89%. The best results were obtained using the following algorithms: naive Bayes, multilayer perceptron and simple logistic. Conclusions The findings of our study sustain that new approaches, such as data mining, may support
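
    The study used Weka's J48 and multilayer perceptron; the sketch below reproduces the modelling step with the closest scikit-learn stand-ins on simulated newborn data. The variable choices and the label rule are invented for illustration, not the study's 70-plus variables.

```python
# Train decision-tree and MLP classifiers (stand-ins for Weka J48 and the
# multilayer perceptron) on simulated newborn data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(5)
# Columns: gestational age (weeks), bilirubin at 24 h (mg/dL), weight (kg)
X = np.column_stack([rng.normal(38, 1.5, 300),
                     rng.normal(6, 2, 300),
                     rng.normal(3.2, 0.4, 300)])
y = (X[:, 1] > 7).astype(int)                   # toy hyperbilirubinemia label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (DecisionTreeClassifier(max_depth=4),
              MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)):
    model.fit(X_tr, y_tr)
    print(type(model).__name__, "accuracy:", model.score(X_te, y_te))
```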

  19. Feasibility of Applying Controllable Lubrication Techniques to Reciprocating Machines

    DEFF Research Database (Denmark)

    Pulido, Edgar Estupinan

    modified hydrostatic lubrication. In this case, the hydrostatic lubrication is modified by injecting oil at controllable pressures, through orifices circumferentially located around the bearing surface. In order to study the performance of journal bearings of reciprocating machines operating under … conventional lubrication conditions, a mathematical model of a reciprocating mechanism connected to a rigid / flexible rotor via thin fluid films was developed. The mathematical model involves the use of multibody dynamics theory for the modelling of the reciprocating mechanism (rigid bodies), finite elements … of the reciprocating engine, obtained with the help of multibody dynamics (rigid components) and the finite element method (flexible components), and the global system of equations is numerically solved. The analysis of the results was carried out with focus on the behaviour of the journal orbits, maximum fluid film…

  20. Design of an operational transconductance amplifier applying multiobjective optimization techniques

    Directory of Open Access Journals (Sweden)

    Roberto Pereira-Arroyo

    2014-02-01

    Full Text Available In this paper, the problem at hand consists in the sizing of an Operational Transconductance Amplifier (OTA). The Pareto front is introduced as a useful analysis concept in order to explore the design space of such an analog circuit. A genetic algorithm (GA) is employed to automatically detect this front in a process that efficiently finds optimal parameterizations and their corresponding values in an aggregate fitness space. Since the problem is treated as a multi-objective optimization task, different measures of the amplifier, like the transconductance, the slew rate, the linear range and the input capacitance, are used as fitness functions. Finally, simulation results are presented, using a standard 0.5 μm CMOS technology.
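
    A minimal sketch of the multi-objective idea: score candidate sizings on two competing objectives and keep the non-dominated set (the Pareto front). The objective functions are toy stand-ins, not transistor-level OTA measures, and a full GA would evolve the population rather than sample it once.

```python
# Extract the Pareto front from a randomly sampled population of "sizings"
# scored on two competing objectives (both minimized).
import numpy as np

rng = np.random.default_rng(6)
pop = rng.uniform(0.1, 10.0, size=(200, 2))     # e.g. (bias current, W/L)

def objectives(p):
    gm = p[:, 0] * p[:, 1]                      # "transconductance" (maximize)
    power = p[:, 0] ** 2                        # "power" (minimize)
    return np.column_stack([-gm, power])        # negate gm so both minimize

F = objectives(pop)
front = []
for i in range(len(F)):
    # i is dominated if some candidate is no worse everywhere, better somewhere
    dominated = np.any(np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1))
    if not dominated:
        front.append(i)
print(f"Pareto front: {len(front)} of {len(pop)} candidate sizings")
```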

  1. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  2. Thermal analysis applied to irradiated propolis

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; del Mastro, N.L. (e-mail: nelida@usp.br)

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behavior. Thermal analysis is a chemical analysis that gives information about changes on heating, of great importance for technological applications. Ground propolis samples were 60Co gamma irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600 deg. C. Similarly, through differential scanning calorimetry, a coincidence of the melting points of irradiated and unirradiated samples was found. The results suggest that the irradiation process does not interfere with the thermal properties of propolis when irradiated up to 10 kGy.

  3. Multivariate analysis applied to tomato hybrid production.

    Science.gov (United States)

    Balasch, S; Nuez, F; Palomares, G; Cuartero, J

    1984-11-01

    Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic-house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, lead to conclusions similar to those of the principal components method, which only requires taking data by plots. The characters that make up the principal components in both environments studied are the same, although the relative importance of each of them varies within the principal components. By combining the information supplied by multivariate analysis with the inheritance mode of the characters, crossings among cultivars can be planned that will produce heterotic hybrids showing characters within previously established limits.
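
    A minimal sketch of the principal-components step on simulated plot-level data with scikit-learn; the number of varieties and characters follows the abstract, but the values and the correlation structure are invented.

```python
# Project plot-level measurements onto the first two principal components
# and inspect loadings. The character matrix is simulated.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(7)
# 60 varieties x 20 measured characters (fruit weight, height, etc.)
X = rng.normal(size=(60, 20))
X[:, 1] = 0.8 * X[:, 0] + 0.2 * X[:, 1]         # two correlated characters

pca = PCA(n_components=2)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("explained variance ratio:", pca.explained_variance_ratio_)
print("loadings of PC1 on first 5 characters:", pca.components_[0, :5].round(2))
```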

  4. Exergy analysis applied to biodiesel production

    Energy Technology Data Exchange (ETDEWEB)

    Talens, Laura; Villalba, Gara [SosteniPra UAB-IRTA. Environmental Science and Technology Institute (ICTA), Edifici Cn, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra, Cerdanyola del Valles, Barcelona (Spain); Gabarrell, Xavier [SosteniPra UAB-IRTA. Environmental Science and Technology Institute ICTA, Edifici Cn, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra (Cerdanyola del Valles), Barcelona (Spain); Department of Chemical Engineering, Universitat Autonoma de Barcelona (UAB), 08193 Bellaterra Cerdanyola del Valles, Barcelona (Spain)

    2007-08-15

    Our aim to decrease the consumption of materials and energy and to promote the use of renewable resources, such as biofuels, raises the need to measure material and energy fluxes. This paper suggests the use of Exergy Flow Analysis (ExFA) as an environmental assessment tool to account for wastes and emissions, determine the exergetic efficiency, and compare substitutes and other types of energy sources: all useful in defining environmental and economic policies for resource use. In order to illustrate how ExFA is used, it is applied to the process of biodiesel production. The results show that the production process has a low exergy loss (492 MJ). The exergy loss is reduced by using potassium hydroxide and sulphuric acid as process catalysts, and it can be further minimised by improving the quality of the used cooking oil. (author)

  5. Prefractionation techniques in proteome analysis.

    Science.gov (United States)

    Righetti, Pier Giorgio; Castagna, Annalisa; Herbert, Ben; Reymond, Frederic; Rossier, Joël S

    2003-08-01

    The present review deals with a number of prefractionation protocols in preparation for two-dimensional map analysis, both in the field of chromatography and in the field of electrophoresis. In the first case, Fountoulakis' group has reported just about every chromatographic procedure useful as a prefractionation step, including affinity, ion-exchange, and reversed-phase resins. As a result of the various enrichment steps, several hundred new species, previously undetected in unfractionated samples, could be revealed for the first time. Electrophoretic prefractionation protocols include all those electrokinetic methodologies which are performed in free solution, essentially all relying on isoelectric focusing steps. The devices reviewed here include multichamber apparatus, such as the multicompartment electrolyzer with Immobiline membranes, Off-Gel electrophoresis in a multicup device, and the Rotofor, an instrument also based on a multichamber system but exploiting the conventional technique of carrier-ampholyte focusing. Other instruments of interest are the Octopus, a continuous-flow device for isoelectric focusing in an upward-flowing liquid curtain, and the Gradiflow, where different pI cuts are obtained by a multistep passage through two compartments buffered at different pH values. It is felt that this panoply of methods could offer a strong step forward in "mining below the tip of the iceberg" for detecting the "unseen proteome".

  6. Function Analysis and Decomposition using Function Analysis Systems Technique

    Energy Technology Data Exchange (ETDEWEB)

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Mile's function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  8. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different network metrics that can be utilised in sports analysis, their possible applications and variances from situation to situation, the respective chapters present an array of illustrative case studies. Identifying the general concepts of social network analysis and network centrality metrics, readers are shown how to generate a methodological protocol for data collection. As such, the book provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.

  9. A comparison of wavelet analysis techniques in digital holograms

    Science.gov (United States)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis, and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques: mean filtering, median filtering, and Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation and resolution preservation.
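
    A minimal sketch of such a comparison in 1-D: discrete-wavelet soft-threshold denoising (PyWavelets) against a median filter, scored by RMSE against a known clean signal. The signal is synthetic rather than a digital hologram, and the universal threshold is one standard choice among many.

```python
# Compare discrete-wavelet denoising against a median filter on a noisy
# synthetic 1-D signal.
import numpy as np
import pywt
from scipy.signal import medfilt

rng = np.random.default_rng(8)
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.3 * rng.standard_normal(t.size)

# DWT denoising: soft-threshold the detail coefficients, then reconstruct.
coeffs = pywt.wavedec(noisy, "db4", level=4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate
thr = sigma * np.sqrt(2 * np.log(noisy.size))           # universal threshold
coeffs = [coeffs[0]] + [pywt.threshold(c, thr, "soft") for c in coeffs[1:]]
dwt_out = pywt.waverec(coeffs, "db4")[: noisy.size]

med_out = medfilt(noisy, kernel_size=5)
for name, out in (("wavelet", dwt_out), ("median", med_out)):
    print(name, "RMSE:", np.sqrt(np.mean((out - clean) ** 2)).round(4))
```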

  10. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analysis methodologies, but none as comprehensive as the current work.
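
    A minimal sketch of two of the simpler sensitivity-ranking approaches such comparisons cover: one-at-a-time perturbation and Monte Carlo sampling with rank correlation, applied to a toy three-parameter model (not the tritium dosimetry model).

```python
# Rank parameter sensitivity two ways: one-at-a-time perturbation and
# Spearman rank correlation over Monte Carlo samples. Toy model only.
import numpy as np
from scipy.stats import spearmanr

def model(x):                       # toy model: output = a * b**2 + 0.1 * c
    return x[..., 0] * x[..., 1] ** 2 + 0.1 * x[..., 2]

base = np.array([1.0, 2.0, 3.0])

# One-at-a-time: relative output change per 10% change in each input.
oat = []
for i in range(3):
    x = base.copy()
    x[i] *= 1.1
    oat.append((model(x) - model(base)) / model(base))
print("OAT sensitivities:", np.round(oat, 3))

# Monte Carlo: rank correlation between each sampled input and the output.
rng = np.random.default_rng(9)
samples = base * rng.uniform(0.5, 1.5, size=(1000, 3))
out = model(samples)
print("rank correlations:", [round(spearmanr(samples[:, i], out)[0], 2)
                             for i in range(3)])
```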

  11. Analytical techniques in pharmaceutical analysis: A review

    Directory of Open Access Journals (Sweden)

    Masoom Raza Siddiqui

    2017-02-01

    Full Text Available The development of pharmaceuticals brought a revolution in human health. These pharmaceuticals would serve their intent only if they are free from impurities and are administered in an appropriate amount. To make drugs serve their purpose, various chemical and instrumental methods involved in the estimation of drugs were developed at regular intervals. Pharmaceuticals may develop impurities at various stages of their development, transportation and storage, which makes them risky to administer, so these impurities must be detected and quantitated. For this, analytical instrumentation and methods play an important role. This review highlights the role of analytical instrumentation and analytical methods in assessing the quality of drugs. The review covers a variety of analytical techniques, such as titrimetric, chromatographic, spectroscopic, electrophoretic and electrochemical, and their corresponding methods that have been applied in the analysis of pharmaceuticals.

  12. Nuclear and conventional techniques applied to the analysis of prehispanic metals of the Templo Mayor of Tenochtitlan; Tecnicas nucleares y convencionales aplicadas al analisis de metales prehispanicos del Templo Mayor de Tenochtitlan

    Energy Technology Data Exchange (ETDEWEB)

    Mendez M, U

    2003-07-01

    The use of experimental techniques such as PIXE, RBS, metallography and SEM, applied to the characterization of pre-Hispanic copper and gold metals coming from 9 offerings of the Templo Mayor of Tenochtitlan, makes it possible to obtain results and well-founded information on aspects such as technological development, cultural and commercial exchange and a relative chronology, as well as aspects related to conservation, authenticity, symbolic association and the social meaning of the offerings. More specifically, the objectives outlined for this study are the following: To make interpretations of manufacturing techniques, stylistic designs and cultural and commercial exchanges from aspects such as microstructure, elemental composition, type of alloys, presence of welds, surface gilding and conservation. To determine the technological advance that the processing of the metallic materials represents and to know their location in the archaeological context, as a means for interpreting the social significance of the offering. To learn the possible symbolic-religious association of the metallic objects offered to the deities, starting from significant characteristics such as color, form and function. To establish whether the devices found in the offerings are of the same temporality as the offering itself, or at least to place the devices within the two stages of the development of the metallurgy, known as the period of native copper and the period of the alloys; this helps to determine a relative chronology of when the objects were manufactured. To confirm the authenticity of the devices. To determine, in a precise way, the degree of conservation of the pieces. To corroborate some of the manufacturing processes. This is achieved by means of the reproduction of objects in the laboratory, to establish comparisons and differences among pre

  13. Operations research techniques applied to service center logistics in power distribution users

    Directory of Open Access Journals (Sweden)

    Maria Teresinha Arns Steiner

    2006-12-01

    Full Text Available This paper deals with the optimization of the logistics of services demanded by users of power distribution lines served by the Portão office, located in Curitiba, PR, Brazil, and operated by COPEL (Paranaense Power Company). Through the use of Operations Research techniques, an Integer Programming mathematical model and the Floyd algorithm, a method was defined to determine, in an optimized way, the number of teams needed by the selected office, as well as the optimized assignment of the teams to the sites in need, in order to offer efficient services to the users, with immediate execution in emergencies and, for the other services, according to parameters set by the National Power Agency together with COPEL. The methodology presented here is generic, so that it could be applied to any power network (or any of its lines), and it has presented very satisfactory results for the case under analysis.
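
    A minimal sketch of the Floyd(-Warshall) step mentioned in the abstract: all-pairs shortest travel times over a small service network, which an assignment model could then use as costs. The four-node network and its travel times are invented.

```python
# Floyd-Warshall all-pairs shortest travel times over a small service
# network (office and sites). The network below is invented.
import numpy as np

INF = np.inf
# dist[i][j] = direct travel time in minutes between nodes i and j
dist = np.array([[0,   7, INF, 20],
                 [7,   0,  3, INF],
                 [INF, 3,  0,  4],
                 [20, INF, 4,  0]], float)

n = len(dist)
for k in range(n):                  # allow node k as an intermediate stop
    for i in range(n):
        for j in range(n):
            dist[i][j] = min(dist[i][j], dist[i][k] + dist[k][j])

print(dist)                         # 0 -> 3 improves from 20 to 14 via 1 and 2
```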

  14. Temporal Fourier analysis applied to equilibrium radionuclide cineangiography

    Energy Technology Data Exchange (ETDEWEB)

    Cardot, J.C.; Verdenet, J.; Bidet, A.; Bidet, R.; Berthout, P.; Faivre, R.; Bassand, J.P.; Maurat, J.P.

    1982-08-01

    Regional and global left ventricular wall motion was assessed in 120 patients using radionuclide cineangiography (RCA) and contrast angiography. Functional imaging procedures based on a temporal Fourier analysis of dynamic image sequences were applied to the study of cardiac contractility. Two images were constructed by taking the phase and amplitude values of the first harmonic of the Fourier transform for each pixel. These two images aided in determining the perimeter of the left ventricle to calculate the global ejection fraction. Regional left ventricular wall motion was studied by analyzing the phase value and by examining the distribution histogram of these values. The accuracy of the global ejection fraction calculation was improved by the Fourier technique. This technique increased the sensitivity of RCA for determining segmental abnormalities, especially in the left anterior oblique view (LAO).
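
    A minimal sketch of the first-harmonic computation: for each pixel of a gated image sequence, the temporal FFT's first harmonic yields the amplitude and phase images described. The synthetic sequence below encodes a known regional phase map, which the code recovers.

```python
# Per-pixel first-harmonic phase and amplitude images from a gated sequence.
# The "heart" sequence is synthetic.
import numpy as np

frames, h, w = 16, 32, 32
t = np.arange(frames)
rng = np.random.default_rng(10)
phase_map = rng.uniform(0, np.pi / 4, (h, w))    # regional contraction delays
seq = 100 + 20 * np.cos(2 * np.pi * t[:, None, None] / frames
                        + phase_map[None, :, :])

spec = np.fft.fft(seq, axis=0)                   # FFT along the time axis
first = spec[1]                                  # first harmonic, per pixel
amplitude = np.abs(first) * 2 / frames           # recovers the modulation depth
phase = np.angle(first)                          # recovers phase_map

print("mean amplitude (expect ~20):", amplitude.mean().round(2))
```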

  15. Towards a human eye behavior model by applying Data Mining Techniques on Gaze Information from IEC

    CERN Document Server

    Pallez, Denis; Baccino, Thierry

    2008-01-01

    In this paper, we first present what Interactive Evolutionary Computation (IEC) is and briefly describe how we have combined this artificial intelligence technique with an eye-tracker for visual optimization. Next, in order to correctly parameterize our application, we present results from applying data mining techniques to gaze information coming from experiments conducted on about 80 human individuals.

  16. Applying Analytical Derivative and Sparse Matrix Techniques to Large-Scale Process Optimization Problems

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The performance of analytical derivative and sparse matrix techniques applied to a traditional dense sequential quadratic programming (SQP) is studied, and the strategy utilizing those techniques is also presented. Computational results on two typical chemical optimization problems demonstrate significant enhancement in efficiency, which shows this strategy is promising and suitable for large-scale process optimization problems.

  17. Adaptive meshing technique applied to an orthopaedic finite element contact problem.

    Science.gov (United States)

    Roarty, Colleen M; Grosland, Nicole M

    2004-01-01

    Finite element methods have been applied extensively and with much success in the analysis of orthopaedic implants. Recently a growing interest has developed, in the orthopaedic biomechanics community, in how numerical models can be constructed for the optimal solution of problems in contact mechanics. New developments in this area are of paramount importance in the design of improved implants for orthopaedic surgery. Finite element and other computational techniques are widely applied in the analysis and design of hip and knee implants, with additional joints (ankle, shoulder, wrist) attracting increased attention. The objective of this investigation was to develop a simplified adaptive meshing scheme to facilitate the finite element analysis of a dual-curvature total wrist implant. Using currently available software, the analyst has great flexibility in mesh generation, but must prescribe element sizes and refinement schemes throughout the domain of interest. Unfortunately, it is often difficult to predict in advance a mesh spacing that will give acceptable results. Adaptive finite-element mesh capabilities operate to continuously refine the mesh to improve accuracy where it is required, with minimal intervention by the analyst. Such mesh adaptation generally means that in certain areas of the analysis domain, the size of the elements is decreased (or increased) and/or the order of the elements may be increased (or decreased). In concept, mesh adaptation is very appealing. Although there have been several previous applications of adaptive meshing for in-house FE codes, we have coupled an adaptive mesh formulation with the pre-existing commercial programs PATRAN (MacNeal-Schwendler Corp., USA) and ABAQUS (Hibbit Karlson and Sorensen, Pawtucket, RI). In doing so, we have retained several attributes of the commercial software, which are very attractive for orthopaedic implant applications.

  18. Applying BI Techniques To Improve Decision Making And Provide Knowledge Based Management

    Directory of Open Access Journals (Sweden)

    Alexandra Maria Ioana FLOREA

    2015-07-01

    Full Text Available The paper focuses on BI techniques, especially data mining algorithms, that can support and improve the decision-making process, with applications within the financial sector. Considering data mining techniques to be the more efficient option, we applied several supervised and unsupervised learning algorithms. The case study in which these algorithms were implemented concerns the activity of a banking institution, with a focus on the management of lending activities.

  19. Colilert® applied to food analysis

    Directory of Open Access Journals (Sweden)

    Maria José Rodrigues

    2014-06-01

    Full Text Available Colilert® (IDEXX) was originally developed for the simultaneous enumeration of coliforms and E. coli in water samples and has been used in the routine quality control of drinking, swimming-pool, fresh, coastal and waste waters (Grossi et al., 2013). The Colilert® culture medium contains the indicator nutrient 4-methylumbelliferyl-β-D-glucuronide (MUG). MUG acts as a substrate for the E. coli enzyme β-glucuronidase, from which a fluorescent compound is produced. A positive MUG result produces fluorescence when viewed under an ultraviolet lamp; if the test fluorescence is equal to or greater than that of the control, the presence of E. coli is confirmed (Lopez-Roldan et al., 2013). The present work aimed to apply Colilert® to the enumeration of E. coli in different foods, comparing the results against the reference method (ISO 16649-2, 2001) for E. coli food analysis. The study was divided into two stages. During the first stage, ten different types of food were analyzed with Colilert®, including pastry, raw meat, ready-to-eat meals, yogurt, raw seabream and salmon, and cooked shrimp. Of these, the following were approved: pastry with custard, raw minced pork, "caldo verde" soup, raw vegetable salad (lettuce and carrots) and solid yogurt. The approved foods inserted better into the tray, the colour of the wells was lighter and the UV reading was easier. In the second stage, the foods were artificially contaminated with 2 log CFU/g of E. coli (ATCC 25922) and analyzed. Colilert® proved to be an accurate method, and the counts were similar to those obtained with the reference method. In the present study, the Colilert® method revealed neither false-positive nor false-negative results; however, the results were sometimes difficult to read due to the presence of green fluorescence in some wells. Overall, Colilert® was an easy and rapid method, but less objective and more expensive than the reference method.

  20. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  1. Functional Analysis in Applied Mathematics and Engineering

    DEFF Research Database (Denmark)

    Pedersen, Michael

    1997-01-01

    Lecture notes for the course 01245 Functional Analysis. Consists of the first part of a monograph with the same title.

  2. Functional analysis in modern applied mathematics

    CERN Document Server

    Curtain, Ruth F

    1977-01-01

    In this book, we study theoretical and practical aspects of computing methods for the mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques, including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat...

  3. Data Selection for Fast Projection Techniques Applied to Adaptive Nulling: A Comparative Study of Performance

    Science.gov (United States)

    1991-12-01

    ...from the point of view of jammer cancellation, the latter being slower but giving better cancellation. Indeed, these algorithms give a... techniques with that of the "sample matrix inversion" (SMI) technique for three different scenarios; these three demonstrate the effects of the number of... eigenvector analysis, such as the MUSIC technique [2], are effective for both interference suppression and spectral estimation. These techniques yield...

  4. Research on key techniques of virtual reality applied in mining industry

    Institute of Scientific and Technical Information of China (English)

    LIAO Jun; LU Guo-bin

    2009-01-01

    Starting from the applications of virtual reality technology in many fields, this paper introduces the basic concepts, system architectures and related technical developments of virtual reality, summarizes its applications in the present mining industry, and examines the core software and hardware techniques, especially the optimization of the various 3D modelling techniques, implementing a virtual scene that can be navigated in real time with stereoscopic display. It then puts forward software and hardware solutions for applying virtual reality in the mining industry that can satisfy demands at different levels, and concludes with the promising prospects of virtual reality techniques applied to mining.

  5. Introduction: Conversation Analysis in Applied Linguistics

    Science.gov (United States)

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  6. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  7. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    In this thesis we explore the use of functional data analysis as a method to analyse chemometric data, more specifically spectral data in metabolomics. Functional data analysis is a vibrant field in statistics. It has been rapidly expanding in both methodology and applications since it was made well known by Ramsay & Silverman's monograph in 1997. In functional data analysis, the data are curves instead of data points. Each curve is measured at discrete points along a continuum, for example, time or frequency. It is assumed that the underlying process generating the curves is smooth. Our aim is to bridge the worlds of statistics and chemometrics, and to provide a glimpse of the essential and complex data pre-processing that is well known to chemometricians but is generally unknown to statisticians. Pre-processing can potentially have a strong influence on the results of subsequent data analysis.

  8. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  9. Spatial analysis methodology applied to rural electrification

    Energy Technology Data Exchange (ETDEWEB)

    Amador, J. [Department of Electric Engineering, EUTI, UPM, Ronda de Valencia, E-28012 Madrid (Spain); Dominguez, J. [Renewable Energies Division, CIEMAT, Av. Complutense 22, E-28040 Madrid (Spain)

    2006-08-15

    The use of geographical information systems (GISs) in studies of the regional integration of renewable energies provides advantages such as speed, amount of information and analysis capacity, among others. However, these characteristics make it difficult to link the results to the initial variables, and therefore to validate the GIS, which makes it hard to ascertain the reliability of both the results and their subsequent analysis. To solve these problems, a GIS-based method with renewable energies for rural electrification is proposed, structured in three stages, with the aim of finding out the influence of the initial variables on the result. In the first stage, a classic sensitivity analysis of the equivalent electrification cost (LEC) is performed; the second stage involves a spatial sensitivity analysis and the third determines the stability of the results. This methodology has been verified in the application of a GIS in Lorca (Spain). (author)

  10. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea.

    Science.gov (United States)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-08-18

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored, and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.
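
    The comparison pipeline named above (preprocess, project with PCA, then score separability) can be sketched generically. A minimal illustration with scikit-learn; the two scaler choices and the Fisher-style separability score are stand-ins, not the paper's exact preprocessing set:

      import numpy as np
      from sklearn.preprocessing import StandardScaler, MinMaxScaler
      from sklearn.decomposition import PCA

      def separability(scores, labels):
          """Crude Fisher-style ratio on PCA scores: variance of the class
          means divided by the mean within-class variance."""
          classes = np.unique(labels)
          means = np.array([scores[labels == c].mean(axis=0) for c in classes])
          within = np.mean([scores[labels == c].var(axis=0).sum() for c in classes])
          return means.var(axis=0).sum() / within

      rng = np.random.default_rng(0)
      # Synthetic stand-in for voltammetric responses of three "tea grades"
      X = rng.normal(size=(60, 20)) + np.repeat(np.arange(3), 20)[:, None] * 0.5
      y = np.repeat(np.arange(3), 20)

      for scaler in (StandardScaler(), MinMaxScaler()):
          scores = PCA(n_components=2).fit_transform(scaler.fit_transform(X))
          print(type(scaler).__name__, separability(scores, y))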

  11. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    Energy Technology Data Exchange (ETDEWEB)

    Palit, Mousumi [Department of Electronics and Telecommunication Engineering, Central Calcutta Polytechnic, Kolkata 700014 (India); Tudu, Bipan, E-mail: bt@iee.jusl.ac.in [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Bhattacharyya, Nabarun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Dutta, Ankur; Dutta, Pallab Kumar [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Jana, Arun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Bandyopadhyay, Rajib [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Chatterjee, Anutosh [Department of Electronics and Communication Engineering, Heritage Institute of Technology, Kolkata 700107 (India)

    2010-08-18

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored, and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  12. An elementary introduction to applied signal analysis

    DEFF Research Database (Denmark)

    Jacobsen, Finn

    2000-01-01

    An introduction to some of the most fundamental concepts and methods of signal analysis and signal processing is presented, with particular regard to acoustic measurements. The purpose is to give the reader enough basic knowledge of signal analysis to use modern digital equipment in some of the most important acoustic measurements, e.g. measurements of transfer functions of lightly damped multi-modal systems (rooms and structures).

  13. An elementary introduction to applied signal analysis

    DEFF Research Database (Denmark)

    Jacobsen, Finn

    1997-01-01

    An introduction to some of the most fundamental concepts and methods of signal analysis and signal processing is presented, with particular regard to acoustic measurements. The purpose is to give the reader enough basic knowledge of signal analysis to use modern digital equipment in some of the most important acoustic measurements, e.g. measurements of transfer functions of lightly damped multi-modal systems (rooms and structures).

  14. Nuclear and conventional techniques applied to the analysis of Purhepecha metals of the Pareyon collection; Tecnicas nucleares y convencionales aplicadas al analisis de metales Purhepecha de la coleccion Pareyon

    Energy Technology Data Exchange (ETDEWEB)

    Mendez, U.; Tenorio C, D. [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico); Ruvalcaba, J.L. [IFUNAM, 04510 Mexico D.F. (Mexico); Lopez, J.A. [Instituto Nacional de Antropologia e Historia, 11000 Mexico D.F. (Mexico)

    2005-07-01

    The main objective of this investigation was to determine, by means of the nuclear techniques PIXE and RBS together with conventional techniques, the composition and microstructure of 13 metal artifacts. The artifacts were made from copper and gold and belonged to the offering of a Tarascan personage found at the 'Matamoros' porch in Uruapan, Michoacan, Mexico. (Author)

  15. Applied modal analysis of wind turbine blades

    DEFF Research Database (Denmark)

    Pedersen, H.B.; Kristensen, O.J.D.

    2003-01-01

    In this project modal analysis has been used to determine the natural frequencies, damping and mode shapes of wind turbine blades. Different methods to measure the position and adjust the direction of the measuring points are discussed. Different equipment for mounting the accelerometers is investigated by repeated measurements on the same wind turbine blade. Furthermore, the flexibility of the test set-up is investigated by use of accelerometers mounted on the flexible adapter plate during the measurement campaign. One experimental campaign investigated the results obtained from a loaded and an unloaded wind turbine blade. During this campaign the modal analyses were performed on a blade mounted in a horizontal and a vertical position, respectively. Finally, the results obtained from modal analysis carried out on a wind turbine blade are compared with results obtained from the Stig Øye blade_EV1...
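
    A first pass at the natural frequencies in such a campaign is often just a spectral peak pick on the accelerometer records; damping can then be estimated per peak, e.g. by the half-power method. A minimal sketch under the assumption of a single-channel free-decay record (the signal and sampling rate below are synthetic, not the report's data):

      import numpy as np
      from scipy.signal import find_peaks

      fs = 512.0                           # sampling rate, Hz (assumed)
      t = np.arange(0, 10, 1 / fs)
      # Synthetic free decay with two modes, at 1.8 Hz and 5.6 Hz
      x = (np.exp(-0.05 * t) * np.sin(2 * np.pi * 1.8 * t)
           + 0.4 * np.exp(-0.08 * t) * np.sin(2 * np.pi * 5.6 * t))

      spec = np.abs(np.fft.rfft(x * np.hanning(len(x))))
      freqs = np.fft.rfftfreq(len(x), 1 / fs)
      peaks, _ = find_peaks(spec, height=0.1 * spec.max())
      print("estimated natural frequencies [Hz]:", freqs[peaks])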

  16. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-reviewed journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analyses, difficulties may be encountered when explicit direction is not provided as to how one should run a model...

  17. Applying Image Matching to Video Analysis

    Science.gov (United States)

    2010-09-01

    [Only reference fragments of this record survive extraction:] "Database of Spent Cartridge Cases of Firearms", Forensic Science International, pp. 97-106, 2001; Birchfield, S., "Derivation of Kanade-Lucas-Tomasi..."; Ortega-Garcia, J., "Bayesian Analysis of Fingerprint, Face and Signature Evidences with Automatic Biometric Systems", Forensic Science International.

  18. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    Science.gov (United States)

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer law is routinely used to determine the concentration of DNA, RNA or protein. There is, however, a significant difference in determining the…
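
    For a mixture, the Beer-Lambert-Bouguer law makes the absorbance at each wavelength a linear combination of component concentrations, A(λ) = Σ ε_i(λ) l c_i, so concentrations follow from a least-squares solve against known extinction coefficients. A small sketch; the two-component coefficients below are invented for illustration:

      import numpy as np

      # Molar extinction coefficients eps[wavelength, component] (L mol^-1 cm^-1)
      # and path length l (cm); all values are illustrative only.
      eps = np.array([[6000.0,  300.0],    # 260 nm
                      [2000.0, 1500.0],    # 280 nm
                      [ 100.0,  900.0]])   # 320 nm
      l = 1.0

      c_true = np.array([2e-5, 5e-5])      # mol/L
      A = eps @ c_true * l                 # simulated absorbance readings

      # Recover concentrations from the measured absorbances
      c_est, *_ = np.linalg.lstsq(eps * l, A, rcond=None)
      print(c_est)                         # ~ [2e-5, 5e-5]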

  19. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  20. Applying centrality measures to impact analysis: A coauthorship network analysis

    CERN Document Server

    Yan, Erjia

    2010-01-01

    Many studies on coauthorship networks focus on network topology and network statistical mechanics. This article takes a different approach by studying micro-level network properties, with the aim of applying centrality measures to impact analysis. Using coauthorship data from 16 journals in the field of library and information science (LIS) with a time span of twenty years (1988-2007), we construct an evolving coauthorship network and calculate four centrality measures (closeness, betweenness, degree and PageRank) for authors in this network. We find that the four centrality measures are significantly correlated with citation counts. We also discuss the usability of centrality measures in author ranking, and suggest that centrality measures can be useful indicators for impact analysis.
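
    The four measures used in the study are all standard one-liners in a graph library. A sketch with NetworkX on a toy coauthorship graph (correlating the rankings with citation counts would of course require the bibliometric data):

      import networkx as nx

      # Toy coauthorship network: nodes are authors, edges are joint papers
      G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("D", "E")])

      centralities = {
          "degree": nx.degree_centrality(G),
          "closeness": nx.closeness_centrality(G),
          "betweenness": nx.betweenness_centrality(G),
          "pagerank": nx.pagerank(G),
      }
      for name, values in centralities.items():
          ranking = sorted(values, key=values.get, reverse=True)
          print(name, ranking)    # candidate author ranking per measure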

  1. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun

    2000-05-01

    This report presents results of research and development as follows: improvement of the neutron irradiation facilities and counting system, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health, and its standardization. For the identification and standardization of analytical methods, environmental, biological and polymer samples were analyzed and the uncertainty of measurement was estimated; data intercomparisons and proficiency tests were also performed. Using airborne particulate matter as an environmental indicator, trace elemental concentrations in samples collected at urban and rural sites were determined, and statistical calculations and factor analysis were carried out to investigate emission sources. An international cooperation research project was carried out on the utilization of nuclear techniques.

  2. Strategies and techniques of communication and public relations applied to non-profit sector

    Directory of Open Access Journals (Sweden)

    Ioana – Julieta Josan

    2010-05-01

    Full Text Available The aim of this paper is to summarize the strategies and techniques of communication and public relations applied to the non-profit sector. The approach of the paper is to identify the most appropriate strategies and techniques that the non-profit sector can use to accomplish its objectives, to highlight specific differences between the strategies and techniques of the profit and non-profit sectors, and to identify potential communication and public relations actions that can increase visibility among the target audience, create brand awareness and turn the target's perception of the non-profit sector into positive brand sentiment.

  3. Nuclear techniques (PIXE and RBS) applied to analysis of pre hispanic metals of the Templo Mayor at Tenochtitlan; Tecnicas nucleares (PIXE y RBS) aplicadas al analisis de metales prehispanicos del Templo Mayor de Tenochtitlan

    Energy Technology Data Exchange (ETDEWEB)

    Mendez U, I.; Tenorio, D.; Galvan, J.L. [Instituto Nacional de Investigaciones Nucleares, A.P. 18-1027, 11801 Mexico D.F. (Mexico)

    2000-07-01

    This work has the objective of determining, by means of nuclear techniques (PIXE and RBS), the composition and alloy type of diverse Aztec ornaments of the Postclassic period, manufactured principally from copper and gold, such as bells, beads and disks, all belonging to 9 oblations of the Templo Mayor of Tenochtitlan. The historical and archaeological background of the artifacts is briefly presented, as well as the analytical methods, concluding with the results obtained. (Author)

  4. Zero order and signal processing spectrophotometric techniques applied for resolving interference of metronidazole with ciprofloxacin in their pharmaceutical dosage form.

    Science.gov (United States)

    Attia, Khalid A M; Nassar, Mohammed W I; El-Zeiny, Mohamed B; Serag, Ahmed

    2016-02-01

    Four rapid, simple, accurate and precise spectrophotometric methods were used for the determination of ciprofloxacin in the presence of metronidazole as an interferent. The methods under study are area under the curve and simultaneous equation, in addition to the smart signal processing techniques of manipulating ratio spectra, namely Savitzky-Golay filters and the continuous wavelet transform. All the methods were validated according to the ICH guidelines, and accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of ciprofloxacin in quality-control laboratories.

  5. Zero order and signal processing spectrophotometric techniques applied for resolving interference of metronidazole with ciprofloxacin in their pharmaceutical dosage form

    Science.gov (United States)

    Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed

    2016-02-01

    Four rapid, simple, accurate and precise spectrophotometric methods were used for the determination of ciprofloxacin in the presence of metronidazole as an interferent. The methods under study are area under the curve and simultaneous equation, in addition to the smart signal processing techniques of manipulating ratio spectra, namely Savitzky-Golay filters and the continuous wavelet transform. All the methods were validated according to the ICH guidelines, and accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory-prepared mixtures and assessed by applying the standard addition technique. They can therefore be used for the routine analysis of ciprofloxacin in quality-control laboratories.
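
    The ratio-spectra manipulation these methods share is straightforward: divide the mixture spectrum by the spectrum of the interferent, then differentiate the ratio (here with a Savitzky-Golay filter) so the interferent's contribution becomes a constant that the derivative removes. A schematic sketch with synthetic Gaussian bands, not the published spectra:

      import numpy as np
      from scipy.signal import savgol_filter

      wl = np.linspace(200, 400, 500)                 # wavelength grid, nm
      band = lambda c, w: np.exp(-0.5 * ((wl - c) / w) ** 2)

      cipro, metro = band(275, 15), band(320, 20)     # unit spectra (illustrative)
      mixture = 0.8 * cipro + 0.5 * metro             # "measured" mixture spectrum

      mask = metro > 1e-3                             # avoid division blow-up at the edges
      ratio = mixture[mask] / metro[mask]             # divide by interferent spectrum
      # First derivative of the ratio removes the constant metronidazole term
      deriv = savgol_filter(ratio, window_length=21, polyorder=3, deriv=1)
      # The peak amplitude of `deriv` scales with the ciprofloxacin concentration
      print(deriv[np.argmax(np.abs(deriv))])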

  6. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this prior knowledge, the reader will move quickly through the practical introduction and on to signal analysis techniques commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical...

  7. Photometric analysis applied in determining facial type

    Directory of Open Access Journals (Sweden)

    Luciana Flaquer Martins

    2012-10-01

    Full Text Available INTRODUCTION: In orthodontics, determining the facial type is a key element in the prescription of a correct diagnosis. In the early days of the specialty, observation and measurement of craniofacial structures were done directly on the face, in photographs or on plaster casts. With the development of radiographic methods, cephalometric analysis replaced direct facial analysis. Seeking to validate the analysis of facial soft tissues, this work compares two different methods used to determine facial type: the anthropometric and the cephalometric method. METHODS: The sample consisted of sixty-four Brazilian individuals, adults, Caucasian, of both genders, who agreed to participate in this research. Lateral cephalograms and frontal facial photographs were taken of all individuals. The facial types were determined by the Vert index (cephalometric) and the Facial index (photographic). RESULTS: The agreement analysis (Kappa), made for both types of analysis, found an agreement of 76.5%. CONCLUSIONS: We concluded that the Facial index can be used as an adjunct to orthodontic diagnosis, or as an alternative method for the pre-selection of a sample, sparing research subjects unnecessary tests.

  8. Recommendations for learners are different: Applying memory-based recommender system techniques to lifelong learning

    NARCIS (Netherlands)

    Drachsler, Hendrik; Hummel, Hans; Koper, Rob

    2007-01-01

    Drachsler, H., Hummel, H. G. K., & Koper, R. (2007). Recommendations for learners are different: applying memory-based recommender system techniques to lifelong learning. Paper presented at the SIRTEL workshop at the EC-TEL 2007 Conference. September, 17-20, 2007, Crete, Greece.

  9. Applying Modern Techniques and Carrying Out English .Extracurricular—— On the Model United Nations Activity

    Institute of Scientific and Technical Information of China (English)

    XuXiaoyu; WangJian

    2004-01-01

    This paper introduces the extracurricular Model United Nations activity at Northwestern Polytechnical University (NPU), focusing on the application of modern techniques in the activity and the pedagogical theories applied in it. An interview and questionnaire study reveals the influence of the Model United Nations.

  10. Applying a Schema for Studying the Instructive Techniques Employed by Authors of Four Novels for Adolescents.

    Science.gov (United States)

    Severin, Mary Susan

    The purpose of this study was to apply a schema to adolescent novels, to determine what lessons the authors teach and what techniques they employ in their teaching. A historical review of literary criticism established a background for interpreting the educational function of literature. A schema of questions based on the historical background was…

  11. Improving operating room efficiency by applying bin-packing and portfolio techniques to surgical case scheduling

    NARCIS (Netherlands)

    Houdenhoven, van M.; Oostrum, van J.M.; Hans, E.W.; Wullink, G.; Kazemier, G.

    2013-01-01

    BACKGROUND: An operating room (OR) department has adopted an efficient business model and subsequently investigated how efficiency could be further improved. The aim of this study is to show the efficiency improvement of lowering organizational barriers and applying advanced mathematical techniques.
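
    Of the techniques named, bin packing maps most directly onto case scheduling: cases (with expected durations) are items, and OR-days of fixed capacity are bins. A minimal first-fit-decreasing sketch; the durations are invented, and the paper's actual method also uses portfolio techniques to hedge duration variability:

      def first_fit_decreasing(durations, capacity):
          """Assign case durations to the fewest OR blocks of given capacity
          using the first-fit-decreasing heuristic."""
          bins = []                                 # each bin: [remaining, [cases]]
          for d in sorted(durations, reverse=True):
              for b in bins:
                  if b[0] >= d:                     # first block it fits into
                      b[0] -= d
                      b[1].append(d)
                      break
              else:
                  bins.append([capacity - d, [d]])  # open a new OR block
          return [cases for _, cases in bins]

      # Example: one day of cases (minutes), 480-minute OR blocks
      schedule = first_fit_decreasing([120, 200, 90, 45, 300, 150, 60], 480)
      print(len(schedule), "ORs needed:", schedule)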

  12. Improvement of vibration and noise by applying analysis technology. Development of active control technique of engine noise in a car cabin

    Energy Technology Data Exchange (ETDEWEB)

    Uchida, H.; Nakao, N.; Butsuen, T. (Matsuda Motor Corp., Hiroshima (Japan). Technology Research Inst.)

    1994-06-01

    It is difficult to reduce engine noise, the principal noise in a car cabin, without producing an adverse effect on low-cost production. An active noise control (ANC) technique has been developed to reduce engine noise while remaining compatible with low-cost production. This paper discusses its control algorithm and system configuration and presents experimental results. The filtered-x least mean squares method is a well-known ANC algorithm; however, it often requires an amount of computation exceeding the present capacity of a digital signal processor. An effective ANC algorithm is developed by exploiting the repetitiveness of the engine noise. This paper describes the basic theory of the control algorithm, its extension to a multiple-input multiple-output system, the system configuration and experimental results. A noise control system with three microphones is designed with consideration of the spatial distribution of the noise, and reduces the noise in the whole cabin by 8 dB(A) in the best case. Active noise control is applicable to many areas and can be used for the reduction of noise and vibration other than engine noise. 5 refs., 7 figs., 1 tab.
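
    For reference, the filtered-x LMS update that the paper's algorithm starts from adapts a control filter by filtering the reference signal through a model of the secondary path before the gradient step. A single-channel sketch: the paths and signals below are synthetic, and the paper's engine-order-based algorithm is a cheaper variant exploiting the noise's periodicity:

      import numpy as np

      rng = np.random.default_rng(1)
      n = 5000
      # Reference signal correlated with the engine noise (dominant engine order)
      x = np.sin(2 * np.pi * 0.02 * np.arange(n)) + 0.1 * rng.normal(size=n)

      P = np.array([0.0, 0.6, 0.3])   # primary path impulse response (assumed)
      S = np.array([0.5, 0.25])       # secondary path impulse response (assumed known)
      d = np.convolve(x, P)[:n]       # noise arriving at the error microphone

      L, mu = 8, 0.01                 # adaptive filter length and step size
      w = np.zeros(L)                 # control filter taps
      xbuf, ybuf = np.zeros(L), np.zeros(len(S))
      xf = np.convolve(x, S)[:n]      # filtered-x: reference through secondary-path model
      xfbuf = np.zeros(L)
      err = np.zeros(n)

      for k in range(n):
          xbuf = np.roll(xbuf, 1); xbuf[0] = x[k]
          y = w @ xbuf                          # anti-noise output sample
          ybuf = np.roll(ybuf, 1); ybuf[0] = y
          err[k] = d[k] - S @ ybuf              # residual at the microphone
          xfbuf = np.roll(xfbuf, 1); xfbuf[0] = xf[k]
          w += mu * err[k] * xfbuf              # FxLMS weight update

      print(f"residual power: first 500 samples {np.mean(err[:500]**2):.4f}, "
            f"last 500 {np.mean(err[-500:]**2):.4f}")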

  13. Toward applied behavior analysis of life aloft

    Science.gov (United States)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most...

  14. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    OpenAIRE

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the...

  15. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  16. Digital photoelastic analysis applied to implant dentistry

    Science.gov (United States)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of its whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of connected implants (All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model-making procedure, closely mimicking all the anatomical features of the human mandible, is proposed. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest were analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, the five-step method is used and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data is also obtained for the first time in implant dentistry, which could yield important information for improving the structural stability of implant systems. Analysis is carried out for implants in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.

  17. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    Science.gov (United States)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements", requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  18. Statistical Techniques Used in Three Applied Linguistics Journals: "Language Learning,""Applied Linguistics" and "TESOL Quarterly," 1980-1986: Implications for Readers and Researchers.

    Science.gov (United States)

    Teleni, Vicki; Baldauf, Richard B., Jr.

    A study investigated the statistical techniques used by applied linguists and reported in three journals, "Language Learning,""Applied Linguistics," and "TESOL Quarterly," between 1980 and 1986. It was found that 47% of the published articles used statistical procedures. In these articles, 63% of the techniques used could be called basic, 28%…

  19. Analysis of Breakthrough Techniques in Three Games Applied by China Men's Basketball in the World Championship

    Institute of Scientific and Technical Information of China (English)

    常妙; 佘涛

    2012-01-01

    Using literature review, observation and mathematical statistics, this study examines the use of individual breakthrough techniques by the Chinese national men's basketball team in games against three world-class teams: Senegal, Slovenia and Puerto Rico. The analysis covers how players completing a breakthrough use court awareness, the purpose of the drive, breakthrough speed, action consistency, physical strength, timing, space, fakes and driving distance to execute change-of-pace dribbles, body screens, pick-and-rolls and fake-action drives. It is concluded that individual breakthrough technique plays a very important role in the game. The analysis identifies several shortcomings in the national team's individual breakthrough technique, with the aim of improving performance and providing a basis for the team's training.

  20. Applied research of environmental monitoring using instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju

    1997-08-01

    This technical report is written as a guidebook for applied research in environmental monitoring using instrumental neutron activation analysis. The contents are as follows: sampling and sample preparation of airborne particulate matter; analytical methodologies; data evaluation and interpretation; and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.

  1. Hands on applied finite element analysis application with ANSYS

    CERN Document Server

    Arslan, Mehmet Ali

    2015-01-01

    Hands on Applied Finite Element Analysis Application with ANSYS is truly an extraordinary book that offers practical ways of tackling FEA problems in machine design and analysis. In this book, a good selection of 35 example problems is presented, offering students the opportunity to apply their knowledge to real engineering FEA problem solutions by guiding them with real-life, hands-on experience.

  2. Modelling the effects of the sterile insect technique applied to Eldana saccharina Walker in sugarcane

    Directory of Open Access Journals (Sweden)

    L Potgieter

    2012-12-01

    Full Text Available A mathematical model is formulated for the population dynamics of an Eldana saccharina Walker infestation of sugarcane under the influence of partially sterile released insects. The model describes the population growth of and interaction between normal and sterile E. saccharina moths in a temporally variable, but spatially homogeneous environment. The model consists of a deterministic system of difference equations subject to strictly positive initial data. The primary objective of this model is to determine suitable parameters in terms of which the above population growth and interaction may be quantified, and according to which E. saccharina infestation levels and the associated sugarcane damage may be measured. Although many models have been formulated in the past describing the sterile insect technique, few of these models describe the technique for Lepidopteran species with more than one life stage and where F1 sterility is relevant. In addition, none of these models consider the technique when fully sterile females and partially sterile males are being released. The model formulated is also the first to describe the technique applied specifically to E. saccharina, and to consider the economic viability of applying the technique to this species. Pertinent decision support is provided to farm managers in terms of the best timing for releases, release ratios and release frequencies.
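
    The deterministic difference-equation structure described above can be illustrated with a stripped-down, single-life-stage analogue: fertile-moth recruitment is logistic, and the fraction of fertile matings is diluted by a constant sterile release. All parameter values below are illustrative assumptions, not the calibrated E. saccharina model:

      import numpy as np

      def simulate_sit(N0=100.0, R=50.0, r=1.8, K=1000.0, steps=40):
          """Toy SIT dynamics: N fertile moths, R sterile moths released each
          generation; only matings between fertile moths produce offspring."""
          N = np.empty(steps)
          N[0] = N0
          for t in range(steps - 1):
              fertile_fraction = N[t] / (N[t] + R)   # chance a mate is fertile
              growth = r * N[t] * (1.0 - N[t] / K)   # logistic recruitment
              N[t + 1] = max(growth * fertile_fraction, 0.0)
          return N

      for release in (0.0, 50.0, 400.0):
          N = simulate_sit(R=release)
          print(f"release {release:5.0f}: final population {N[-1]:8.2f}")

    Running the toy model shows the threshold behavior that makes release ratios the central decision variable: small releases merely depress the equilibrium, while sufficiently large releases drive the population toward extinction.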

  3. Phase-shifting technique applied to circular harmonic-based joint transform correlator

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The phase-shifting technique is applied to the circular harmonic expansion-based joint transform correlator. Computer simulation has shown that the light efficiency and the discrimination capability are greatly enhanced, and full rotation invariance is preserved, after the phase-shifting technique is applied. A rotation-invariant optical pattern recognition with high discrimination capability and high light efficiency is obtained. The influence of additive noise on the performance of the correlator is also investigated; the noise robustness of this kind of correlator still needs improvement.

  4. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry, coulometry, etc., have made significant contributions to the analysis of minerals such as clays, sulfides, oxides, and oxysalts. The discussion is organized by both the type of EC technique used and the kind of mineral analyzed. Furthermore, minerals as electrode modification materials for EC analysis are also summarized. Accordingly, research vacancies and future development trends in these areas are discussed.

  5. Systems design analysis applied to launch vehicle configuration

    Science.gov (United States)

    Ryan, R.; Verderaime, V.

    1993-01-01

    As emphasis shifts from optimum-performance aerospace systems to least life-cycle costs, systems designs must seek, adapt, and innovate cost-improvement techniques in design through operations. The systems design process of concept, definition, and design was assessed for the types and flow of total quality management techniques that may be applicable in a launch vehicle systems design analysis. Techniques discussed are task ordering, quality leverage, concurrent engineering, Pareto's principle, robustness, quality function deployment, criteria, and others. These cost-oriented techniques are as applicable to aerospace systems design analysis as to any large commercial system.

  6. Phase-ratio technique as applied to the assessment of lunar surface roughness

    Science.gov (United States)

    Kaydash, Vadym; Videen, Gorden; Shkuratov, Yuriy

    Regoliths of atmosphereless celestial bodies demonstrate prominent light backscattering that is common for particulate surfaces. This occurs over a wide range of phase angles and can be seen in the phase function [1]. The slope of the function may characterize the complexity of planetary surface structure. Imagery of such a parameter suggests that information can be obtained about the surface, like variations of unresolved surface roughness and microtopography [2]. Phase-ratio imagery allows one to characterize the phase function slope. This imagery requires the ratio of two co-registered images acquired at different phase angles. One important advantage of the procedure is that the inherent albedo variations of the surface are suppressed, and, therefore, the resulting image is sensitive to the surface structure variation [2,3]. The phase-ratio image characterizes surface roughness variation at spatial scales on the order of the incident wavelengths to that of the image resolution. Applying the phase-ratio technique to ground-based telescope data has allowed us to find new lunar surface formations in the southern part of Oceanus Procellarum. These are suggested to be weak swirls [4]. We also combined the phase-ratio technique with the space-derived photometry data acquired from the NASA Lunar Reconnaissance Orbiter with high spatial resolution. Thus we exploited the method to analyze the sites of Apollo landings and Soviet sample-return missions. Phase-ratio imagery has revealed anomalies of the phase-curve slope indicating a smoothing of the surface microstructure at the sites caused by dust uplifted by the engine jets of the descent and ascent modules [5,6]. Analysis of phase-ratios helps to understand how the regolith properties have been affected by robotic and human activity on the Moon [7,8]. We have demonstrated the use of the method to search for fresh natural disturbances of surface structure, e.g., to detect areas of fresh slumps, accumulated material on
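
    Computationally, the technique is a pixel-wise ratio of two co-registered, flat-fielded images taken at different phase angles, optionally normalized by the scene means so that albedo cancels. A minimal sketch; the inputs are assumed already co-registered, and the array names are illustrative:

      import numpy as np

      def phase_ratio_image(img_a, img_b, eps=1e-12):
          """Pixel-wise phase-ratio image from two co-registered images taken
          at phase angles alpha_a < alpha_b. Mean-normalizing each image first
          suppresses inherent albedo variations."""
          a = img_a / img_a.mean()
          b = img_b / img_b.mean()
          return a / (b + eps)   # larger values indicate a steeper phase-curve slope

      # Example with synthetic 128x128 "observations"
      rng = np.random.default_rng(0)
      albedo = rng.uniform(0.05, 0.2, (128, 128))
      img_small_alpha = albedo * 1.00          # near opposition
      img_large_alpha = albedo * 0.65          # darker at a larger phase angle
      ratio = phase_ratio_image(img_small_alpha, img_large_alpha)
      print(ratio.std())   # ~0: albedo cancels when surface roughness is uniform

    In this uniform-roughness example the ratio image is featureless by construction; on real data, the residual structure is exactly the roughness variation the technique is designed to reveal.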

  7. Multivariate Statistical Analysis Applied in Wine Quality Evaluation

    Directory of Open Access Journals (Sweden)

    Jieling Zou

    2015-08-01

    Full Text Available This study applies multivariate statistical approaches to wine quality evaluation. With 27 red wine samples, four factors were identified out of 12 parameters by principal component analysis, explaining 89.06% of the total variance of the data. As iterative weights calculated by a BP neural network revealed little difference from weights determined by the information entropy method, the latter was chosen to measure the importance of the indicators. Weighted cluster analysis performed well in dividing the sample group into two sub-clusters. The second cluster of red wine samples, compared with the first, was lighter in color, tasted thinner and had a fainter bouquet. The weighted TOPSIS method was used to evaluate the quality of the wine in each sub-cluster. Based on the scores obtained, each sub-cluster was divided into three grades. On the whole, the quality of the lighter red wine was slightly better than that of the darker category. This study shows the necessity and usefulness of multivariate statistical techniques in both wine quality evaluation and parameter selection.
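
    The entropy-weight-plus-TOPSIS scoring used here is compact enough to sketch: entropy weights reward indicators with more dispersion, and TOPSIS ranks samples by relative closeness to the ideal solution. A generic implementation with random stand-in data (benefit-type indicators assumed, not the study's wine measurements):

      import numpy as np

      def entropy_weights(X):
          """Information-entropy weights for a samples-by-indicators matrix."""
          P = X / X.sum(axis=0)
          k = 1.0 / np.log(X.shape[0])
          e = -k * (P * np.log(P + 1e-12)).sum(axis=0)   # entropy per indicator
          d = 1.0 - e                                    # degree of diversification
          return d / d.sum()

      def topsis(X, w):
          """Relative closeness of each sample to the ideal solution."""
          Z = X / np.linalg.norm(X, axis=0)              # vector-normalize columns
          V = Z * w
          best, worst = V.max(axis=0), V.min(axis=0)     # benefit-type indicators
          d_best = np.linalg.norm(V - best, axis=1)
          d_worst = np.linalg.norm(V - worst, axis=1)
          return d_worst / (d_best + d_worst)

      X = np.random.default_rng(2).uniform(1, 10, (27, 12))  # 27 wines x 12 parameters
      scores = topsis(X, entropy_weights(X))
      print(np.argsort(scores)[::-1][:5], "highest-scoring samples")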

  8. 不同去龋技术在乳牙龋病治疗中的应用疗效分析%Analysis of Different Techniques Applied in the Dental Caries Treatment of Deciduous Teeth

    Institute of Scientific and Technical Information of China (English)

    胡静; 陈增力; 刘继延; 王丽英; 赵文峰

    2015-01-01

    Objective:To compare the effect of conventional mechanical caries removal treatment,atraumatic restorative treatment (ART) and Carisolv chemical caries removal treatment applied in the dental caries treatment of primary teeth.Methods:96 pediatric patients diagnosed of mild/moderate dental caries in deciduous teeth from January 2011 to June 2012 were randomly divided into conventional mechanical removal treatment group,atmumatic restorative treatment (ART) group and Carisolv chemical caries removal treatment group with 60 dentals each.Then operative reaction,operation time and long-term efficacy were recorded and analyzed.Results:The efficacy of dental removal in conventional mechanical caries removal group and Carisolv chemical caries removal treatment group were significantly better than ART group (P<0.05).There was no significant difference in the operation time spent among all treatment groups (P>0.05).The number of intraoperative pain occurs in ART group and Carisolv chemical caries removal treatment group were lower than that on the conventional mechanical caries removal group (P<0.05).In one-year post-treatment,secondary caries incidence in conventional mechanical caries removal group and Carisolv chemical caries removal treatment group were significantly lower than that in ART group(P<0.05).However,there was no significant difference in the incidence of break/loss dental filling among all treatment groups (P>0.05).Conclusions:Carisolv chemical caries removal treatment can effectively relieve intmoperative pain and recurrence rate after operation,which is worthy of promotion and application in clinical treatment of deciduous teeth.Caries removal effects and long-term efficacy of ART may limit its widespread application.%目的:比较传统机械切割法、非创伤性充填法(atraumatic restorative treatment,ART)和Carisolv化学法在临床乳牙龋病治疗中的应用效果差异.方法:选取2011年1月-2012年6月来我科就诊的5-8

  9. Comparison of multivariate calibration techniques applied to experimental NIR data sets

    OpenAIRE

    Centner, V; Verdu-Andres, J; Walczak, B; Jouan-Rimbaud, D; Despagne, F; Pasti, L; Poppi, R; Massart, DL; de Noord, OE

    2000-01-01

    The present study compares the performance of different multivariate calibration techniques applied to four near-infrared data sets when test samples are well within the calibration domain. Three types of problems are discussed: nonlinear calibration, calibration using heterogeneous data sets, and calibration in the presence of irrelevant information in the set of predictors. Recommendations are derived from the comparison, which should help to guide a nonchemometrician through th...

  10. Parallelization of events generation for data analysis techniques

    CERN Document Server

    Lazzaro, A

    2010-01-01

    With the startup of the LHC experiments at CERN, the involved community is now focusing on the analysis of the collected data. The complexity of the data analyses will be a key factor for finding eventual new phenomena. For this reason, many data analysis tools implementing several data analysis techniques have been developed over the last several years. The goal of these techniques is to discriminate events of interest and to measure parameters on a given input sample of events, which are themselves defined by several variables. Also particularly important is the possibility of repeating the determination of the parameters by applying the procedure to several simulated samples, which are generated using Monte Carlo techniques and knowledge of the probability density functions of the input variables; this procedure achieves a better estimation of the results. Depending on the number of variables, the complexity of their probability density functions, the number of events, and the number of samples to g...
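
    The embarrassingly parallel structure is visible in a toy version: each pseudo-experiment draws events from an assumed probability density and fits a parameter, and the pseudo-experiments run independently across workers. A sketch with Python's multiprocessing; the exponential-decay model and lifetime value are illustrative choices, not from the paper:

      import numpy as np
      from multiprocessing import Pool

      TRUE_TAU = 2.2   # assumed lifetime parameter of the event p.d.f.

      def pseudo_experiment(seed, n_events=10_000):
          """Generate one Monte Carlo sample and estimate the parameter.
          For an exponential p.d.f., the ML estimate is the sample mean."""
          rng = np.random.default_rng(seed)
          events = rng.exponential(TRUE_TAU, n_events)
          return events.mean()

      if __name__ == "__main__":
          with Pool() as pool:                     # one worker per CPU core
              estimates = pool.map(pseudo_experiment, range(500))
          est = np.asarray(estimates)
          print(f"tau = {est.mean():.4f} +/- {est.std(ddof=1):.4f}")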

  11. Mixture experiment techniques for reducing the number of components applied for modeling waste glass sodium release

    Energy Technology Data Exchange (ETDEWEB)

    Piepel, G.; Redgate, T. [Pacific Northwest National Lab., Richland, WA (United States). Statistics Group

    1997-12-01

    Statistical mixture experiment techniques were applied to a waste glass data set to investigate the effects of the glass components on Product Consistency Test (PCT) sodium release (NR) and to develop a model for PCT NR as a function of the component proportions. The mixture experiment techniques indicate that the waste glass system can be reduced from nine to four components for purposes of modeling PCT NR. Empirical mixture models containing four first-order terms and one or two second-order terms fit the data quite well, and can be used to predict the NR of any glass composition in the model domain. The mixture experiment techniques produce a better model in less time than required by another approach.
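
    A Scheffé mixture model of the form named above (four first-order terms plus a second-order cross-product term, with component proportions summing to one) is an ordinary least-squares fit once the cross-product column is built. A generic sketch with random proportions and a synthetic response, not the waste-glass data:

      import numpy as np

      rng = np.random.default_rng(3)
      # Random 4-component proportions on the simplex (rows sum to 1)
      X = rng.dirichlet(np.ones(4), size=30)

      # Synthetic ln(NR)-like response with one second-order (x1*x2) effect
      y = X @ np.array([0.5, 1.2, -0.3, 2.0]) + 4.0 * X[:, 0] * X[:, 1]
      y += rng.normal(scale=0.01, size=len(y))

      # Scheffe model: no intercept; columns are x1..x4 plus the x1*x2 cross term
      D = np.column_stack([X, X[:, 0] * X[:, 1]])
      coef, *_ = np.linalg.lstsq(D, y, rcond=None)
      print(coef)   # recovers the first-order terms and the cross-product term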

  12. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method incorporating edge detection, Markov random fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation is obtained with K-means clustering and the minimum-distance rule. The region process is then modeled by an MRF to obtain an image that contains different intensity regions. Gradient values are calculated, and the watershed technique is applied. The DIS calculation is used at each pixel to identify all edges (weak or strong) in the image; the resulting DIS map serves as prior knowledge about possible region boundaries for the subsequent MRF step, which yields an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated, and the final edge map is obtained by a merging process based on averaged intensity mean values. Common edge detectors are run on the MRF-segmented image and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
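
    The DIS map at the front of this pipeline is an edge-strength image; a gradient-magnitude map computed with Sobel filters is a standard stand-in for it (the paper's exact DIS definition may differ):

      import numpy as np
      from scipy.ndimage import sobel

      def edge_strength_map(image):
          """Gradient-magnitude edge map, a DIS-like 'difference in strength'
          image: high values mark likely region boundaries."""
          gx = sobel(image.astype(float), axis=1)
          gy = sobel(image.astype(float), axis=0)
          return np.hypot(gx, gy)

      # Synthetic two-region image with a vertical boundary
      img = np.zeros((64, 64))
      img[:, 32:] = 1.0
      dis = edge_strength_map(img)
      print(dis[:, 30:34].max(), "- peak response sits on the boundary column")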

  13. Técnicas moleculares aplicadas à microbiologia de alimentos = Molecular techniques applied to food microbiology

    Directory of Open Access Journals (Sweden)

    Eliezer Ávila Gandra

    2008-01-01

    Full Text Available Beginning in the 1980s, molecular techniques became an alternative to the phenotypic methods traditionally used in food microbiology, a substitution accelerated by the advent of the polymerase chain reaction (PCR). This article reviews the main molecular techniques used as tools in food microbiology, from those developed earliest, such as plasmid profile analysis, to contemporary techniques such as real-time PCR. The characteristics, advantages and disadvantages of these techniques are discussed, evaluating their potential to overcome the limitations of traditional techniques.

  14. Visualization Techniques Applied to 155-mm Projectile Analysis

    Science.gov (United States)

    2014-11-01

    Magnus characteristics of finned and nonfinned projectiles. AIAA Journal. 1965; 3(1):83–90. 7. Pechier M, Guillen P, Cayzac R, Magnus effect over...of the projectile. The particle paths cross over the fins causing the interaction effects of the canards on the fins. Understanding these...interaction effects is critical to the understanding of the projectile aerodynamics. The use of 3 viewpoints in the animation allows for simultaneous viewing

  15. Animal Research in the "Journal of Applied Behavior Analysis"

    Science.gov (United States)

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  16. Negative Reinforcement in Applied Behavior Analysis: An Emerging Technology.

    Science.gov (United States)

    Iwata, Brian A.

    1987-01-01

    The article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. Current research suggests the emergence of an applied technology on negative reinforcement.…

  17. Applied Thinking for Intelligence Analysis: A Guide for Practitioners

    Directory of Open Access Journals (Sweden)

    Trista M. Bailey

    2015-03-01

    Full Text Available Book Review -- Applied Thinking for Intelligence Analysis: A Guide for Practitioners by Charles Vandepeer, PhD, Air Power Development Centre, Department of Defence, Canberra, Australia, 2014, 106 pages, ISBN 13: 9781925062045, Reviewed by Trista M. Bailey

  18. Applying Discourse Analysis in ELT: a Five Cs Model

    Institute of Scientific and Technical Information of China (English)

    肖巧慧

    2009-01-01

    Based on a discussion of definitions on Discourse analysis,discourse is regard as layers consist of five elements--cohesion, coherence, culture, critique and context. Moreover, we focus on applying DA in ELT.

  19. Wire-mesh and ultrasound techniques applied for the characterization of gas-liquid slug flow

    Energy Technology Data Exchange (ETDEWEB)

    Ofuchi, Cesar Y.; Sieczkowski, Wytila Chagas; Neves Junior, Flavio; Arruda, Lucia V.R.; Morales, Rigoberto E.M.; Amaral, Carlos E.F.; Silva, Marco J. da [Federal University of Technology of Parana, Curitiba, PR (Brazil)], e-mails: ofuchi@utfpr.edu.br, wytila@utfpr.edu.br, neves@utfpr.edu.br, lvrarruda@utfpr.edu.br, rmorales@utfpr.edu.br, camaral@utfpr.edu.br, mdasilva@utfpr.edu.br

    2010-07-01

    Gas-liquid two-phase flows are found in a broad range of industrial applications, such as chemical, petrochemical and nuclear industries and quite often determine the efficiency and safety of process and plants. Several experimental techniques have been proposed and applied to measure and quantify two-phase flows so far. In this experimental study the wire-mesh sensor and an ultrasound technique are used and comparatively evaluated to study two-phase slug flows in horizontal pipes. The wire-mesh is an imaging technique and thus appropriated for scientific studies while ultrasound-based technique is robust and non-intrusive and hence well suited for industrial applications. Based on the measured raw data it is possible to extract some specific slug flow parameters of interest such as mean void fraction and characteristic frequency. The experiments were performed in the Thermal Sciences Laboratory (LACIT) at UTFPR, Brazil, in which an experimental two-phase flow loop is available. The experimental flow loop comprises a horizontal acrylic pipe of 26 mm diameter and 9 m length. Water and air were used to produce the two phase flow under controlled conditions. The results show good agreement between the techniques. (author)

  20. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.

  1. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface is of great importance for convenience and safety of driving. So the investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are widely fulfilled for controlling a geometric parameters and detecting defects in the road surface. Photogrammetry as accurate non-contact measuring method provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis can have great variation from tenths of millimetre to hundreds meters and more. So a set of techniques is needed to meet all requirements of road parameters estimation. Two photogrammetric techniques for road surface analysis are presented: for accurate measuring of road pavement and for road surface reconstruction based on imagery obtained from unmanned aerial vehicle. The first technique uses photogrammetric system based on structured light for fast and accurate surface 3D reconstruction and it allows analysing the characteristics of road texture and monitoring the pavement behaviour. The second technique provides dense 3D model road suitable for road macro parameters estimation.

  2. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface is of great importance for convenience and safety of driving. So the investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are widely fulfilled for controlling a geometric parameters and detecting defects in the road surface. Photogrammetry as accurate non-contact measuring method provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis can have great variation from tenths of millimetre to hundreds meters and more. So a set of techniques is needed to meet all requirements of road parameters estimation. Two photogrammetric techniques for road surface analysis are presented: for accurate measuring of road pavement and for road surface reconstruction based on imagery obtained from unmanned aerial vehicle. The first technique uses photogrammetric system based on structured light for fast and accurate surface 3D reconstruction and it allows analysing the characteristics of road texture and monitoring the pavement behaviour. The second technique provides dense 3D model road suitable for road macro parameters estimation.

  3. Applied Missing Data Analysis. Methodology in the Social Sciences Series

    Science.gov (United States)

    Enders, Craig K.

    2010-01-01

    Walking readers step by step through complex concepts, this book translates missing data techniques into something that applied researchers and graduate students can understand and utilize in their own research. Enders explains the rationale and procedural details for maximum likelihood estimation, Bayesian estimation, multiple imputation, and…

  4. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

    Full Text Available This study evaluates the concept of success in project management that is applicable to the IT universe, from the classical theory associated with the techniques of project management. Therefore, it applies the theoretical analysis associated to the context of information technology in enterprises as well as the classic literature of traditional project management, focusing on its application in business information technology. From the literature developed in the first part of the study, four propositions were prepared for study which formed the basis for the development of the field research with three large companies that develop projects of Information Technology. The methodology used in the study predicted the development of the multiple case study. Empirical evidence suggests that the concept of success found in the classical literature in project management adjusts to the environment management of IT projects. Showed that it is possible to create the model of standard IT projects in order to replicate it in future derivatives projects, which depends on the learning acquired at the end of a long and continuous process and sponsorship of senior management, which ultimately results in its merger into the company culture.

  5. Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers

    Science.gov (United States)

    1975-03-26

    AA~ TECHNICAL REPORT RF-75-2 yAbom 0 ROOT CAUSE ANALYSIS - A DIAGNOSTIC FAILURE ANALYSIS TECHNIQUE FOR MANAGERS Augustine E. Magistro Nuclear...through 1975. rB Augustine E. Magistro has participated in root cause analysis task tem including team member and Blue Ribbon A panel reviewer, team

  6. Software engineering techniques applied to agricultural systems an object-oriented and UML approach

    CERN Document Server

    Papajorgji, Petraq J

    2014-01-01

    Software Engineering Techniques Applied to Agricultural Systems presents cutting-edge software engineering techniques for designing and implementing better agricultural software systems based on the object-oriented paradigm and the Unified Modeling Language (UML). The focus is on the presentation of  rigorous step-by-step approaches for modeling flexible agricultural and environmental systems, starting with a conceptual diagram representing elements of the system and their relationships. Furthermore, diagrams such as sequential and collaboration diagrams are used to explain the dynamic and static aspects of the software system.    This second edition includes: a new chapter on Object Constraint Language (OCL), a new section dedicated to the Model-VIEW-Controller (MVC) design pattern, new chapters presenting details of two MDA-based tools – the Virtual Enterprise and Olivia Nova, and a new chapter with exercises on conceptual modeling.  It may be highly useful to undergraduate and graduate students as t...

  7. Applied methods and techniques for mechatronic systems modelling, identification and control

    CERN Document Server

    Zhu, Quanmin; Cheng, Lei; Wang, Yongji; Zhao, Dongya

    2014-01-01

    Applied Methods and Techniques for Mechatronic Systems brings together the relevant studies in mechatronic systems with the latest research from interdisciplinary theoretical studies, computational algorithm development and exemplary applications. Readers can easily tailor the techniques in this book to accommodate their ad hoc applications. The clear structure of each paper, background - motivation - quantitative development (equations) - case studies/illustration/tutorial (curve, table, etc.) is also helpful. It is mainly aimed at graduate students, professors and academic researchers in related fields, but it will also be helpful to engineers and scientists from industry. Lei Liu is a lecturer at Huazhong University of Science and Technology (HUST), China; Quanmin Zhu is a professor at University of the West of England, UK; Lei Cheng is an associate professor at Wuhan University of Science and Technology, China; Yongji Wang is a professor at HUST; Dongya Zhao is an associate professor at China University o...

  8. Mathematical Model and Artificial Intelligent Techniques Applied to a Milk Industry through DSM

    Science.gov (United States)

    Babu, P. Ravi; Divya, V. P. Sree

    2011-08-01

    The resources for electrical energy are depleting and hence the gap between the supply and the demand is continuously increasing. Under such circumstances, the option left is optimal utilization of available energy resources. The main objective of this chapter is to discuss about the Peak load management and overcome the problems associated with it in processing industries such as Milk industry with the help of DSM techniques. The chapter presents a generalized mathematical model for minimizing the total operating cost of the industry subject to the constraints. The work presented in this chapter also deals with the results of application of Neural Network, Fuzzy Logic and Demand Side Management (DSM) techniques applied to a medium scale milk industrial consumer in India to achieve the improvement in load factor, reduction in Maximum Demand (MD) and also the consumer gets saving in the energy bill.

  9. Applying Data Mining Techniques to Improve Information Security in the Cloud: A Single Cache System Approach

    Directory of Open Access Journals (Sweden)

    Amany AlShawi

    2016-01-01

    Full Text Available Presently, the popularity of cloud computing is gradually increasing day by day. The purpose of this research was to enhance the security of the cloud using techniques such as data mining with specific reference to the single cache system. From the findings of the research, it was observed that the security in the cloud could be enhanced with the single cache system. For future purposes, an Apriori algorithm can be applied to the single cache system. This can be applied by all cloud providers, vendors, data distributors, and others. Further, data objects entered into the single cache system can be extended into 12 components. Database and SPSS modelers can be used to implement the same.

  10. Geophysical techniques applied to urban planning in complex near surface environments. Examples of Zaragoza, NE Spain

    Science.gov (United States)

    Pueyo-Anchuela, Ó.; Casas-Sainz, A. M.; Soriano, M. A.; Pocoví-Juan, A.

    Complex geological shallow subsurface environments represent an important handicap in urban and building projects. The geological features of the Central Ebro Basin, with sharp lateral changes in Quaternary deposits, alluvial karst phenomena and anthropic activity can preclude the characterization of future urban areas only from isolated geomechanical tests or from non-correctly dimensioned geophysical techniques. This complexity is here analyzed in two different test fields, (i) one of them linked to flat-bottomed valleys with irregular distribution of Quaternary deposits related to sharp lateral facies changes and irregular preconsolidated substratum position and (ii) a second one with similar complexities in the alluvial deposits and karst activity linked to solution of the underlying evaporite substratum. The results show that different geophysical techniques allow for similar geological models to be obtained in the first case (flat-bottomed valleys), whereas only the application of several geophysical techniques can permit to correctly evaluate the geological model complexities in the second case (alluvial karst). In this second case, the geological and superficial information permit to refine the sensitivity of the applied geophysical techniques to different indicators of karst activity. In both cases 3D models are needed to correctly distinguish alluvial lateral sedimentary changes from superimposed karstic activity.

  11. Evaluation of hippocampal volume based on MRI applying manual and automatic segmentation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Doring, Thomas M.; Gasparetto, Emerson L. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil); Kubo, Tadeu T.A.; Domingues, Romeu C. [Clinica de Diagnostico por Imagem (CDPI), Rio de Janeiro, RJ (Brazil)

    2010-03-15

    Various segmentation techniques using MR sequences, including manual and automatic protocols, have been developed to optimize the determination of the hippocampal volume. For clinical application, automated methods with high reproducibility and accuracy potentially may be more efficient than manual volumetry. This study aims to compare the hippocampal volumes obtained from manual and automatic segmentation methods (FreeSurfer and FSL). The automatic segmentation method FreeSurfer showed high correlation. Comparing the absolute hippocampal volumes, there is an overestimation by the automated methods. Applying a correction factor to the automatic method, it may be an alternative for the estimation of the absolute hippocampal volume. (author)

  12. Energy saving techniques applied over a nation-wide mobile network

    DEFF Research Database (Denmark)

    Perez, Eva; Frank, Philipp; Micallef, Gilbert;

    2014-01-01

    Traffic carried over wireless networks has grown significantly in recent years and actual forecasts show that this trend is expected to continue. However, the rapid mobile data explosion and the need for higher data rates comes at a cost of increased complexity and energy consumption of the mobile...... on the energy consumption based on a nation-wide network of a leading European operator. By means of an extensive analysis, we show that with the proposed techniques significant energy savings can be realized....... networks. Although base station equipment is improving its energy efficiency by means of new power amplifiers and increased processing power, additional techniques are required to further reduce the energy consumption. In this paper, we evaluate different energy saving techniques and study their impact...

  13. An analysis technique for microstrip antennas

    Science.gov (United States)

    Agrawal, P. K.; Bailey, M. C.

    1977-01-01

    The paper presents a combined numerical and empirical approach to the analysis of microstrip antennas over a wide range of frequencies. The method involves representing the antenna by a fine wire grid immersed in a dielectric medium and then using Richmond's reaction formulation (1974) to evaluate the piecewise sinusoidal currents on the grid segments. The calculated results are then modified to account for the finite dielectric discontinuity. The method is applied to round and square microstrip antennas.

  14. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which is to study on man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in the control rooms using software simulation, and also develop human error analysis and application techniques. SACOM was developed to assess operator`s physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed trip analysis system including a procedure based on man-machine interaction analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips occurred from 1978 to 1994 to produce trip summary information, and for 79 cases induced by human errors time-lined man-machine interactions. The INSTEC, a database system of our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  15. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. I strate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of knowledge available at an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often involved in making informed and correct business decisions. Traditional analysis methods based on statistics fail to help processing unstructured texts and the society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  16. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    of a given process requires analysis of the underlying mechanisms, at best, at the molecular level. To reveal these mechanisms a number of different techniques may be applied: (1) detailed physiological studies, (2) metabolic flux analysis (MFA), (3) metabolic control analysis (MCA), (4) thermodynamic......Metabolic engineering has been defined as the purposeful modification of intermediary metabolism using recombinant DNA techniques. With this definition metabolic engineering includes: (1) inserting new pathways in microorganisms with the aim of producing novel metabolites, e.g., production...... of polyketides by Streptomyces; (2) production of heterologous peptides, e.g., production of human insulin, erythropoitin, and tPA; and (3) improvement of both new and existing processes, e.g., production of antibiotics and industrial enzymes. Metabolic engineering is a multidisciplinary approach, which involves...

  17. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm......The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, diaries, fruits...

  18. Pulsed remote field eddy current technique applied to non-magnetic flat conductive plates

    Science.gov (United States)

    Yang, Binfeng; Zhang, Hui; Zhang, Chao; Zhang, Zhanbin

    2013-12-01

    Non-magnetic metal plates are widely used in aviation and industrial applications. The detection of cracks in thick plate structures, such as multilayered structures of aircraft fuselage, has been challenging in nondestructive evaluation societies. The remote field eddy current (RFEC) technique has shown advantages of deep penetration and high sensitivity to deeply buried anomalies. However, the RFEC technique is mainly used to evaluate ferromagnetic tubes. There are many problems that should be fixed before the expansion and application of this technique for the inspection of non-magnetic conductive plates. In this article, the pulsed remote field eddy current (PRFEC) technique for the detection of defects in non-magnetic conducting plates was investigated. First, the principle of the PRFEC technique was analysed, followed by the analysis of the differences between the detection of defects in ferromagnetic and non-magnetic plain structures. Three different models of the PRFEC probe were simulated using ANSYS. The location of the transition zone, defect detection sensitivity and the ability to detect defects in thick plates using three probes were analysed and compared. The simulation results showed that the probe with a ferrite core had the highest detecting ability. The conclusions derived from the simulation study were also validated by conducting experiments.

  19. Signed directed social network analysis applied to group conflict

    DEFF Research Database (Denmark)

    Zheng, Quan; Skillicorn, David; Walther, Olivier

    2015-01-01

    Real-world social networks contain relationships of multiple different types, but this richness is often ignored in graph-theoretic modelling. We show how two recently developed spectral embedding techniques, for directed graphs (relationships are asymmetric) and for signed graphs (relationships...... are both positive and negative), can be combined. This combination is particularly appropriate for intelligence, terrorism, and law enforcement applications. We illustrate by applying the novel embedding technique to datasets describing conflict in North-West Africa, and show how unusual interactions can...

  20. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  1. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  2. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    ""Applied Data Analysis and Modeling for Energy Engineers and Scientists"" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  3. Negative reinforcement in applied behavior analysis: an emerging technology.

    OpenAIRE

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these area...

  4. UPLC: a preeminent technique in pharmaceutical analysis.

    Science.gov (United States)

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

    The pharmaceutical companies today are driven to create novel and more efficient tools to discover, develop, deliver and monitor the drugs. In this contest the development of rapid chromatographic method is crucial for the analytical laboratories. In precedent decade, substantial technological advances have been done in enhancing particle chemistry performance, improving detector design and in optimizing the system, data processors and various controls of chromatographic techniques. When all was blended together, it resulted in the outstanding performance via ultra-high performance liquid chromatography (UPLC), which holds back the principle of HPLC technique. UPLC shows a dramatic enhancement in speed, resolution as well as the sensitivity of analysis by using particle size less than 2 pm and the system is operational at higher pressure, while the mobile phase could be able to run at greater linear velocities as compared to HPLC. This technique is considered as a new focal point in field of liquid chromatographic studies. This review focuses on the basic principle, instrumentation of UPLC and its advantages over HPLC, furthermore, this article emphasizes various pharmaceutical applications of this technique.

  5. A Comparative Analysis of Biomarker Selection Techniques

    Directory of Open Access Journals (Sweden)

    Nicoletta Dessì

    2013-01-01

    Full Text Available Feature selection has become the essential step in biomarker discovery from high-dimensional genomics data. It is recognized that different feature selection techniques may result in different set of biomarkers, that is, different groups of genes highly correlated to a given pathological condition, but few direct comparisons exist which quantify these differences in a systematic way. In this paper, we propose a general methodology for comparing the outcomes of different selection techniques in the context of biomarker discovery. The comparison is carried out along two dimensions: (i measuring the similarity/dissimilarity of selected gene sets; (ii evaluating the implications of these differences in terms of both predictive performance and stability of selected gene sets. As a case study, we considered three benchmarks deriving from DNA microarray experiments and conducted a comparative analysis among eight selection methods, representatives of different classes of feature selection techniques. Our results show that the proposed approach can provide useful insight about the pattern of agreement of biomarker discovery techniques.

  6. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  7. Micropillar compression technique applied to micron-scale mudstone elasto-plastic deformation.

    Energy Technology Data Exchange (ETDEWEB)

    Michael, Joseph Richard; Chidsey, Thomas (Utah Geological Survey, Salt Lake City, UT); Heath, Jason E.; Dewers, Thomas A.; Boyce, Brad Lee; Buchheit, Thomas Edward

    2010-12-01

    Mudstone mechanical testing is often limited by poor core recovery and sample size, preservation and preparation issues, which can lead to sampling bias, damage, and time-dependent effects. A micropillar compression technique, originally developed by Uchic et al. 2004, here is applied to elasto-plastic deformation of small volumes of mudstone, in the range of cubic microns. This study examines behavior of the Gothic shale, the basal unit of the Ismay zone of the Pennsylvanian Paradox Formation and potential shale gas play in southeastern Utah, USA. Precision manufacture of micropillars 5 microns in diameter and 10 microns in length are prepared using an ion-milling method. Characterization of samples is carried out using: dual focused ion - scanning electron beam imaging of nano-scaled pores and distribution of matrix clay and quartz, as well as pore-filling organics; laser scanning confocal (LSCM) 3D imaging of natural fractures; and gas permeability, among other techniques. Compression testing of micropillars under load control is performed using two different nanoindenter techniques. Deformation of 0.5 cm in diameter by 1 cm in length cores is carried out and visualized by a microscope loading stage and laser scanning confocal microscopy. Axisymmetric multistage compression testing and multi-stress path testing is carried out using 2.54 cm plugs. Discussion of results addresses size of representative elementary volumes applicable to continuum-scale mudstone deformation, anisotropy, and size-scale plasticity effects. Other issues include fabrication-induced damage, alignment, and influence of substrate.

  8. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Full Text Available Based on the movement process of throwing and in order to further improve the throwing technique of our country, this paper will first illustrate the main influence factors which will affect the shot distance via the mutual combination of movement equation and geometrical analysis. And then, it will give the equation of the acting force that the throwing athletes have to bear during throwing movement; and will reach the speed relationship between each arthrosis during throwing and batting based on the kinetic analysis of the throwing athletes’ arms while throwing. This paper will obtain the momentum relationship of the athletes’ each arthrosis by means of rotational inertia analysis; and then establish a restricted particle dynamics equation from the Lagrange equation. The obtained result shows that the momentum of throwing depends on the momentum of the athletes’ wrist joints while batting.

  9. Review of Intelligent Techniques Applied for Classification and Preprocessing of Medical Image Data

    Directory of Open Access Journals (Sweden)

    H S Hota

    2013-01-01

    Full Text Available Medical image data like ECG, EEG and MRI, CT-scan images are the most important way to diagnose disease of human being in precise way and widely used by the physician. Problem can be clearly identified with the help of these medical images. A robust model can classify the medical image data in better way .In this paper intelligent techniques like neural network and fuzzy logic techniques are explored for MRI medical image data to identify tumor in human brain. Also need of preprocessing of medical image data is explored. Classification technique has been used extensively in the field of medical imaging. The conventional method in medical science for medical image data classification is done by human inspection which may result misclassification of data sometime this type of problem identification are impractical for large amounts of data and noisy data, a noisy data may be produced due to some technical fault of the machine or by human errors and can lead misclassification of medical image data. We have collected number of papers based on neural network and fuzzy logic along with hybrid technique to explore the efficiency and robustness of the model for brain MRI data. It has been analyzed that intelligent model along with data preprocessing using principal component analysis (PCA and segmentation may be the competitive model in this domain.

  10. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameters estimation must base itself on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. So this paper presents a time-domain operating modal identification global scheme and a frequency-domain scheme from output-only by coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.

  11. Archaeometry: nuclear and conventional techniques applied to the archaeological research; Arqueometria: tecnicas nucleares y convencionales aplicadas a la investigacion arqueologica

    Energy Technology Data Exchange (ETDEWEB)

    Esparza L, R.; Cardenas G, E. (ed.)

    2005-07-01

    The book that now is presented is formed by twelve articles that approach from different perspective topics as the archaeological prospecting, the analysis of the pre hispanic and colonial ceramic, the obsidian and the mural painting, besides dating and questions about the data ordaining. Following the chronological order in which the exploration techniques and laboratory studies are required, there are presented in the first place the texts about the systematic and detailed study of the archaeological sites, later we pass to relative topics to the application of diverse nuclear techniques as PIXE, RBS, XRD, NAA, SEM, Moessbauer spectroscopy and other conventional techniques. The multidisciplinary is an aspect that highlights in this work, that which owes to the great specialization of the work that is presented even in the archaeological studies including in the open ground of the topography, mapping, excavation and, of course, in the laboratory tests. Most of the articles are the result of several years of investigation and it has been consigned in the responsibility of each article. The texts here gathered emphasize the technical aspects of each investigation, the modern compute systems applied to the prospecting and the archaeological mapping, the chemical and physical analysis of organic materials, of metal artifacts, of diverse rocks used in the pre hispanic epoch, of mural and ceramic paintings, characteristics that justly underline the potential of the collective works. (Author)

  12. Comparative Analysis of Hand Gesture Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Arpana K. Patel

    2015-03-01

    Full Text Available During past few years, human hand gesture for interaction with computing devices has continues to be active area of research. In this paper survey of hand gesture recognition is provided. Hand Gesture Recognition is contained three stages: Pre-processing, Feature Extraction or matching and Classification or recognition. Each stage contains different methods and techniques. In this paper define small description of different methods used for hand gesture recognition in existing system with comparative analysis of all method with its benefits and drawbacks are provided.

  13. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

    Full Text Available We describe how to use multivariate analysis of complex TOF-SIMS spectra introducing the method of random projections. The technique allows us to do full clustering and classification of the measured mass spectra. In this paper we use the tool for classification purposes. The presentation describes calibration experiments of 19 minerals on Ag and Au substrates using positive mode ion spectra. The discrimination between individual minerals gives a crossvalidation Cohen κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.

  14. Strategy for applying scaling technique to water retention curves of forest soils

    Science.gov (United States)

    Hayashi, Y.; Kosugi, K.; Mizuyama, T.

    2009-12-01

    Describing the infiltration of water in soils on a forested hillslope requires the information of spatial variability of water retention curve (WRC). By using a scaling technique, Hayashi et al. (2009), found that the porosity mostly characterizes the spatial variability of the WRCs on a forested hillslope. This scaling technique was based on a model, which assumes a lognormal pore size distribution and contains three parameters: the median of log-transformed pore radius, ψm, the variance of log-transformed pore radius, σ, and the effective porosity, θe. Thus, in the scaling method proposed by Hayashi et al. (2009), θe is a scaling factor, which should be determined for each individual soil, and that ψm and σ are reference parameter common for the whole data set. They examined this scaling method using θe calculated as a difference between the observed saturated water content and water content observed at ψ = -1000 cm for each sample and, ψm and σ derived from the whole data set of WRCs on the slope. Then it was showed that this scaling method could explain almost 90 % of the spatial variability in WRCs on the forested hillslope. However, this method requires the whole data set of WRCs for deriving the reference parameters (ψm and σ). For applying the scaling technique more practically, in this study, we tested a scaling method using the reference parameter derived from the WRCs at a small part of the slope. In order to examine the proposed scaling method, the WRCs for the 246 undisturbed forest soil samples, collected at 15 points distributed from downslope to upslope segments, were observed. In the proposed scaling method, we optimized the common ψm and σ to the WRCs for six soil samples, collected at one point on the middle-slope, and applied these parameters to a reference parameter for the whole data sets. The scaling method proposed by this study exhibited an increase of only 6 % in the residual sum of squares as compared with that of the method

  15. Uncertainty Analysis Technique for OMEGA Dante Measurements

    Energy Technology Data Exchange (ETDEWEB)

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV to 10 keV. It is a main diagnostics installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  16. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their appli­ cations in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness eval­ uated, and instructions provided for their practical application. Be­ sides the conventional methods, newer methods arediscussed, such as the spectral analysis ofrandom processes by fitting models to the ob­ served data, maximum-entropy spectral analysis and maximum-like­ lihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstation­ ary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and fil­ ter theory. The importance and possibilities ofspectral analysis and filter theory in geophysics for data acquisition, processing an...

  17. New Region Growing based on Thresholding Technique Applied to MRI Data

    Directory of Open Access Journals (Sweden)

    A. Afifi

    2015-06-01

    Full Text Available This paper proposes an optimal region growing threshold for the segmentation of magnetic resonance images (MRIs. The proposed algorithm combines local search procedure with thresholding region growing to achieve better generic seeds and optimal thresholds for region growing method. A procedure is used to detect the best possible seeds from a set of data distributed all over the image as a high accumulator of the histogram. The output seeds are fed to the local search algorithm to extract the best seeds around initial seeds. Optimal thresholds are used to overcome the limitations of region growing algorithm and to select the pixels sequentially in a random walk starting at the seed point. The proposed algorithm works automatically without any predefined parameters. The proposed algorithm is applied to the challenging application "gray matter/white matter" segmentation datasets. The experimental results compared with other segmentation techniques show that the proposed algorithm produces more accurate and stable results.

  18. Electron Correlation Microscopy: A New Technique for Studying Local Atom Dynamics Applied to a Supercooled Liquid.

    Science.gov (United States)

    He, Li; Zhang, Pei; Besser, Matthew F; Kramer, Matthew Joseph; Voyles, Paul M

    2015-08-01

    Electron correlation microscopy (ECM) is a new technique that utilizes time-resolved coherent electron nanodiffraction to study dynamic atomic rearrangements in materials. It is the electron scattering equivalent of photon correlation spectroscopy with the added advantage of nanometer-scale spatial resolution. We have applied ECM to a Pd40Ni40P20 metallic glass, heated inside a scanning transmission electron microscope into a supercooled liquid to measure the structural relaxation time τ between the glass transition temperature T g and the crystallization temperature, T x . τ determined from the mean diffraction intensity autocorrelation function g 2(t) decreases with temperature following an Arrhenius relationship between T g and T g +25 K, and then increases as temperature approaches T x . The distribution of τ determined from the g 2(t) of single speckles is broad and changes significantly with temperature.

  19. Solar coronal magnetic fields derived using seismology techniques applied to omnipresent sunspot waves

    CERN Document Server

    Jess, D B; Ryans, R S I; Christian, D J; Keys, P H; Mathioudakis, M; Mackay, D H; Prasad, S Krishna; Banerjee, D; Grant, S D T; Yau, S; Diamond, C

    2016-01-01

    Sunspots on the surface of the Sun are the observational signatures of intense manifestations of tightly packed magnetic field lines, with near-vertical field strengths exceeding 6,000 G in extreme cases. It is well accepted that both the plasma density and the magnitude of the magnetic field strength decrease rapidly away from the solar surface, making high-cadence coronal measurements through traditional Zeeman and Hanle effects difficult since the observational signatures are fraught with low-amplitude signals that can become swamped with instrumental noise. Magneto-hydrodynamic (MHD) techniques have previously been applied to coronal structures, with single and spatially isolated magnetic field strengths estimated as 9-55 G. A drawback with previous MHD approaches is that they rely on particular wave modes alongside the detectability of harmonic overtones. Here we show, for the first time, how omnipresent magneto-acoustic waves, originating from within the underlying sunspot and propagating radially outwa...

  20. A systematic review of applying modern software engineering techniques to developing robotic systems

    Directory of Open Access Journals (Sweden)

    Claudia Pons

    2012-04-01

    Full Text Available Robots have become collaborators in our daily life. While robotic systems become more and more complex, the need to engineer their software development grows as well. The traditional approaches used in developing these software systems are reaching their limits; currently used methodologies and tools fall short of addressing the needs of such complex software development. Separating robotics’ knowledge from short-cycled implementation technologies is essential to foster reuse and maintenance. This paper presents a systematic review (SLR of the current use of modern software engineering techniques for developing robotic software systems and their actual automation level. The survey was aimed at summarizing existing evidence concerning applying such technologies to the field of robotic systems to identify any gaps in current research to suggest areas for further investigation and provide a background for positioning new research activities.

  1. Applying Multi-Criteria Decision-Making Techniques to Prioritize Agility Drivers

    Directory of Open Access Journals (Sweden)

    Ahmad Jafarnejad

    2013-07-01

    Full Text Available It seems that to recognize and classify the factors affecting organizational agility and need to specify the amount of their importance for the organization is essential to preserve survival and success in today's environment. This paper reviews the concept of agility and its division in the following indicators included the factors of motivations organizational agility that have been ranked in terms of level of importance and their influence by the techniques of MCDM. The inner complexity, suppliers, competition, customer needs, market, technology and social factors are the most important factors affecting organizational agility that can evaluate the following indicators and apply them and re-engineering processes, reviews and predictions of customer needs and better understanding of competitive environment and supply chain specify organizational agility and success ultimately.

  2. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler shifted time intervals are computed for long duration periodic sources; optimally weighted cross-correlations for stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data obtained will be discussed. Finally, some results on cancellation of systematic noises in laser interferometric space antenna (LISA) will be presented and future directions indicated.

  3. Application of Electromigration Techniques in Environmental Analysis

    Science.gov (United States)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentration of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made, widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, including the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the domain of capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is made on pre-capillary and on-capillary chromatography and electrophoresis-based concentration of analytes and detection improvement.

  4. Full-field speckle correlation technique as applied to blood flow monitoring

    Science.gov (United States)

    Vilensky, M. A.; Agafonov, D. N.; Timoshina, P. A.; Shipovskaya, O. V.; Zimnyakov, D. A.; Tuchin, V. V.; Novikov, P. A.

    2011-03-01

    The results of experimental study of monitoring the microcirculation in tissue superficial layers of the internal organs at gastro-duodenal hemorrhage with the use of laser speckles contrast analysis technique are presented. The microcirculation monitoring was provided in the course of the laparotomy of rat abdominal cavity in the real time. Microscopic hemodynamics was analyzed for small intestine and stomach under different conditions (normal state, provoked ischemia, administration of vasodilative agents such as papaverine, lidocaine). The prospects and problems of internal monitoring of micro-vascular flow in clinical conditions are discussed.

  5. Micro-spectroscopic techniques applied to characterization of varnished archeological findings

    Science.gov (United States)

    Barone, G.; Ioppolo, S.; Majolino, D.; Migliardo, P.; Ponterio, R.

    2000-04-01

    This work reports an analysis on terracotta varnished finding recovered in east Sicily area (Messina). We have performed FTIR micro-spectroscopy and electronic microscopy (SEM)measurements in order to recognize the elemental constituents of the varnished surfaces. Furthermore, for all the samples, a study on the bulk has been performed by Fourier Transform Infrared Absorption. The analyzed samples consist of a number of pottery fragments belonging to archaic and classical ages, varnished in black and red colors. The obtained data furnished useful information about composition of decorated surfaces and bulk matrixes, about baking temperature, manufacture techniques and alteration mechanisms of findings due to the long burial.

  6. Imaging techniques applied to the study of fluids in porous media

    Energy Technology Data Exchange (ETDEWEB)

    Tomutsa, L.; Doughty, D.; Brinkmeyer, A.; Mahmood, S.

    1992-06-01

    Improved imaging techniques were used to study the dynamics of fluid flow and trapping at various scales in porous media. Two-phase and three-phase floods were performed and monitored by computed tomography (CT) scanning and/or nuclear magnetic resonance imaging (NMRI) microscopy. Permeability-porosity correlations obtained from image analysis were combined with porosity distributions from CT scanning to generate spatial permeability distributions within the core, which were used in simulations of two-phase floods. Simulation-derived saturation distributions of two-phase processes showed very good agreement with the CT-measured values.
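
    The mapping step described above can be sketched as follows; the power-law form and the coefficients a and b are hypothetical stand-ins for an image-analysis-derived permeability-porosity correlation.

    ```python
    import numpy as np

    # Stand-in for one CT-derived porosity slice of the core.
    porosity = np.random.default_rng(6).uniform(0.10, 0.25, size=(32, 32))

    # Hypothetical fitted correlation k = a * phi**b, permeability in mD.
    a, b = 2.0e4, 3.5
    permeability = a * porosity ** b   # spatial permeability field for simulation

    print(permeability.mean(), permeability.max())
    ```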

  7. VIDEOGRAMMETRIC RECONSTRUCTION APPLIED TO VOLCANOLOGY: PERSPECTIVES FOR A NEW MEASUREMENT TECHNIQUE IN VOLCANO MONITORING

    Directory of Open Access Journals (Sweden)

    Emmanuelle Cecchi

    2011-05-01

    This article deals with videogrammetric reconstruction of volcanic structures. As a first step, the method is tested in the laboratory. The objective is to reconstruct small sand and plaster cones, analogous to volcanoes, that deform with time. The initial stage consists of modelling the sensor (internal parameters) and calculating its orientation and position in space, using a multi-view calibration method. In practice two sets of views are taken: a first one around a calibration target and a second one around the studied object. Both sets are combined in the calibration software to simultaneously compute the internal parameters modelling the sensor and the external parameters giving the spatial location of each view around the cone. Following this first stage, an N-view reconstruction process is carried out. The principle is as follows: an initial 3D model of the cone is created and then iteratively deformed to fit the real object. The deformation of the meshed model is based on a texture coherence criterion. At present, this reconstruction method and its precision are being validated at laboratory scale. The objective will then be to follow analogue model deformation with time using successive reconstructions. In the future, the method will be applied to real volcanic structures. Modifications of the initial code will certainly be required; however, excellent reconstruction accuracy, valuable simplicity and flexibility of the technique are expected, compared to the classic stereophotogrammetric techniques used in volcanology.
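
    A minimal sketch of the multi-view calibration stage, using OpenCV with a planar chessboard standing in for the calibration target described above; the pattern size and the image directory are hypothetical.

    ```python
    import glob
    import cv2
    import numpy as np

    pattern = (9, 6)  # inner corners of the hypothetical chessboard target
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

    obj_points, img_points = [], []
    for fname in glob.glob("target_views/*.png"):   # hypothetical image set
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # Internal parameters (camera matrix, distortion) plus one pose per view.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("RMS reprojection error:", rms)
    ```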

  8. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jeffs, S.P., E-mail: s.p.jeffs@swansea.ac.uk [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Lancaster, R.J. [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Garcia, T.E. [IUTA (University Institute of Industrial Technology of Asturias), University of Oviedo, Edificio Departamental Oeste 7.1.17, Campus Universitario, 33203 Gijón (Spain)

    2015-06-11

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long-term behaviour extrapolated from short-term results. Long-term lifing techniques prove extremely useful in creep-dominated applications, such as the power generation industry and in particular nuclear plant, where large static loads are applied; equally, a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such, an equation has been developed, known as the k_SP method, which has been proven to be an effective tool across several material systems. The current work explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software, based on the uniaxial creep data, has also been implemented to characterise the SP deformation and help corroborate the experimental results.
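
    As a minimal illustration of the Monkman–Grant idea mentioned above, the sketch below fits the relation t_r * eps_min**m = C on log axes; all data values are placeholders, not the paper's measurements.

    ```python
    import numpy as np

    eps_min = np.array([1e-8, 5e-8, 2e-7, 1e-6])     # minimum creep rates (1/s)
    t_rupture = np.array([9e5, 2.1e5, 6e4, 1.3e4])   # rupture times (s)

    # Linearise: log t_r = log C - m * log eps_min, then ordinary least squares.
    slope, intercept = np.polyfit(np.log10(eps_min), np.log10(t_rupture), 1)
    print(f"Monkman-Grant exponent m = {-slope:.2f}, constant C = 10**{intercept:.2f}")
    ```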

  9. The Double Layer Methodology and the Validation of Eigenbehavior Techniques Applied to Lifestyle Modeling

    Science.gov (United States)

    Lamichhane, Bishal

    2017-01-01

    A novel methodology, the double layer methodology (DLM), for modeling an individual's lifestyle and its relationships with health indicators is presented. The DLM is applied to model behavioral routines emerging from self-reports of daily diet and activities, annotated by 21 healthy subjects over 2 weeks. Unsupervised clustering on the first layer of the DLM separated our population into two groups. Using eigendecomposition techniques on the second layer of the DLM, we could find activity and diet routines, predict behaviors in a portion of the day (with an accuracy of 88% for diet and 66% for activity), determine between-day and between-individual similarities, and detect an individual's membership in a group based on behavior (with an accuracy up to 64%). We found that clustering based on health indicators mapped back onto activity behaviors, but not onto diet behaviors. In addition, we showed the limitations of eigendecomposition for lifestyle applications, in particular when applied to noisy and sparse behavioral data such as dietary information. Finally, we proposed the use of the DLM for supporting adaptive and personalized recommender systems for stimulating behavior change. PMID:28133607
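
    The eigendecomposition step can be sketched as follows, treating each day as a vector of behaviour codes and extracting the leading "eigenbehaviors" by SVD; the shapes and random data are illustrative, not the study's annotations.

    ```python
    import numpy as np

    days = np.random.default_rng(1).random((14, 48))  # 14 days x 48 half-hour slots
    mean_day = days.mean(axis=0)
    U, S, Vt = np.linalg.svd(days - mean_day, full_matrices=False)

    eigenbehaviors = Vt[:3]                            # top-3 daily patterns
    weights = (days - mean_day) @ eigenbehaviors.T     # per-day loadings
    print("variance explained:", (S[:3] ** 2).sum() / (S ** 2).sum())
    ```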

  10. An acceleration technique for the Gauss-Seidel method applied to symmetric linear systems

    Directory of Open Access Journals (Sweden)

    Jesús Cajigas

    2014-06-01

    A preconditioning technique to improve the convergence of the Gauss-Seidel method applied to symmetric linear systems while preserving symmetry is proposed. The preconditioner is of the form I + K and can be applied an arbitrary number of times. It is shown that under certain conditions the application of the preconditioner a finite number of times reduces the matrix to a diagonal. A series of numerical experiments using matrices from spatial discretizations of partial differential equations demonstrates that both versions of the preconditioner, the point and the block version, exhibit lower iteration counts than the non-symmetric version.
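
    A minimal sketch of the idea follows, assuming a small symmetric positive definite system; the particular choice of K below is illustrative only and is not the paper's construction.

    ```python
    import numpy as np

    def gauss_seidel(A, b, tol=1e-10, maxit=10_000):
        """Plain Gauss-Seidel iteration; returns solution and iteration count."""
        x = np.zeros_like(b)
        L = np.tril(A)                    # D + strictly lower part
        U = A - L
        for k in range(1, maxit + 1):
            x_new = np.linalg.solve(L, b - U @ x)  # dense stand-in for forward solve
            if np.linalg.norm(x_new - x, np.inf) < tol:
                return x_new, k
            x = x_new
        return x, maxit

    A = np.array([[4.0, 1.0, 0.0], [1.0, 4.0, 1.0], [0.0, 1.0, 4.0]])
    b = np.array([1.0, 2.0, 3.0])

    # Illustrative choice of K (not the paper's rule): scaled strictly upper part.
    K = -np.triu(A, 1) / np.diag(A)[:, None]
    P = np.eye(3) + K
    A_hat, b_hat = P @ A @ P.T, P @ b     # congruence keeps the system symmetric

    x, it_plain = gauss_seidel(A, b)
    y, it_prec = gauss_seidel(A_hat, b_hat)
    print(it_plain, it_prec, np.allclose(x, P.T @ y))  # same solution recovered
    ```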

  11. Micropillar Compression Technique Applied to Micron-Scale Mudstone Elasto-Plastic Deformation

    Science.gov (United States)

    Dewers, T. A.; Boyce, B.; Buchheit, T.; Heath, J. E.; Chidsey, T.; Michael, J.

    2010-12-01

    Mudstone mechanical testing is often limited by poor core recovery and by sample size, preservation and preparation issues, which can lead to sampling bias, damage, and time-dependent effects. A micropillar compression technique, originally developed by Uchic et al. (2004), is here applied to elasto-plastic deformation of small volumes of mudstone, in the range of cubic microns. This study examines the behavior of the Gothic shale, the basal unit of the Ismay zone of the Pennsylvanian Paradox Formation and a potential shale gas play in southeastern Utah, USA. Micropillars 5 microns in diameter and 10 microns in length are precision-manufactured using an ion-milling method. Characterization of samples is carried out using: dual focused ion beam-scanning electron beam imaging of nano-scale pores and of the distribution of matrix clay and quartz, as well as pore-filling organics; laser scanning confocal microscopy (LSCM) 3D imaging of natural fractures; and gas permeability, among other techniques. Compression testing of micropillars under load control is performed using two different nanoindenter techniques. Deformation of cores 0.5 cm in diameter by 1 cm in length is carried out and visualized by a microscope loading stage and laser scanning confocal microscopy. Axisymmetric multistage compression testing and multi-stress-path testing is carried out using 2.54 cm plugs. Discussion of results addresses the size of representative elementary volumes applicable to continuum-scale mudstone deformation, anisotropy, and size-scale plasticity effects. Other issues include fabrication-induced damage, alignment, and the influence of the substrate. This work is funded by the US Department of Energy, Office of Basic Energy Sciences. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  12. Bioclimatic and vegetation mapping of a topographically complex oceanic island applying different interpolation techniques.

    Science.gov (United States)

    Garzón-Machado, Víctor; Otto, Rüdiger; del Arco Aguilar, Marcelino José

    2014-07-01

    Different spatial interpolation techniques have been applied to construct objective bioclimatic maps of La Palma, Canary Islands. Interpolation of climatic data on this topographically complex island with strong elevation and climatic gradients represents a challenge. Furthermore, meteorological stations are not evenly distributed over the island, with few stations at high elevations. We carried out spatial interpolations of the compensated thermicity index (Itc) and the annual ombrothermic index (Io) in order to obtain appropriate bioclimatic maps by using automatic interpolation procedures, and to establish their relation to potential vegetation units for constructing a climatophilous potential natural vegetation (CPNV) map. For this purpose, we used five interpolation techniques implemented in a GIS: inverse distance weighting (IDW), ordinary kriging (OK), ordinary cokriging (OCK), multiple linear regression (MLR) and MLR followed by ordinary kriging of the regression residuals. Two topographic variables (elevation and aspect), derived from a high-resolution digital elevation model (DEM), were included in OCK and MLR. The accuracy of the interpolation techniques was examined through error statistics on test data, derived from comparison of the predicted and measured values. The best results for both bioclimatic indices were obtained with the MLR method with interpolation of the residuals, which showed the highest R2 of the regression between observed and predicted values and the lowest root mean square errors. MLR with correction of interpolated residuals is an attractive interpolation method for bioclimatic mapping on this oceanic island, since it permits one to fully account for easily available geographic information while also taking into account local variation of the climatic data.
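
    Of the five interpolators compared, IDW is the simplest to sketch; the station coordinates and index values below are invented placeholders.

    ```python
    import numpy as np

    def idw(xy_obs, z_obs, xy_new, power=2.0):
        """Inverse distance weighting: weights fall off as distance**-power."""
        d = np.linalg.norm(xy_new[:, None, :] - xy_obs[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)          # avoid division by zero at stations
        w = d ** -power
        return (w @ z_obs) / w.sum(axis=1)

    stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    itc = np.array([220.0, 180.0, 150.0])   # e.g. thermicity index values
    grid = np.array([[0.5, 0.5], [0.2, 0.8]])
    print(idw(stations, itc, grid))
    ```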

  13. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching.

    Science.gov (United States)

    Joyce, B; Moxley, R A

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis.

  14. 2D and 3D optical diagnostic techniques applied to Madonna dei Fusi by Leonardo da Vinci

    Science.gov (United States)

    Fontana, R.; Gambino, M. C.; Greco, M.; Marras, L.; Materazzi, M.; Pampaloni, E.; Pelagotti, A.; Pezzati, L.; Poggi, P.; Sanapo, C.

    2005-06-01

    3D measurement and modelling have traditionally been applied to statues, buildings, archeological sites or similar large structures, but rarely to paintings. Recently, however, 3D measurements have been performed successfully also on easel paintings, allowing the painting's surface to be detected and documented. We used 3D models to integrate the results of various 2D imaging techniques on a common reference frame. These applications show how 3D shape information, complemented with 2D colour maps as well as with other types of sensory data, provides the most interesting information. The 3D data acquisition was carried out by means of two devices: a high-resolution laser micro-profilometer, composed of a commercial distance meter mounted on a scanning device, and a laser-line scanner. The 2D data acquisitions were carried out using a scanning device for simultaneous RGB colour imaging and IR reflectography, and a UV fluorescence multispectral image acquisition system. We present here the results of the techniques described, applied to the analysis of an important painting of the Italian Renaissance: `Madonna dei Fusi', attributed to Leonardo da Vinci.

  15. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  16. APPLYING OF GAS ANALYSIS IN DIAGNOSIS OF BRONCHOPULMONARY DISEASES

    Directory of Open Access Journals (Sweden)

    Ye. B. Bukreyeva

    2014-01-01

    Diseases of the bronchopulmonary system are among the leading causes of death. Most methods for diagnosing lung diseases are invasive or unsuitable for children and patients with severe disease. One promising method for clinical diagnosis and monitoring of bronchopulmonary disease activity is the analysis of human breath, using either directly exhaled breath or exhaled breath condensate. Breath analysis can be applied for diagnosis, long-term monitoring, and evaluation of the efficacy of treatment of bronchopulmonary diseases. Differential diagnosis between chronic obstructive pulmonary disease (COPD) and bronchial asthma is complicated because the two diseases differ in pathogenesis. Analysis of human breath makes it possible to explore the features of COPD and bronchial asthma and to improve the differential diagnosis of these diseases. Breath analysis can also be applied to the diagnosis of dangerous diseases such as tuberculosis and lung cancer. The analysis of exhaled air by spectroscopic methods is a new noninvasive way to diagnose bronchopulmonary diseases.

  17. Predicting Performance of Schools by Applying Data Mining Techniques on Public Examination Results

    Directory of Open Access Journals (Sweden)

    J. Macklin Abraham Navamani

    2015-02-01

    This study presents a systematic analysis of various features of higher grade school public examination results data from the state of Tamil Nadu, India, using different data mining classification algorithms to predict the performance of schools. Nowadays parents aim to select the right city and school, and to identify the factors which contribute to the success of their children's school results. Factors such as ethnic mix, medium of study, and geography could make a difference in results. The proposed work has a twofold focus: applying machine learning algorithms to predict school performance with satisfactory accuracy, and evaluating which data mining technique gives the better accuracy among the learning algorithms. It was found that there exist some apparent and some less noticeable attributes that demonstrate a strong correlation with student performance. Data were collected from a credible source, followed by data preparation and correlation analysis. The findings revealed that public examination results data are a very helpful predictor of the performance of a school, and that the overall accuracy was improved with the help of the AdaBoost technique.
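
    A minimal sketch of the AdaBoost step credited above with improving accuracy, using scikit-learn on synthetic stand-in features; the real attributes (ethnic mix, medium of study, geography, ...) are not reproduced here.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import cross_val_score

    # Stand-in for school records with a binary performance label.
    X, y = make_classification(n_samples=500, n_features=8, random_state=0)

    clf = AdaBoostClassifier(n_estimators=100, random_state=0)
    print("CV accuracy: %.3f" % cross_val_score(clf, X, y, cv=5).mean())
    ```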

  18. Vibrational techniques applied to photosynthesis: Resonance Raman and fluorescence line-narrowing.

    Science.gov (United States)

    Gall, Andrew; Pascal, Andrew A; Robert, Bruno

    2015-01-01

    Resonance Raman spectroscopy may yield precise information on the conformation of, and the interactions assumed by, the chromophores involved in the first steps of the photosynthetic process. Selectivity is achieved via resonance with the absorption transition of the chromophore of interest. Fluorescence line-narrowing spectroscopy is a complementary technique, in that it provides the same level of information (structure, conformation, interactions), but in this case for the emitting pigment(s) only (whether isolated or in an ensemble of interacting chromophores). The selectivity provided by these vibrational techniques allows for the analysis of pigment molecules not only when they are isolated in solvents, but also when embedded in soluble or membrane proteins and even, as shown recently, in vivo. They can be used, for instance, to relate the electronic properties of these pigment molecules to their structure and/or the physical properties of their environment. These techniques are even able to follow subtle changes in chromophore conformation associated with regulatory processes. After a short introduction to the physical principles that govern resonance Raman and fluorescence line-narrowing spectroscopies, the information content of the vibrational spectra of chlorophyll and carotenoid molecules is described in this article, together with the experiments which helped in determining which structural parameter(s) each vibrational band is sensitive to. A selection of applications is then presented, in order to illustrate how these techniques have been used in the field of photosynthesis, and what type of information has been obtained. This article is part of a Special Issue entitled: Vibrational spectroscopies and bioenergetic systems.

  19. Improving Credit Scorecard Modeling Through Applying Text Analysis

    Directory of Open Access Journals (Sweden)

    Omar Ghailan

    2016-04-01

    In credit card scoring and loans management, the prediction of the applicant's future behavior is an important decision support tool and a key factor in reducing the risk of loan default. Many data mining and classification approaches have been developed for the credit scoring purpose. To the best of our knowledge, building a credit scorecard by analyzing the textual data in the application form has not been explored so far. This paper proposes a comprehensive credit scorecard modeling technique that improves credit scorecards through employing textual data analysis. This study uses a sample of loan application forms from a financial institution providing loan services in Yemen, which represents a real-world situation of credit scoring and loan management. The sample contains a set of Arabic textual data attributes describing the applicants. A credit scoring model based on text mining pre-processing and logistic regression techniques is proposed and evaluated through a comparison with a group of credit scorecard modeling techniques that use only the numeric attributes in the application form. The results show that adding the textual attribute analysis achieves higher classification effectiveness and outperforms the other traditional numerical data analysis techniques.
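
    A minimal sketch of the proposed combination, text-mining pre-processing feeding logistic regression, here with TF-IDF as the pre-processing stand-in; the two toy applications and default labels are invented.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    applications = ["small retail shop seeking working capital",
                    "unemployed applicant requesting consumer loan"]
    defaulted = [0, 1]   # 1 = loan default, illustrative labels

    scorecard = make_pipeline(TfidfVectorizer(), LogisticRegression())
    scorecard.fit(applications, defaulted)
    print(scorecard.predict_proba(["shop owner asking for working capital"])[:, 1])
    ```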

  20. The Split-Apply-Combine Strategy for Data Analysis

    Directory of Open Access Journals (Sweden)

    Hadley Wickham

    2011-04-01

    Many data analysis problems involve the application of a split-apply-combine strategy, where you break up a big problem into manageable pieces, operate on each piece independently and then put all the pieces back together. This insight gives rise to a new R package that allows you to smoothly apply this strategy, without having to worry about the type of structure in which your data is stored. The paper includes two case studies showing how these insights make it easier to work with batting records for veteran baseball players and a large 3D array of spatio-temporal ozone measurements.
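
    The paper's package is written for R; for consistency with the other sketches in this collection, here is a minimal pandas rendering of the same split-apply-combine idea, with invented batting-style data.

    ```python
    import pandas as pd

    batting = pd.DataFrame({
        "player": ["ruth", "ruth", "aaron", "aaron", "aaron"],
        "year":   [1920, 1921, 1954, 1955, 1956],
        "hits":   [172, 204, 131, 189, 200],
    })

    # Split by player, apply a per-group summary, combine into one frame.
    career = batting.groupby("player")["hits"].agg(["count", "mean", "max"])
    print(career)
    ```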

  1. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed, stimulated by the power of computers and microprocessors which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied to trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. In order to increase the responses and improve the selectivity, solid electrodes are the subject of intense research dedicated to surface modifications. Perm-selectivity, chelation, catalysis, etc. may be considered as appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed.

  2. Laser photolysis-resonance fluorescence technique (LP-RF) applied to the study of reactions of atmospheric interest

    Science.gov (United States)

    Albaladejo, J.; Cuevas, C. A.; Notario, A.; Martínez, E.

    Atomic chlorine is highly reactive with a variety of organic and inorganic compounds, so that relatively small concentrations can compete with the tropospheric oxidants (OH, O3 and NO3) in determining the tropospheric fate of such compounds [1]. Besides, there is ample evidence that bromine compounds play a significant role in ozone chemistry, both in the troposphere and in the stratosphere [2]. In this work we show the laser photolysis-resonance fluorescence (LP-RF) technique applied to the study of gas-phase reactions of halogen atoms with volatile organic compounds (VOCs) of interest in atmospheric chemistry [3]. By means of this technique it is possible to measure the rate constants of these reactions and subsequently obtain the Arrhenius parameters. Halogen atoms are produced in an excess of the VOC and He by photolyzing Cl2 at 308 nm to obtain Cl atoms, or CF2Br2 at 248 nm for Br atoms, in both cases using a pulsed excimer laser. The radiation (135 nm) from a microwave-driven lamp, through which He containing a low concentration of Cl2 or Br2 was flowed, was used to excite the resonance fluorescence of the corresponding halogen atom in the jacketed Pyrex reaction cell. Signals were obtained using photon-counting techniques in conjunction with multichannel scaling. The fluorescence signal from the PMT was processed by a preamplifier and sent to a multichannel scaler to collect the time-resolved signal. The multichannel scaler was coupled to a microcomputer for further kinetic analysis.
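
    The kinetic analysis that follows such measurements can be sketched briefly: pseudo-first-order decay rates k' measured at several VOC concentrations give the bimolecular rate constant as a slope, and repeating over temperature yields the Arrhenius parameters. All numbers below are illustrative, not the paper's data.

    ```python
    import numpy as np

    conc = np.array([0.5, 1.0, 2.0, 4.0]) * 1e14     # [VOC], molecule cm^-3
    k_prime = np.array([650.0, 1150.0, 2180.0, 4200.0])  # decay rates, s^-1

    k_bi, k0 = np.polyfit(conc, k_prime, 1)   # slope = k, intercept = other losses
    print(f"k = {k_bi:.2e} cm^3 molecule^-1 s^-1")

    # Repeating at several temperatures yields Arrhenius parameters:
    T = np.array([263.0, 283.0, 298.0, 323.0])
    k_T = np.array([7.6e-11, 8.8e-11, 9.5e-11, 1.1e-10])  # illustrative
    slope, intercept = np.polyfit(1.0 / T, np.log(k_T), 1)
    print(f"Ea/R = {-slope:.0f} K, A = {np.exp(intercept):.2e}")
    ```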

  3. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of the different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  4. Electrochemical microfluidic chip based on molecular imprinting technique applied for therapeutic drug monitoring.

    Science.gov (United States)

    Liu, Jiang; Zhang, Yu; Jiang, Min; Tian, Liping; Sun, Shiguo; Zhao, Na; Zhao, Feilang; Li, Yingchun

    2017-05-15

    In this work, a novel electrochemical detection platform was established by integrating the molecular imprinting technique with a microfluidic chip, and applied to the trace measurement of three therapeutic drugs. The chip foundation is an acrylic panel with designed grooves. In the detection cell of the chip, a Pt wire is used as the counter electrode and reference electrode, and a Au-Ag alloy microwire (NPAMW) with a 3D nanoporous surface, modified with an electro-polymerized molecularly imprinted polymer (MIP) film, serves as the working electrode. Detailed characterization of the chip and the working electrode was performed, and their properties were explored by cyclic voltammetry and electrochemical impedance spectroscopy. Two methods, based respectively on electrochemical catalysis and on the MIP gate effect, were employed for detecting warfarin sodium with the prepared chip. The linear range of the electrochemical catalysis method was 5×10^-6 - 4×10^-4 M, which fails to meet clinical testing demands. By contrast, the linear range of the gate-effect method was 2×10^-11 - 4×10^-9 M, with a remarkably low detection limit of 8×10^-12 M (S/N = 3), which is able to satisfy clinical assay requirements. The system was then applied to 24-h monitoring of the drug concentration in plasma after administration of warfarin sodium to rabbits, and the corresponding pharmacokinetic parameters were obtained. In addition, the microfluidic chip was successfully adopted to analyze cyclophosphamide and carbamazepine, demonstrating its versatility. It is expected that this novel electrochemical microfluidic chip can act as a promising format for point-of-care testing by monitoring different analytes sensitively and conveniently.

  5. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  6. Applying Squeezing Technique to Clayrocks: Lessons Learned from Experiments at Mont Terri Rock Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, A. M.; Sanchez-Ledesma, D. M.; Tournassat, C.; Melon, A.; Gaucher, E.; Astudillo, E.; Vinsot, A.

    2013-07-01

    Knowledge of the pore water chemistry in clay rock formations plays an important role in determining radionuclide migration in the context of nuclear waste disposal. Among the different in situ and ex situ techniques for pore water sampling in clay sediments and soils, the squeezing technique dates back 115 years. Although different studies have been performed on the reliability and representativeness of squeezed pore waters, most of them were carried out on high-porosity, high-water-content and unconsolidated clay sediments. Very few tackled the analysis of squeezed pore water from low-porosity, low-water-content and highly consolidated clay rocks. In this work, a specially designed and fabricated one-dimensional compression cell with two-directional fluid flow was used to extract and analyse the pore water composition of Opalinus Clay core samples from Mont Terri (Switzerland). The reproducibility of the technique is good, and no ionic ultrafiltration, chemical fractionation or anion exclusion was found in the range of pressures analysed (70-200 MPa). Pore waters extracted in this range of pressures do not decrease in concentration, which would otherwise indicate dilution of the free pore water by mixing with the outer layers of double-layer water (Donnan water). A threshold (safety) squeezing pressure of 175 MPa was established to avoid membrane effects (ion filtering, anion exclusion, etc.) from clay particles induced by increasing pressures. Moreover, the pore waters extracted at these pressures are representative of the Opalinus Clay formation, as shown by a direct comparison against borehole waters collected in situ. (Author)

  7. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    OpenAIRE

    Árpád Gyéresi; Eleonora Mircia; Brigitta Simon; Aura Rusu; Gabriel Hancu

    2013-01-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve...

  8. Nuclear analytical techniques applied to forensic chemistry; Aplicacion de tecnicas analiticas nucleares en quimica forense

    Energy Technology Data Exchange (ETDEWEB)

    Nicolau, Veronica; Montoro, Silvia [Universidad Nacional del Litoral, Santa Fe (Argentina). Facultad de Ingenieria Quimica. Dept. de Quimica Analitica; Pratta, Nora; Giandomenico, Angel Di [Consejo Nacional de Investigaciones Cientificas y Tecnicas, Santa Fe (Argentina). Centro Regional de Investigaciones y Desarrollo de Santa Fe

    1999-11-01

    Gunshot residues produced by firing guns are mainly composed of visible particles. The individual characterization of these particles allows those containing heavy metals, originating from gunshot residue, to be distinguished from those having a different origin or history. In this work, the results obtained from the study of gunshot residue particles collected from hands are presented. The aim of the analysis is to establish whether a person has fired a gun, or has been in contact with one after a shot has been produced. As reference samples, particles collected from the hands of persons engaged in different activities were studied for comparison. The complete study was based on the application of nuclear analytical techniques such as scanning electron microscopy, energy dispersive X-ray electron probe microanalysis and graphite furnace atomic absorption spectrometry. The assays can be completed within a time compatible with forensic requirements. (author) 5 refs., 3 figs., 1 tab.; e-mail: csedax e adigian at arcride.edu.ar

  9. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    OpenAIRE

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies, as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma.

  10. An applied ethics analysis of best practice tourism entrepreneurs

    OpenAIRE

    2015-01-01

    Ethical entrepreneurship and, by extension, wider best practice are noble goals for the future of tourism. However, questions arise as to which concepts, such as values, motivations, actions and challenges, underpin these goals. This thesis seeks to answer these questions and in so doing to develop an applied ethics analysis for best practice entrepreneurs in tourism. The research is situated in sustainable tourism, which is ethically very complex and has thus far been dominated by the economic, social a...

  11. Nonstandard Analysis Applied to Advanced Undergraduate Mathematics - Infinitesimal Modeling

    OpenAIRE

    Herrmann, Robert A.

    2003-01-01

    This is a Research and Instructional Development Project from the U.S. Naval Academy. In this monograph, the basic methods of nonstandard analysis for n-dimensional Euclidean spaces are presented. Specific rules are developed, and these methods and rules are applied to rigorous integral and differential modeling. The topics include Robinson infinitesimals, limited and infinite numbers; convergence theory, continuity, *-transfer, internal definition, hyperfinite summation, Riemann-Stieltjes integration...

  12. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on a web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and taking an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper presents experimental work in which web log data is used. We have taken the web log data from the "NASA" web server, which is analyzed with "Web Log Explorer". Web Log Explorer is a web usage mining tool which plays a vital role in carrying out this work.
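
    A minimal sketch of the preprocessing step: parsing Common Log Format lines, such as those in the public NASA server logs, and counting requests per host. The sample line is illustrative.

    ```python
    import re
    from collections import Counter

    CLF = re.compile(r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
                     r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\S+)')

    def parse(lines):
        """Yield one dict per well-formed Common Log Format line."""
        for line in lines:
            m = CLF.match(line)
            if m:
                yield m.groupdict()

    sample = ['unicomp6.unicomp.net - - [01/Jul/1995:00:00:06 -0400] '
              '"GET /shuttle/countdown/ HTTP/1.0" 200 3985']
    hits = Counter(rec["host"] for rec in parse(sample))
    print(hits.most_common(3))
    ```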

  13. Inverting travel times with a triplication. [spline fitting technique applied to lunar seismic data reduction]

    Science.gov (United States)

    Jarosch, H. S.

    1982-01-01

    A method based on the use of constrained spline fits is used to overcome the difficulties arising when body-wave data in the form of T-delta are reduced to the tau-p form in the presence of cusps. In comparison with unconstrained spline fits, the method proposed here tends to produce much smoother models which lie approximately in the middle of the bounds produced by the extremal method. The method is noniterative and, therefore, computationally efficient. The method is applied to the lunar seismic data, where at least one triplication is presumed to occur in the P-wave travel-time curve. It is shown, however, that because of an insufficient number of data points for events close to the antipode of the center of the lunar network, the present analysis is not accurate enough to resolve the problem of a possible lunar core.

  14. Structure-selection techniques applied to continuous-time nonlinear models

    Science.gov (United States)

    Aguirre, Luis A.; Freitas, Ubiratan S.; Letellier, Christophe; Maquet, Jean

    2001-10-01

    This paper addresses the problem of choosing the multinomials that should compose a polynomial mathematical model starting from data. The mathematical representation used is a nonlinear differential equation of the polynomial type. Some approaches that have been used in the context of discrete-time models are adapted and applied to continuous-time models. Two examples are included to illustrate the main ideas. Models obtained with and without structure selection are compared using topological analysis. The main differences between structure-selected models and complete structure models are: (i) the former are more parsimonious than the latter, (ii) a predefined fixed-point configuration can be guaranteed for the former, and (iii) the former set of models produce attractors that are topologically closer to the original attractor than those produced by the complete structure models.
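
    A minimal sketch of the structure-selection idea: build a library of candidate monomials, fit by least squares, and keep only the terms with coefficients above a threshold. The simple threshold rule is a stand-in for the paper's selection criteria, and the data are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    x = rng.uniform(-1, 1, 200)
    dxdt = 0.5 * x - 2.0 * x**3 + 0.01 * rng.normal(size=x.size)  # synthetic data

    library = np.column_stack([np.ones_like(x), x, x**2, x**3])   # candidate terms
    names = ["1", "x", "x^2", "x^3"]

    coef, *_ = np.linalg.lstsq(library, dxdt, rcond=None)
    selected = {n: c for n, c in zip(names, coef) if abs(c) > 0.05}
    print(selected)   # parsimonious structure: expect only x and x^3 to survive
    ```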

  15. Harmonic and applied analysis from groups to signals

    CERN Document Server

    Mari, Filippo; Grohs, Philipp; Labate, Demetrio

    2015-01-01

    This contributed volume explores the connection between the theoretical aspects of harmonic analysis and the construction of advanced multiscale representations that have emerged in signal and image processing. It highlights some of the most promising mathematical developments in harmonic analysis in the last decade brought about by the interplay among different areas of abstract and applied mathematics. This intertwining of ideas is considered starting from the theory of unitary group representations and leading to the construction of very efficient schemes for the analysis of multidimensional data. After an introductory chapter surveying the scientific significance of classical and more advanced multiscale methods, chapters cover such topics as: an overview of Lie theory focused on common applications in signal analysis, including the wavelet representation of the affine group, the Schrödinger representation of the Heisenberg group, and the metaplectic representation of the symplectic group; an introduction ...

  16. Advanced examination techniques applied to the qualification of critical welds for the ITER correction coils

    CERN Document Server

    Sgobba, Stefano; Libeyre, Paul; Marcinek, Dawid Jaroslaw; Piguiet, Aline; Cécillon, Alexandre

    2015-01-01

    The ITER correction coils (CCs) consist of three sets of six coils located in between the toroidal (TF) and poloidal field (PF) magnets. The CCs rely on a Cable-in-Conduit Conductor (CICC), whose supercritical cooling at 4.5 K is provided by helium inlets and outlets. The assembly of the nozzles to the stainless steel conductor conduit includes fillet welds requiring full penetration through the thickness of the nozzle. Static and cyclic stresses have to be sustained by the inlet welds during operation. The entire volume of the helium inlet and outlet welds, which are subject to the most stringent quality levels for imperfections according to the standards in force, is virtually uninspectable with sufficient resolution by conventional or computed radiography or by ultrasonic testing. On the other hand, X-ray computed tomography (CT) was successfully applied to inspect the full weld volume of several dozen helium inlet qualification samples. The extensive use of CT techniques allowed significant progress in the ...

  17. Formulation of Indomethacin Colon Targeted Delivery Systems Using Polysaccharides as Carriers by Applying Liquisolid Technique

    Directory of Open Access Journals (Sweden)

    Kadria A. Elkhodairy

    2014-01-01

    The present study aimed at the formulation of matrix tablets for a colon-specific drug delivery (CSDD) system of indomethacin (IDM) by applying the liquisolid (LS) technique. A CSDD system based on time-dependent polymethacrylates and enzyme-degradable polysaccharides was established. Eudragit RL 100 (E-RL 100) was employed as the time-dependent polymer, whereas bacterially degradable polysaccharides were presented as LS systems loaded with the drug. Indomethacin-loaded LS systems were prepared using different polysaccharides, namely guar gum (GG), pectin (PEC), and chitosan (CH), as carriers separately or in mixtures of different ratios of 1:3, 1:1, and 3:1. Liquisolid systems that displayed promising results concerning drug release rate in both pH 1.2 and pH 6.8 were compressed into tablets after the addition of the calculated amount of E-RL 100 and lubrication with magnesium stearate and talc in the ratio of 1:9. It was found that E-RL 100 improved the flowability and compressibility of all LS formulations. The release data revealed that all formulations succeeded in sustaining drug release over a period of 24 hours. The stability study indicated that the PEC-based LS system, as well as its matrix tablets, was stable over the period of storage (one year) and could provide a minimum shelf life of two years.

  18. Attitude Exploration Using Factor Analysis Technique

    Directory of Open Access Journals (Sweden)

    Monika Raghuvanshi

    2016-12-01

    Attitude is a psychological variable that contains positive or negative evaluations of people or an environment. The growing generation possesses learning skills, so if a positive attitude is inculcated at the right age, it may become habitual. Students in the age group 14-20 years from the city of Bikaner, India, are the target population for this study. An inventory of 30 Likert-type scale statements was prepared in order to measure attitude towards the environment and matters related to conservation. The primary data were collected through a structured questionnaire, using a cluster sampling technique, and analyzed using the IBM SPSS 23 statistical tool. Factor analysis is used to reduce 30 variables to a smaller number of more identifiable groups of variables. Results show that students "need more regulation and voluntary participation to protect the environment", "need conservation of water and electricity", "are concerned about undue wastage of water", "need visible actions to protect the environment", "need strengthening of the public transport system", "are a little ignorant about the consequences of global warming", "want prevention of water pollution by industries", "need to change personal habits to protect the environment", and "don't have firsthand experience of global warming". The analysis revealed that the nine factors obtained could explain about 58.5% of the variance in the attitude of secondary school students towards the environment in the city of Bikaner, India. The remaining 39.6% of the variance is attributed to other elements not explained by this analysis. A global campaign for improvement in attitudes about environmental issues and their utility in daily life may boost positive youth attitudes, with potential worldwide impact. A cross-disciplinary approach may be developed by teaching environmental topics alongside other related disciplines such as science, economics, and social studies.
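
    A minimal sketch of the factor-extraction step, reducing 30 Likert items to nine factors with scikit-learn; the response matrix is random placeholder data, not the survey data.

    ```python
    import numpy as np
    from sklearn.decomposition import FactorAnalysis

    # 300 hypothetical respondents x 30 Likert items scored 1-5.
    responses = np.random.default_rng(3).integers(1, 6, size=(300, 30)).astype(float)

    fa = FactorAnalysis(n_components=9, random_state=0)
    scores = fa.fit_transform(responses)   # respondent scores on 9 factors
    loadings = fa.components_              # item loadings per factor
    print(loadings.shape)                  # (9, 30)
    ```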

  19. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have proceeded in the past using mathematical analysis that was very capable of phenomenological prediction but was not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained using established equations of transport phenomena. Then, microscopic and molecular-level modeling techniques are described using porous media theory and chemical kinetic theory, and applied to cerebrospinal fluid (CSF) dynamics. Using the techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.

  20. Comparison of motion correction techniques applied to functional near-infrared spectroscopy data from children

    Science.gov (United States)

    Hu, Xiao-Su; Arredondo, Maria M.; Gomba, Megan; Confer, Nicole; DaSilva, Alexandre F.; Johnson, Timothy D.; Shalinsky, Mark; Kovelman, Ioulia

    2015-12-01

    Motion artifacts are the most significant sources of noise in the context of pediatric brain imaging designs and data analyses, especially in applications of functional near-infrared spectroscopy (fNIRS), where they can completely compromise the quality of the acquired data. Different methods have been developed to correct motion artifacts in fNIRS data, but the relative effectiveness of these methods for data from child and infant subjects (which is often found to be significantly noisier than adult data) remains largely unexplored. The issue is further complicated by the heterogeneity of fNIRS data artifacts. We compared the efficacy of the six most prevalent motion artifact correction techniques with fNIRS data acquired from children participating in a language acquisition task: wavelet, spline interpolation, principal component analysis, moving average (MA), correlation-based signal improvement, and a combination of wavelet and MA. The evaluation of five predefined metrics suggests that the MA and wavelet methods yield the best outcomes. These findings elucidate the varied nature of fNIRS data artifacts and the efficacy of artifact correction methods with pediatric populations, as well as help inform both the theory and practice of optical brain imaging analysis.
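
    A minimal sketch of the moving-average (MA) correction found to perform well above, applied to a single simulated fNIRS channel; the window length and the injected spike are illustrative.

    ```python
    import numpy as np

    def moving_average(signal, win=5):
        """Smooth spike-like motion artifacts with a centred moving average."""
        kernel = np.ones(win) / win
        return np.convolve(signal, kernel, mode="same")

    rng = np.random.default_rng(4)
    channel = np.sin(np.linspace(0, 6, 200)) + 0.05 * rng.normal(size=200)
    channel[90] += 3.0                      # simulated motion spike
    print(moving_average(channel)[88:93])   # spike spread out and attenuated
    ```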

  1. Joint regression analysis and AMMI model applied to oat improvement

    Science.gov (United States)

    Oliveira, A.; Oliveira, T. A.; Mejza, S.

    2012-09-01

    In our work we present an application of some biometrical methods useful in genotype stability evaluation, namely the AMMI model, joint regression analysis (JRA) and multiple comparison tests. A genotype stability analysis of oat (Avena sativa L.) grain yield was carried out using data from the Portuguese Plant Breeding Board: a sample of 22 different genotypes grown during the years 2002, 2003 and 2004 in six locations. In Ferreira et al. (2006) the authors state the relevance of the regression models and of the Additive Main Effects and Multiplicative Interactions (AMMI) model for studying and estimating phenotypic stability effects. As computational techniques we use the Zigzag algorithm to estimate the regression coefficients and the agricolae package available in the R software for the AMMI model analysis.
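
    A minimal sketch of the AMMI decomposition: fit additive genotype and environment main effects, then take an SVD of the interaction residuals. The 22 x 6 yield matrix below is a random placeholder, not the Portuguese trial data.

    ```python
    import numpy as np

    yield_ge = np.random.default_rng(5).normal(5.0, 0.5, size=(22, 6))

    grand = yield_ge.mean()
    g_eff = yield_ge.mean(axis=1) - grand          # genotype main effects
    e_eff = yield_ge.mean(axis=0) - grand          # environment main effects
    resid = yield_ge - grand - g_eff[:, None] - e_eff[None, :]

    # Multiplicative interaction terms = principal components of the residuals.
    U, S, Vt = np.linalg.svd(resid, full_matrices=False)
    print("variance explained by IPCA1: %.1f%%" % (100 * S[0]**2 / (S**2).sum()))
    ```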

  2. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing 'always connected' world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the influence of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order-picking processes. The third part of the dissertation proposes a method to classify cities

  3. Investigation about the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils

    Directory of Open Access Journals (Sweden)

    Adriano Pinto Mariano

    2009-10-01

    This work investigated the efficiency of the bioaugmentation technique when applied to diesel oil contaminated soils collected at three service stations. Batch biodegradation experiments were carried out in Bartha biometer flasks (250 mL) used to measure the microbial CO2 production. Biodegradation efficiency was also measured by quantifying the concentration of hydrocarbons. In addition to the biodegradation experiments, the capability of the studied cultures and of the native microorganisms to biodegrade the diesel oil, purchased from a local service station, was verified using a technique based on the redox indicator 2,6-dichlorophenol indophenol (DCPIP). Results obtained with this test showed that the inocula used in the biodegradation experiments were able to degrade the diesel oil, and the tests carried out with the native microorganisms indicated that these soils had a microbiota adapted to degrade the hydrocarbons. In general, no gain was obtained with the addition of microorganisms, and in some cases negative effects were observed in the biodegradation experiments.

  4. Differential item functioning analysis by applying multiple comparison procedures.

    Science.gov (United States)

    Eusebi, Paolo; Kreiner, Svend

    2015-01-01

    Analysis within a Rasch measurement framework aims at the development of valid and objective test scores. One requirement of both validity and objectivity is that items do not show evidence of differential item functioning (DIF). A number of procedures exist for the assessment of DIF, including those based on analysis of contingency tables by Mantel-Haenszel tests and partial gamma coefficients. The aim of this paper is to illustrate multiple comparison procedures (MCP) for analysis of DIF relative to a variable defining a very large number of groups, with an unclear ordering with respect to the DIF effect. We propose a single-step procedure controlling the false discovery rate for DIF detection. The procedure applies to both dichotomous and polytomous items. In addition to providing evidence against a hypothesis of no DIF, the procedure also provides information on subsets of groups that are homogeneous with respect to the DIF effect. A stepwise MCP procedure for this purpose is also introduced.
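
    A minimal sketch of false-discovery-rate control across many item-by-group DIF tests, using the standard Benjamini-Hochberg step-up rule on a vector of illustrative p-values; the paper's exact single-step procedure is not reproduced here.

    ```python
    import numpy as np

    def benjamini_hochberg(pvals, q=0.05):
        """Return a boolean mask of p-values rejected at FDR level q."""
        p = np.asarray(pvals)
        order = np.argsort(p)
        m = len(p)
        thresh = q * np.arange(1, m + 1) / m      # step-up thresholds
        below = p[order] <= thresh
        k = np.max(np.nonzero(below)[0]) + 1 if below.any() else 0
        reject = np.zeros(m, bool)
        reject[order[:k]] = True
        return reject

    pvals = [0.001, 0.008, 0.039, 0.041, 0.2, 0.74]
    print(benjamini_hochberg(pvals))   # first two tests survive FDR control
    ```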

  5. Evaluation of bitterness in white wine applying descriptive analysis, time-intensity analysis, and temporal dominance of sensations analysis.

    Science.gov (United States)

    Sokolowsky, Martina; Fischer, Ulrich

    2012-06-30

    Bitterness in wine, especially in white wine, is a complex and sensitive topic, as it is a persistent sensation with negative connotations for consumers. However, the molecular basis for bitter taste in white wines is still largely unknown. At the same time, studies dealing with bitterness have to cope with the temporal dynamics of bitter perception. The most common method to describe bitter taste is static measurement amongst other attributes during a descriptive analysis. A less frequently applied method, time-intensity analysis, evaluates the temporal gustatory changes focusing on bitterness alone. The most recently developed multidimensional approach, the temporal dominance of sensations method, reveals the temporal dominance of bitter taste in relation to other attributes. In order to compare the results obtained with these different sensory methodologies, 13 commercial white wines were evaluated by the same panel. To facilitate a statistical comparison, parameters were extracted from the bitterness curves obtained from time-intensity and temporal dominance of sensations analysis and were compared to bitter intensity as well as bitter persistency based on descriptive analysis. Analysis of variance significantly differentiated the wines regarding all measured bitterness parameters obtained from the three sensory techniques. Comparing the information from all sensory parameters by multiple factor analysis and correlation, each technique provided additional valuable information regarding the complex bitter perception in white wine.

  6. A comparison of new, old and future densitometric techniques as applied to volcanologic study.

    Science.gov (United States)

    Pankhurst, Matthew; Moreland, William; Dobson, Kate; Þórðarson, Þorvaldur; Fitton, Godfrey; Lee, Peter

    2015-04-01

    The density of any material imposes a primary control upon its potential or actual physical behaviour in relation to its surroundings. It follows that a thorough understanding of the physical behaviour of dynamic, multi-component systems, such as active volcanoes, requires knowledge of the density of each component. If we are to accurately predict the physical behaviour of synthesized or natural volcanic systems, quantitative densitometric measurements are vital. The theoretical density of melt, crystal and bubble phases may be calculated using composition, structure, temperature and pressure inputs. However, measuring the density of natural, non-ideal, poly-phase materials remains problematic, especially if phase-specific measurement is important. Here we compare three methods: the Archimedes principle, He-displacement pycnometry and X-ray micro computed tomography (XMT), and discuss the utility and drawbacks of each in the context of modern volcanologic study. We have measured tephra, ash and lava from the 934 AD Eldgjá eruption (Iceland) and the 2010 AD Eyjafjallajökull eruption (Iceland) using each technique. These samples exhibit a range of particle sizes, phases and textures. We find that while the Archimedes method remains a useful, low-cost technique to generate whole-rock density data, relative precision is problematic at small particle sizes. Pycnometry offers a more precise whole-rock density value, at a comparable cost per sample. However, this technique is based upon the assumption that pore spaces within the sample are equally available for gas exchange, which may or may not be the case. XMT produces 3D images, at resolutions from nm to tens of µm per voxel, in which X-ray attenuation is a qualitative measure of relative electron density, expressed as greyscale number/brightness (usually 16-bit). Phases and individual particles can be digitally segmented according to their greyscale and other characteristics. This represents a distinct advantage over both
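
    The Archimedes calculation itself is a one-liner, sketched here with illustrative masses.

    ```python
    def archimedes_density(mass_air_g, mass_submerged_g, rho_fluid=0.9982):
        """Bulk density (g/cm^3) from dry weight and apparent weight in water."""
        return mass_air_g * rho_fluid / (mass_air_g - mass_submerged_g)

    print(archimedes_density(12.47, 7.91))   # ~2.73 g/cm^3, a basalt-like value
    ```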

  7. Study for applying microwave power saturation technique on fingernail/EPR dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byeong Ryong; Choi, Hoon; Nam, Hyun Ill; Lee, Byung Ill [Radiation Health Research Institute, Seoul (Korea, Republic of)

    2012-10-15

    There is growing recognition worldwide of the need to develop effective dosimetry methods to assess unexpected exposure to radiation in the event of a large-scale event. One physically based dosimetry method, electron paramagnetic resonance (EPR) spectroscopy, has been applied to perform retrospective radiation dosimetry using extracted samples of tooth enamel and nail (fingernail and toenail) following radiation accidents and exposures resulting from weapon use, testing, and production. Human fingernails are composed largely of keratin, which consists of α-helical peptide chains that are twisted into a left-handed coil and strengthened by disulphide cross-links. Ionizing radiation generates free radicals in the keratin matrix, and these radicals are stable over a relatively long period (days to weeks). Most importantly, the number of radicals is proportional to the magnitude of the dose over a wide dose range (0-30 Gy). Dose can also be estimated at four different locations on the human body, providing information on the homogeneity of the radiation exposure, and the results from EPR nail dosimetry are immediately available. However, the relatively large background signal (BKS), arising from the mechanically induced signal (MIS) created when the fingernail is cut, normally overlaps with the radiation-induced signal (RIS) and makes it difficult to estimate an accurate dose for an accidental exposure. As a consequence, dose estimation based on a dose-response curve has had difficulty ensuring reliability below 5 Gy. In this study, in order to overcome these disadvantages, we measured the response of the RIS and BKS (MIS) as the microwave power level was varied, and investigated the applicability of the power saturation technique at low doses.

  8. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to identify the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models, and possible improvements over them through 3D modeling, are also discussed. It is found that HEC-RAS and FLO-2D are the best models in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly in representing floodplain elevation differences and vertical roughness within grid cells, were identified; these could be improved through a 3D model. A 3D model was therefore found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have recently been developed, but not for floodplains. Hence, it is suggested that a 3D floodplain model be developed, considering all the hydrological and high-resolution topographic parameter models discussed in this review, to improve the identification of the causes and effects of flooding.

  9. Imaging techniques applied to quality control of civil manufactured goods obtained starting from ready-to-use mixtures

    Science.gov (United States)

    Bonifazi, Giuseppe; Castaldi, Federica

    2003-05-01

    Concrete materials obtained from pre-mixed, ready-to-use products (central-mix concrete) are increasingly used and represent a large portion of the civil construction market. Such products are used at different scales, ranging from small works, such as those commonly carried out inside a house or an apartment, to large civil or industrial works. In both cases, control of the mixtures and of the final work is usually achieved through the analysis of properly collected samples. Through appropriate sampling, objective parameters can be derived, such as the size class distribution and composition of the constituent particulate matter, or the mechanical characteristics of the sample itself. An important parameter not considered by the aforementioned approach is "segregation", that is, the tendency of some particulate materials to migrate preferentially into certain zones of the mixture and/or of the final product. Such behavior dramatically influences the quality of the product and of the final manufactured good. At present this behavior is studied only by human visual inspection; no repeatable analytical procedures or quantitative data processing exist. In this paper a procedure fully based on image processing techniques is described and applied. Results are presented and analyzed with reference to industrial products. A comparison is also made between the newly proposed digital imaging techniques and the analyses usually carried out at industrial laboratory scale for standard quality control.

  10. Thermal Analysis Applied to Verapamil Hydrochloride Characterization in Pharmaceutical Formulations

    Directory of Open Access Journals (Sweden)

    Maria Irene Yoshida

    2010-04-01

    Thermogravimetry (TG) and differential scanning calorimetry (DSC) are useful techniques that have been successfully applied in the pharmaceutical industry to reveal important information regarding the physicochemical properties of drug and excipient molecules, such as polymorphism, stability, purity and formulation compatibility, among others. Verapamil hydrochloride shows thermal stability up to 180 °C and melts at 146 °C, followed by total degradation. The drug is compatible with all the excipients evaluated. The drug showed degradation when subjected to oxidizing conditions, suggesting that the degradation product is 3,4-dimethoxybenzoic acid derived from alkyl side chain oxidation. Verapamil hydrochloride does not present the phenomenon of polymorphism under the conditions evaluated. Assessing the drug degradation kinetics, the drug had a shelf life (t90) of 56.7 years and a pharmaceutical formulation showed a t90 of 6.8 years, demonstrating their high stability.
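
    The shelf-life figures quoted above follow from standard degradation kinetics; a minimal sketch assuming a first-order rate law (the rate constant below is invented so that t90 comes out near the reported value):

```python
import numpy as np

def t90_first_order(k_per_year):
    """Shelf life t90 for first-order degradation:
    C(t) = C0 * exp(-k t)  =>  t90 = ln(10/9) / k."""
    return np.log(10 / 9) / k_per_year

print(f"t90 = {t90_first_order(0.00186):.1f} years")  # ~56.7 years
```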

  11. Erasing the Milky Way: new cleaning technique applied to GBT intensity mapping data

    CERN Document Server

    Wolz, L; Abdalla, F B; Anderson, C M; Chang, T -C; Li, Y -C; Masui, K W; Switzer, E; Pen, U -L; Voytek, T C; Yadav, J

    2015-01-01

    We present the first application of a new foreground removal pipeline to the current leading HI intensity mapping dataset, obtained by the Green Bank Telescope (GBT). We study the 15 hr and 1 hr field data of the GBT observations previously presented in Masui et al. (2013) and Switzer et al. (2013), covering about 41 square degrees at 0.6 < z < 1.0, which overlap with the WiggleZ galaxy survey employed for cross-correlation with the maps. In the presented pipeline, we subtract the Galactic foreground continuum and the point source contamination using an independent component analysis technique (fastica) and develop a description of a Fourier-based optimal weighting estimator to compute the temperature power spectrum of the intensity maps and the cross-correlation with the galaxy survey data. We show that fastica is a reliable tool to subtract diffuse and point-source emission by using the non-Gaussian nature of their probability functions. The power spectra of the intensity maps and the cross-correlation...
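
    A minimal sketch of this kind of component separation with scikit-learn's FastICA, on a stand-in data cube (the array shapes and component count are illustrative assumptions, not the pipeline's actual settings):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_freq, n_pix = 64, 4096
maps = rng.standard_normal((n_freq, n_pix))  # stand-in for the intensity maps

# Foregrounds are spectrally smooth and strongly non-Gaussian, so they
# concentrate in a few independent components that can be subtracted.
ica = FastICA(n_components=4, random_state=0)
sources = ica.fit_transform(maps.T)               # (n_pix, n_components)
foreground = sources @ ica.mixing_.T + ica.mean_  # reconstructed foreground maps
cleaned = maps - foreground.T                     # residual: HI signal plus noise
```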

  12. Digital Image Correlation Techniques Applied to Large Scale Rocket Engine Testing

    Science.gov (United States)

    Gradl, Paul R.

    2016-01-01

    Rocket engine hot-fire ground testing is necessary to understand component performance, reliability and engine system interactions during development. The J-2X upper stage engine completed a series of developmental hot-fire tests that derived performance of the engine and components, validated analytical models and provided the necessary data to identify where design changes, process improvements and technology development were needed. The J-2X development engines were heavily instrumented to provide the data necessary to support these activities, which enabled the team to investigate any anomalies experienced during the test program. This paper describes the development of an optical digital image correlation technique to augment the data provided by traditional strain gauges, which are prone to debonding at elevated temperatures and are limited to localized measurements. The feasibility of this optical measurement system was demonstrated during full scale hot-fire testing of J-2X, during which a digital image correlation system, incorporating a pair of high speed cameras to measure three-dimensional, real-time displacements and strains, was installed and operated under the extreme environments present on the test stand. The camera and facility setup, pre-test calibrations, hot-fire test data collection, and post-test analysis and results are presented in this paper.
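
    At its core, digital image correlation tracks speckle-pattern subsets from a reference frame to a deformed frame; a minimal integer-pixel sketch using zero-normalized cross-correlation (the function and window sizes are illustrative, and production DIC adds subpixel interpolation and, for 3D, stereo triangulation):

```python
import numpy as np

def subset_displacement(ref, cur, cx, cy, half=15, search=10):
    """Find the integer-pixel shift of one speckle subset by maximizing
    zero-normalized cross-correlation between two frames."""
    sub = ref[cy - half:cy + half + 1, cx - half:cx + half + 1]
    sub = (sub - sub.mean()) / sub.std()
    best, shift = -np.inf, (0, 0)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            win = cur[cy + dy - half:cy + dy + half + 1,
                      cx + dx - half:cx + dx + half + 1]
            score = np.sum(sub * (win - win.mean()) / win.std())
            if score > best:
                best, shift = score, (dx, dy)
    return shift  # in-plane displacement in pixels; strains follow from gradients
```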

  13. Imaging techniques applied to the study of fluids in porous media

    Energy Technology Data Exchange (ETDEWEB)

    Tomutsa, L.; Brinkmeyer, A.; Doughty, D.

    1993-04-01

    A synergistic rock characterization methodology has been developed. It derives reservoir engineering parameters from X-ray tomography (CT) scanning, computer assisted petrographic image analysis, minipermeameter measurements, and nuclear magnetic resonance imaging (NMRI). This rock characterization methodology is used to investigate the effect of small-scale rock heterogeneity on oil distribution and recovery. It is also used to investigate the applicability of imaging technologies to the development of scale-up procedures from core plug to whole core, by comparing the results of detailed simulations with the images of the fluid distributions observed by CT scanning. By using the detailed rock and fluid data generated by the imaging technology described, one can directly verify various scale-up techniques in the laboratory. As an example, realizations of rock properties statistically and spatially compatible with the observed values are generated by one of the various stochastic methods available (turning bands) and used as simulator input. The simulation results were compared both with the simulation results using the true rock properties and with the fluid distributions observed by CT. Conclusions regarding the effect of the various permeability models on waterflood oil recovery were formulated.

  14. Imaging techniques applied to the study of fluids in porous media

    Energy Technology Data Exchange (ETDEWEB)

    Tomutsa, L.; Doughty, D.; Mahmood, S.; Brinkmeyer, A.; Madden, M.P.

    1991-01-01

    A detailed understanding of rock structure and its influence on fluid entrapment, storage capacity, and flow behavior can improve the effective utilization and design of methods to increase the recovery of oil and gas from petroleum reservoirs. The dynamics of fluid flow and trapping phenomena in porous media were investigated. Miscible and immiscible displacement experiments in heterogeneous Berea and Shannon sandstone samples were monitored using X-ray computed tomography (CT scanning) to determine the effect of heterogeneities on fluid flow and trapping. The statistical analysis of pore and pore throat sizes in thin sections cut from these sandstone samples enabled the delineation of small-scale spatial distributions of porosity and permeability. Multiphase displacement experiments were conducted with micromodels constructed using thin slabs of the sandstones. The combination of the CT scanning, thin section, and micromodel techniques enables the investigation of how variations in pore characteristics influence fluid front advancement, fluid distributions, and fluid trapping. Plugs cut from the sandstone samples were investigated using high resolution nuclear magnetic resonance imaging, permitting the visualization of oil, water or both within individual pores. The application of these insights will aid in the proper interpretation of relative permeability, capillary pressure, and electrical resistivity data obtained from whole core studies. 7 refs., 14 figs., 2 tabs.

  15. Hyperspectral imaging techniques applied to the monitoring of wine waste anaerobic digestion process

    Science.gov (United States)

    Serranti, Silvia; Fabbri, Andrea; Bonifazi, Giuseppe

    2012-11-01

    An anaerobic digestion process, finalized to biogas production, is characterized by different steps involving the variation of chemical and physical parameters related to the presence of specific biomasses, such as pH, chemical oxygen demand (COD), volatile solids, nitrate (NO3-) and phosphate (PO43-). A correct process characterization requires periodic sampling of the organic mixture in the reactor and further analysis of the samples by traditional chemical-physical methods. Such an approach is discontinuous, time-consuming and expensive. A new analytical approach based on hyperspectral imaging in the NIR range (1000 to 1700 nm) is investigated and critically evaluated with reference to the monitoring of a wine waste anaerobic digestion process. The proposed technique was applied to identify and demonstrate the correlation existing, in terms of quality and reliability of the results, between "classical" chemical-physical parameters and the spectral features of the digestate samples. Good results were obtained, ranging from R2 = 0.68 and RMSECV = 12.83 mg/l for nitrate to R2 = 0.90 and RMSECV = 5495.16 mg O2/l for COD. The proposed approach seems very useful for setting up innovative control strategies allowing full, continuous control of the anaerobic digestion process.
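
    The abstract does not name the regression model behind the R2/RMSECV figures; partial least squares is a common choice for NIR chemometrics, so here is a hedged sketch with scikit-learn on invented calibration data:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical calibration set: NIR spectra (1000-1700 nm) vs. reference COD.
rng = np.random.default_rng(0)
X = rng.random((60, 121))        # 60 digestate samples x 121 wavelength bands
y = rng.random(60) * 30000       # COD in mg O2/l from wet-chemistry analysis

pls = PLSRegression(n_components=8)
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()   # cross-validated predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
r2 = 1 - np.sum((y - y_cv) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"R2 = {r2:.2f}, RMSECV = {rmsecv:.0f} mg O2/l")
```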

  16. Zoneless and Mixture techniques applied to the Integrated Brazilian PSHA using GEM-OpenQuake

    Science.gov (United States)

    Pirchiner, M.; Drouet, S.; Assumpcao, M.

    2013-12-01

    The main goal of this work is to propose some variations on the classic Probabilistic Seismic Hazard Analysis (PSHA) calculation: on one hand, applying the zoneless methodology to the characterization of seismic source activity and, on the other hand, using Gaussian mixture models to combine Ground Motion Prediction Equation (GMPE) models into a mixed model. Our current knowledge of Brazilian intraplate seismicity does not allow us to identify the causative neotectonic active faults with confidence. This makes the characterization of the main seismic sources and the computation of the Gutenberg-Richter relation difficult. Indeed, seismic zonings made by different specialists can differ substantially, whereas the zoneless approach imposes a quantitative method on seismic source characterization, avoiding subjective source zone definitions. In addition, the low seismicity rate and the limited coverage in space and time of the seismic networks do not offer enough observations to fit a confident GMPE for this region. In this case, our purpose was to use a Gaussian mixture model to estimate a composite model, from pre-existing well-fitted GMPE models, which better describes the observed peak ground motion data. The other methodological evaluation is the use of the OpenQuake engine (a Global Earthquake Model initiative) for the hazard calculation. The logic tree input will allow us, in the near future, to combine, with weights, other hazard models from different specialists. We expect that these results will offer a new and solid basis for upgrading the Brazilian civil engineering seismic rules.
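
    The core of a zoneless source model is kernel smoothing of the earthquake catalogue itself; a minimal sketch with scipy (the catalogue below is random stand-in data, and a real model would weight events by magnitude and calibrate the bandwidth):

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
lon = rng.uniform(-55, -35, 300)   # hypothetical declustered epicentres
lat = rng.uniform(-30, -5, 300)

# Kernel-smoothed seismicity replaces hand-drawn source zones.
kde = gaussian_kde(np.vstack([lon, lat]))
glon, glat = np.meshgrid(np.linspace(-55, -35, 100), np.linspace(-30, -5, 100))
density = kde(np.vstack([glon.ravel(), glat.ravel()])).reshape(glon.shape)
# density integrates to 1; multiply by the catalogue's total activity rate
# to obtain an earthquake rate per unit area for the hazard integral.
```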

  17. LAMQS analysis applied to ancient Egyptian bronze coins

    Energy Technology Data Exchange (ETDEWEB)

    Torrisi, L., E-mail: lorenzo.torrisi@unime.i [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caridi, F.; Giuffrida, L.; Torrisi, A. [Dipartimento di Fisica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Mondio, G.; Serafino, T. [Dipartimento di Fisica della Materia ed Ingegneria Elettronica dell' Universita di Messina, Salita Sperone, 31, 98166 Messina (Italy); Caltabiano, M.; Castrizio, E.D. [Dipartimento di Lettere e Filosofia dell' Universita di Messina, Polo Universitario dell' Annunziata, 98168 Messina (Italy); Paniz, E.; Salici, A. [Carabinieri, Reparto Investigazioni Scientifiche, S.S. 114, Km. 6, 400 Tremestieri, Messina (Italy)

    2010-05-15

    Some Egyptian bronze coins, dated to the VI-VII century A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and the type of manufacture. The investigations have been performed by using micro-invasive analyses, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electronic (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences have been evidenced in the constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins were produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  18. Classical mechanics approach applied to analysis of genetic oscillators.

    Science.gov (United States)

    Vasylchenkova, Anastasiia; Mraz, Miha; Zimic, Nikolaj; Moskon, Miha

    2016-04-05

    Biological oscillators present a fundamental part of several regulatory mechanisms that control the response of various biological systems. Several analytical approaches for their analysis have been reported recently. They are, however, limited to only specific oscillator topologies and/or give only qualitative answers, i.e., whether the dynamics of an oscillator in a given parameter space is oscillatory or not. Here we present a general analytical approach that can be applied to the analysis of biological oscillators. It relies on the projection of biological systems onto classical mechanics systems. The approach is able to provide relatively accurate results regarding the type of behaviour a system exhibits (i.e. oscillatory or not) and the periods of potential oscillations, without the need to conduct expensive numerical simulations. We demonstrate and verify the proposed approach on three different implementations of the amplified negative feedback oscillator.

  19. Analysis of the applied effectiveness of the NMR technique for near-surface aquifer exploration in the arid area of northwestern China

    Institute of Scientific and Technical Information of China (English)

    杨桂新; 武毅; 郭建强; 朱庆俊

    2000-01-01

    The surface NMR technique for groundwater prospecting is a new field of application and is at present the only technique in the world capable of direct detection of groundwater. The NMR instrument allows non-invasive measurement of the NMR signal from subsurface water-saturated layers, and direct measurement of the signal from water molecules guarantees a high reliability of water detection. Interpretation of the experimental data reveals the location of aquifers, their depth and their water content; the mean size of the water-saturated rock pores can also be estimated. Since the French Numis instrument was introduced into our institute in 1999, it has been used to test the exploration of different types of near-surface aquifers in the loess regions of Shaanxi Province and southern Ningxia, and definite field results have been obtained. Based on an analysis of these results, this article discusses the outstanding problems, such as the reliability of prospecting results against a background of strong noise, and compares the prospecting results obtained at the same test site with a square-loop and a figure-8-shaped transmitting antenna.

  20. Shape analysis applied in heavy ion reactions near Fermi energy

    Science.gov (United States)

    Zhang, S.; Huang, M.; Wada, R.; Liu, X.; Lin, W.; Wang, J.

    2017-03-01

    A new method is proposed to perform shape analyses and to evaluate their validity in heavy ion collisions near the Fermi energy. In order to avoid erroneous values of the shape parameters in the calculation, a test particle method is utilized in which each nucleon is represented by n test particles, similar to that used in Boltzmann–Uehling–Uhlenbeck (BUU) calculations. The method is applied to events simulated by an antisymmetrized molecular dynamics model. The geometrical shape of fragments is reasonably extracted when n = 100 is used. A significant deformation is observed for all fragments created in the multifragmentation process. The method is also applied to the shape of the momentum distribution for event classification. In the momentum case, the errors in the eigenvalue calculation become much smaller than those of the geometrical shape analysis, and the results become similar with and without the test particle method, indicating that in intermediate heavy ion collisions the shape analysis of the momentum distribution can be used for event classification without the test particle method.
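
    A minimal sketch of the geometric part of the idea, assuming nucleon coordinates in a numpy array: each nucleon is smeared into n Gaussian test particles and the eigenvalues of the resulting shape tensor give the squared principal axes (the smearing width and seed are illustrative choices):

```python
import numpy as np

def shape_eigenvalues(positions, n_test=100, sigma=1.0, seed=0):
    """Smear each nucleon into n_test Gaussian test particles, then
    diagonalize the 3x3 shape tensor of the resulting cloud."""
    rng = np.random.default_rng(seed)
    cloud = (positions[:, None, :] +
             rng.normal(0.0, sigma, (len(positions), n_test, 3))).reshape(-1, 3)
    rel = cloud - cloud.mean(axis=0)
    tensor = rel.T @ rel / len(rel)       # <x_i x_j> shape (covariance) tensor
    return np.sort(np.linalg.eigvalsh(tensor))

# Deformation measures follow from eigenvalue ratios, e.g. sqrt(l_max/l_min).
```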

  1. Finite-element technique applied to heat conduction in solids with temperature dependent thermal conductivity

    Science.gov (United States)

    Aguirre-Ramirez, G.; Oden, J. T.

    1969-01-01

    Finite element method applied to heat conduction in solids with temperature-dependent thermal conductivity, using a nonlinear constitutive equation for the heat flux.

  2. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined provide additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.
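
    Once each pixel of the digital SEM/EDS map has been classified into a phase, modal abundances are simple area fractions (equated with volume fractions by standard stereology); a minimal sketch on a made-up label image:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical phase map: one integer label per pixel from EDS classification.
phase_map = rng.choice([0, 1, 2, 3], size=(512, 512), p=[0.05, 0.80, 0.10, 0.05])
names = {0: "epoxy/void", 1: "orthopyroxene", 2: "olivine", 3: "spinel"}

labels, counts = np.unique(phase_map, return_counts=True)
total = counts[labels != 0].sum()          # exclude the mounting medium
for lab, n in zip(labels, counts):
    if lab != 0:
        print(f"{names[lab]}: {100 * n / total:.1f} vol.%")
```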

  3. Differentiating between spatial and temporal effects by applying modern data analyzing techniques to measured soil moisture data

    Science.gov (United States)

    Hohenbrink, Tobias L.; Lischeid, Gunnar; Schindler, Uwe

    2013-04-01

    Large data sets containing time series of soil hydrological variables exist thanks to extensive monitoring work in recent decades. The interplay of different processes and influencing factors causes spatial and temporal patterns which contribute to the total variance. This implies that monitoring data sets contain information about the most relevant processes, and that information can be extracted using modern data analysis techniques. Our objectives were (i) to decompose the total variance of an example data set of measured soil moisture time series into independent components and (ii) to relate them to specific influencing factors. Soil moisture had been measured at 12 plots in an Albeluvisol located in Müncheberg, northeastern Germany, between May 1st, 2008 and July 1st, 2011. Each plot was equipped with FDR probes at 7 depths between 30 cm and 300 cm. Six plots were cultivated with winter rye and silage maize (Crop Rotation System I) and the other six with silage maize, winter rye/millet, triticale/lucerne and lucerne (Crop Rotation System II). We applied a principal component analysis to the soil moisture data set. The first component described the mean temporal behavior of all soil moisture time series, and the second component reflected the impact of soil depth. Together they explained 80% of the data set's total variance. An analysis of the first two components confirmed that the measured plots showed a similar extent of signal damping at each depth. The fourth component revealed the impact of the two different crop rotation systems, which explained about 4% of the total variance and 13% of the spatial variance of the soil moisture data. That is only a minor fraction compared to the effects of small-scale soil texture heterogeneity. Principal component analysis has proven to be a useful tool to extract less apparent signals.
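
    A minimal sketch of this decomposition with scikit-learn, assuming the probe time series are stacked column-wise into a matrix and standardized (the array sizes mirror the setup described; the data are random stand-ins):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.standard_normal((1150, 84))       # times x probes (12 plots x 7 depths)
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize each time series

pca = PCA()
scores = pca.fit_transform(X)
print(pca.explained_variance_ratio_[:4])  # components 1+2 carried ~80% in the study
# Inspecting each probe's loading on a later component (here the fourth)
# is how a grouping by crop rotation system would be detected.
```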

  4. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  5. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Information provided by the application systems in an organization is vital for decision making. Because of this, the quality of the data provided by a Data Warehouse (DW) is very important for an organization seeking to produce the best solutions to move forward. DWs are complex systems that have to deliver highly aggregated, high-quality data from heterogeneous sources to decision makers, and they involve a great deal of integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems, and DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques compare target values with the current values obtained from the systems. A prototype was developed in PHP to support the Base Analysis Technique, and a sample schema from an Oracle database was used to study the differences between applying the framework and not applying it. The prototype was demonstrated to selected organizations to identify whether it helps to reduce DQ problems, and questionnaires were given to respondents. Results: The results show that users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: Implementation of the suggested framework in a real situation needs to be conducted to obtain more accurate results.

  6. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging, for the investigation of cultural heritage. We also introduce a complementary, noninvasive and nondestructive approach to the investigation of artworks that can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of the elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high energy-based techniques.

  7. A multiblock grid generation technique applied to a jet engine configuration

    Science.gov (United States)

    Stewart, Mark E. M.

    1992-01-01

    Techniques are presented for quickly finding a multiblock grid for a 2D geometrically complex domain from geometrical boundary data. An automated technique for determining a block decomposition of the domain is explained. Techniques for representing this domain decomposition and transforming it are also presented. Further, a linear optimization method may be used to solve the equations which determine grid dimensions within the block decomposition. These algorithms automate many stages in the domain decomposition and grid formation process and limit the need for human intervention and inputs. They are demonstrated for the meridional or throughflow geometry of a bladed jet engine configuration.
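
    The linear-optimization step mentioned above can be made concrete with a toy example: minimize the total cell count subject to blocks agreeing on the number of cells across shared interfaces (scipy is assumed; the three-block layout and resolution floors are invented):

```python
import numpy as np
from scipy.optimize import linprog

# Variables x = [n1, n2, n3]: cell counts along each block's shared direction.
c = np.ones(3)                        # objective: minimize n1 + n2 + n3
A_eq = np.array([[1, -1, 0],          # n1 = n2 (blocks 1-2 share an interface)
                 [0, 1, -1]])         # n2 = n3 (blocks 2-3 share an interface)
b_eq = np.zeros(2)
bounds = [(8, None), (12, None), (8, None)]  # per-block minimum resolution

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print(res.x)  # -> [12. 12. 12.]; rounding gives the integer cell counts
```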

  8. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.
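
    Schematically, cut-based rejection reduces to boolean masks over per-event quantities; the arrays, thresholds and the A/E-style pulse-shape parameter below are illustrative stand-ins, not the DEMONSTRATOR's actual cuts:

```python
import numpy as np

rng = np.random.default_rng(0)
energy = rng.uniform(0, 3000, 100_000)     # keV, hypothetical event energies
a_over_e = rng.normal(1.0, 0.05, 100_000)  # pulse-shape parameter (single-site ~ 1)
n_hit = rng.poisson(0.01, 100_000)         # other detectors hit in the array

single_site = a_over_e > 0.9     # pulse shape analysis cut (illustrative threshold)
anticoincidence = n_hit == 0     # multi-detector events are gamma background
roi = np.abs(energy - 2039) < 2  # 4 keV region of interest around the Q-value

survivors = energy[single_site & anticoincidence & roi]
print(f"{survivors.size} candidate events remain in the ROI")
```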

  9. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H. [Center for Experimental Nuclear Physics and Astrophysics, and Department of Physics, University of Washington, Seattle, WA (United States); Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L. [Pacific Northwest National Laboratory, Richland, WA (United States); Avignone, F. T. [Department of Physics and Astronomy, University of South Carolina, Columbia, SC (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States); Baldenegro-Barrera, C. X.; Bertrand, F. E. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  10. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
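
    For context, the Lomb-Scargle periodogram that the tool is likened to can be computed with astropy; the unevenly sampled radial-velocity series below is simulated:

```python
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 2000, 250))   # observation epochs in days
rv = 5.0 * np.sin(2 * np.pi * t / 14.65) + rng.normal(0, 2.0, t.size)  # m/s

frequency, power = LombScargle(t, rv).autopower(maximum_frequency=1.0)
best_period = 1 / frequency[np.argmax(power)]
print(f"strongest periodicity: {best_period:.2f} d")
```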

  11. Radial velocity data analysis with compressed sensing techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2017-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  12. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  13. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an introduction and 11 independent chapters devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to theoretical aspects while the others present the practical aspects and the...

  14. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  15. State of the Art Review for Applying Computational Intelligence and Machine Learning Techniques to Portfolio Optimisation

    CERN Document Server

    Hurwitz, Evan

    2009-01-01

    Computational techniques have shown much promise in the field of finance, owing to their ability to extract sense out of dauntingly complex systems. This paper reviews the most promising of these techniques, from traditional computational intelligence methods to their machine learning siblings, with a particular view to their application in optimising the management of a portfolio of financial instruments. The current state of the art is assessed, and prospective further work is identified and recommended.

  16. A Survey on Data Mining Techniques Applied to Electricity-Related Time Series Forecasting

    OpenAIRE

    Francisco Martínez-Álvarez; Alicia Troncoso; Gualberto Asencio-Cortés; Riquelme, José C

    2015-01-01

    Data mining has become an essential tool during the last decade to analyze large sets of data. The variety of techniques it includes and the successful results obtained in many application fields make this family of approaches powerful and widely used. In particular, this work explores the application of these techniques to time series forecasting. Although classical statistical-based methods provide reasonably good results, the results of the application of data mining outperform those of ...

  17. Neutron activation analysis applied to nutritional and foodstuff studies

    Energy Technology Data Exchange (ETDEWEB)

    Maihara, Vera A.; Santos, Paola S.; Moura, Patricia L.C.; Castro, Lilian P. de, E-mail: vmaihara@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Avegliano, Roseane P., E-mail: pagliaro@usp.b [Universidade de Sao Paulo (USP), SP (Brazil). Coordenadoria de Assistencia Social. Div. de Alimentacao

    2009-07-01

    Neutron activation analysis, NAA, has been successfully used on a regular basis in several areas of nutrition and foodstuffs. NAA has become an important and useful research tool due to the methodology's advantages, which include high accuracy, the small quantities of sample required and the absence of chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP are presented: a Brazilian total diet study of the nutritional element dietary intakes of the Sao Paulo state population; a study of trace elements in maternal milk; and the determination of essential trace elements in some edible mushrooms. (author)

  18. Wavelets Applied to CMB Maps a Multiresolution Analysis for Denoising

    CERN Document Server

    Sanz, J L; Cayon, L; Martínez-González, E; Barreiro, R B; Toffolatti, L

    1999-01-01

    Analysis and denoising of Cosmic Microwave Background (CMB) maps are performed using wavelet multiresolution techniques. The method is tested on $12^{\circ}.8\times 12^{\circ}.8$ maps with resolution resembling the experimental one expected for future high resolution space observations. Semianalytic formulae for the variance of the wavelet coefficients are given for the Haar and Mexican Hat wavelet bases. Results are presented for the standard Cold Dark Matter (CDM) model. Denoising of simulated maps is carried out by removal of wavelet coefficients dominated by instrumental noise. CMB maps with a signal-to-noise $S/N \sim 1$ are denoised with an error improvement factor between 3 and 5. Moreover we have also tested how well the CMB temperature power spectrum is recovered after denoising. We are able to reconstruct the $C_{\ell}$'s up to $\ell\sim 1500$ with errors always below 20% in cases with $S/N \ge 1$.
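
    A minimal 2D analogue of the coefficient-removal step, using PyWavelets with the Haar basis (the map, the noise rms and the 3-sigma hard threshold are illustrative choices):

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
noisy_map = rng.standard_normal((256, 256))   # stand-in for a S/N ~ 1 CMB patch

coeffs = pywt.wavedec2(noisy_map, "haar", level=4)
sigma = 1.0            # instrumental noise rms per pixel, assumed known
threshold = 3 * sigma

# Keep the coarse approximation; zero detail coefficients below the threshold.
denoised = [coeffs[0]] + [
    tuple(pywt.threshold(c, threshold, mode="hard") for c in detail)
    for detail in coeffs[1:]
]
clean_map = pywt.waverec2(denoised, "haar")
```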

  19. Finite element analysis applied to dentoalveolar trauma: methodology description.

    Science.gov (United States)

    da Silva, B R; Moreira Neto, J J S; da Silva, F I; de Aguiar, A S W

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated.

  20. Comparison Between Two FMEA Analyses Applied to Dairy

    Directory of Open Access Journals (Sweden)

    Alexandre de Paula Peres

    2010-06-01

    The FMEA (Failure Mode and Effect Analysis) is a methodology that has been used in environmental risk assessment during the production process. Although environmental certification strengthens the corporate image and helps ensure a company's position in the market, it is still very costly, particularly for small and medium businesses. Given this, the FMEA can be a starting point for companies to diagnose the environmental risks they cause. This methodology was used to diagnose differences in environmental concern and in the environmental controls exercised at two dairy plants in Lavras. By applying the method, differences could be observed in the resulting tables for the two businesses, both in the diagnosis and confirmation of the risks and in the controls adopted.
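
    In a classical FMEA, each failure mode is ranked by a risk priority number, RPN = severity x occurrence x detection; a minimal sketch with invented dairy-plant entries:

```python
# Each factor is scored on a 1-10 scale; a higher RPN means higher priority.
failure_modes = [
    # (environmental failure mode, severity, occurrence, detection)
    ("whey discharged to watercourse", 9, 4, 3),
    ("CIP caustic solution spill",     7, 3, 5),
    ("boiler stack emissions",         5, 6, 4),
]

for mode, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
    print(f"RPN = {s * o * d:4d}  {mode}")
```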

  1. Downside Risk analysis applied to Hedge Funds universe

    CERN Document Server

    Perello, J

    2006-01-01

    Hedge funds are considered one of the fastest-growing portfolio management sectors of the past decade. Optimal hedge fund management requires high-precision risk evaluation and appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of hedge fund statistics. A possible way out of this problem, while keeping the simplicity of the CAPM, is the so-called downside risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater (or lower) than the investor's goal. We study several risk indicators using the Gaussian case as a benchmark and apply them to the Credit Suisse/Tremont Investable Hedge Fund Index Data.
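
    The distinguishing quantity in downside risk analysis is the downside deviation, which counts only shortfalls below the investor's goal; a minimal sketch, including the resulting Sortino-type ratio, on simulated returns:

```python
import numpy as np

def downside_deviation(returns, target=0.0):
    """Root mean square of shortfalls below the target return."""
    shortfall = np.minimum(returns - target, 0.0)
    return np.sqrt(np.mean(shortfall ** 2))

def sortino_ratio(returns, target=0.0):
    return (returns.mean() - target) / downside_deviation(returns, target)

monthly = np.random.default_rng(0).normal(0.008, 0.02, 120)  # stand-in returns
print(downside_deviation(monthly), sortino_ratio(monthly))
```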

  2. Applying machine learning and image feature extraction techniques to the problem of cerebral aneurysm rupture

    Directory of Open Access Journals (Sweden)

    Steren Chabert

    2017-01-01

    Cerebral aneurysm is a cerebrovascular disorder characterized by a bulging in a weak area of the wall of an artery that supplies blood to the brain. It is relevant to understand the mechanisms leading to the formation of aneurysms, to their growth and, more importantly, to their rupture. The purpose of this study is to assess the impact on aneurysm rupture of combinations of different parameters, rather than focusing on only one factor at a time as is frequent in the literature, using machine learning and feature extraction techniques. This discussion is relevant in the context of the complex decision physicians face in choosing which therapy to apply, as each intervention carries its own risks and requires a complex ensemble of resources (human resources, operating rooms, etc.) in hospitals that are always under very high workloads. This project arose within our current working team, composed of an interventional neuroradiologist, a radiologic technologist, informatics engineers and biomedical engineers, from the Valparaíso public hospital, Hospital Carlos van Buren, and from Universidad de Valparaíso – Facultad de Ingeniería and Facultad de Medicina. This team has been working together for the last few years and is now participating in the implementation of an "interdisciplinary platform for innovation in health", as part of a bigger project led by Universidad de Valparaíso (PMI UVA1402). It is relevant to emphasize that this project is made feasible by the existence of this network between physicians and engineers, and by the existence of data already registered in an orderly manner, structured and recorded in digital format. The present proposal arises from reports in the current literature that existing indicators, whether based on morphological descriptions of the aneurysm, on characterization of biomechanical factors, or on other features, do not provide sufficient information in order

  3. Using Link Analysis Technique with a Modified Shortest-Path Algorithm to Fight Money Laundering

    Institute of Scientific and Technical Information of China (English)

    CHEN Yunkai; MAI Quanwen; LU Zhengding

    2006-01-01

    Effective link analysis techniques are needed to help law enforcement and intelligence agencies fight money laundering. This paper presents a link analysis technique that uses a modified shortest-path algorithm to identify the strongest association paths between entities in a money laundering network. Based on the two-tree Dijkstra and Priority-First-Search (PFS) algorithms, a modified algorithm is presented; to apply it, the network representation is first transformed.
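
    Mapping association strengths in (0, 1] to additive costs via -log turns "strongest association path" into an ordinary shortest-path problem; a minimal sketch with networkx on an invented network (the paper's own two-tree/PFS variant is more elaborate):

```python
import math
import networkx as nx

G = nx.Graph()
edges = [("A", "B", 0.9), ("B", "C", 0.8), ("A", "D", 0.5), ("D", "C", 0.95)]
# -log(strength) is additive, so Dijkstra minimizes it, i.e. maximizes
# the product of association strengths along the path.
G.add_weighted_edges_from((u, v, -math.log(w)) for u, v, w in edges)

path = nx.dijkstra_path(G, "A", "C")
strength = math.exp(-nx.dijkstra_path_length(G, "A", "C"))
print(path, f"combined strength = {strength:.2f}")   # ['A', 'B', 'C'], 0.72
```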

  4. Continuous Wavelet and Hilbert-Huang Transforms Applied for Analysis of Active and Reactive Power Consumption

    Directory of Open Access Journals (Sweden)

    Avdakovic Samir

    2014-08-01

    Analysis of power consumption is a very important issue for power distribution system operators. Some power system processes, such as planning, demand forecasting and development, require a complete understanding of the behaviour of power consumption in the observed area, which in turn requires appropriate techniques for the analysis of the available data. In this paper, two different time-frequency techniques are applied to the analysis of hourly values of active and reactive power consumption from one real power distribution transformer substation in the urban part of the city of Sarajevo. Using the continuous wavelet transform (CWT) with the wavelet power spectrum and the global wavelet spectrum, some properties of the analysed time series are determined. Then, empirical mode decomposition (EMD) and the Hilbert-Huang transform (HHT) are applied to the analysis of the same time series, and the results show that both approaches can provide very useful information about the behaviour of power consumption over the observed time interval and in different period (frequency) bands. It can also be noticed that the results obtained by the global wavelet spectrum and the marginal Hilbert spectrum are very similar, confirming that both approaches could be used to identify the main properties of active and reactive power consumption time series.
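
    A minimal sketch of the CWT side of such an analysis with PyWavelets, on a simulated hourly load series with daily and weekly cycles (the wavelet choice and scale range are illustrative):

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)
load = (40 + 10 * np.sin(2 * np.pi * hours / 24)         # daily cycle
           + 5 * np.sin(2 * np.pi * hours / (24 * 7))    # weekly cycle
           + rng.normal(0, 2, hours.size))

scales = np.arange(1, 256)
coef, freqs = pywt.cwt(load, scales, "morl", sampling_period=1.0)
power = np.abs(coef) ** 2                  # wavelet power spectrum (scale x time)
global_spectrum = power.mean(axis=1)       # time-averaged (global) wavelet spectrum
print(f"dominant period ~ {1 / freqs[np.argmax(global_spectrum)]:.1f} h")
```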

  5. Emerging techniques for soil analysis via mid-infrared spectroscopy

    Science.gov (United States)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, in particular: 1. Attenuated total reflectance (ATR). Attenuated total reflectance is commonly used for the analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range, as well as the absorbance of some soil constituents (e.g., calcium carbonate), interferes with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species. 2. Photoacoustic spectroscopy. Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are

  6. Digital filtering techniques applied to electric power systems protection; Tecnicas de filtragem digital aplicadas a protecao de sistemas eletricos de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Brito, Helio Glauco Ferreira

    1996-12-31

    This work introduces an analysis and comparative study of some of the techniques for digital filtering of the voltage and current waveforms from faulted transmission lines. This study is of fundamental importance for the development of algorithms applied to the digital protection of electric power systems. The techniques studied are based on Discrete Fourier Transform theory, the Walsh functions and Kalman filter theory. Two aspects were emphasized in this study: first, non-recursive techniques were analyzed, with the implementation of filters based on Fourier theory and the Walsh functions; second, recursive techniques were analyzed, with the implementation of filters based on Kalman theory and once more on Fourier theory. (author) 56 refs., 25 figs., 16 tabs.
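
    The workhorse of the non-recursive Fourier family is the full-cycle DFT filter, which extracts the fundamental-frequency phasor from exactly one cycle of samples; a minimal sketch (the sampling rate and test signal are invented):

```python
import numpy as np

def fundamental_phasor(samples):
    """Full-cycle DFT filter: fundamental phasor from one cycle of N samples."""
    x = np.asarray(samples, dtype=float)
    n = np.arange(x.size)
    re = (2 / x.size) * np.sum(x * np.cos(2 * np.pi * n / x.size))
    im = (2 / x.size) * np.sum(x * np.sin(2 * np.pi * n / x.size))
    return re - 1j * im   # peak-value phasor

# One cycle of a 60 Hz current, 16 samples/cycle, 100 A peak, 30 deg phase.
t = np.arange(16) / 16
i = 100 * np.cos(2 * np.pi * t + np.deg2rad(30))
ph = fundamental_phasor(i)
print(abs(ph), np.rad2deg(np.angle(ph)))   # -> 100.0, 30.0
```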

  7. Synchrotron and simulations techniques applied to problems in materials science: catalysts and Azul Maya pigments.

    Science.gov (United States)

    Chianelli, Russell R; Perez De la Rosa, Myriam; Meitzner, George; Siadati, Mohammed; Berhault, Gilles; Mehta, Apurva; Pople, John; Fuentes, Sergio; Alonzo-Nuñez, Gabriel; Polette, Lori A

    2005-03-01

    Development of synchrotron techniques for the determination of the structure of disordered, amorphous and surface materials has exploded over the past 20 years, owing to the increasing availability of high-flux synchrotron radiation and the continuing development of increasingly powerful synchrotron techniques. These techniques are available to materials scientists who are not necessarily synchrotron scientists through interaction with the effective user communities that exist at synchrotrons such as the Stanford Synchrotron Radiation Laboratory. In this article the application of multiple synchrotron characterization techniques to two classes of materials defined as 'surface compounds' is reviewed. One class of surface compounds comprises materials like MoS(2-x)C(x), widely used petroleum catalysts that improve the environmental properties of transportation fuels. These compounds may be viewed as 'sulfide-supported carbides' in their catalytically active states. The second class comprises the 'Maya blue' pigments, which are based on technology created by the ancient Maya. These compounds are organic/inorganic 'surface complexes' consisting of the dye indigo and palygorskite, a common clay. The identification of both surface compounds relies on the application of synchrotron techniques as described here.

  8. UQ and V&V techniques applied to experiments and simulations of heated pipes pressurized to failure

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Vicente Jose [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Dempsey, J. Franklin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Antoun, Bonnie R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-05-01

    This report demonstrates versatile and practical model validation and uncertainty quantification techniques applied to the accuracy assessment of a computational model of heated steel pipes pressurized to failure. The Real Space validation methodology segregates aleatory and epistemic uncertainties to form straightforward model validation metrics especially suited for assessing models to be used in the analysis of performance and safety margins. The methodology handles difficulties associated with representing and propagating interval and/or probabilistic uncertainties from multiple correlated and uncorrelated sources in the experiments and simulations including: material variability characterized by non-parametric random functions (discrete temperature dependent stress-strain curves); very limited (sparse) experimental data at the coupon testing level for material characterization and at the pipe-test validation level; boundary condition reconstruction uncertainties from spatially sparse sensor data; normalization of pipe experimental responses for measured input-condition differences among tests and for random and systematic uncertainties in measurement/processing/inference of experimental inputs and outputs; numerical solution uncertainty from model discretization and solver effects.

  10. Effect of the reinforcement bar arrangement on the efficiency of electrochemical chloride removal technique applied to reinforced concrete structures

    Energy Technology Data Exchange (ETDEWEB)

    Garces, P. [Dpto. Ing. de la Construccion, Obras Publicas e Infraestructura Urbana, Universidad de Alicante. Apdo. 99, E-03080 Alicante (Spain)]. E-mail: pedro.garces@ua.es; Sanchez de Rojas, M.J. [Dpto. Ing. de la Construccion, Obras Publicas e Infraestructura Urbana, Universidad de Alicante. Apdo. 99, E-03080 Alicante (Spain); Climent, M.A. [Dpto. Ing. de la Construccion, Obras Publicas e Infraestructura Urbana, Universidad de Alicante. Apdo. 99, E-03080 Alicante (Spain)

    2006-03-15

    This paper reports on the research done to find out the effect that different bar arrangements may have on the efficiency of the electrochemical chloride removal (ECR) technique when applied to a reinforced concrete structural member. Five different types of bar arrangements were considered, corresponding to typical structural members such as columns (with single and double bar reinforcing), slabs, beams and footings. ECR was applied in several steps. We observe that the extraction efficiency depends on the reinforcing bar arrangement; a uniform layer set-up favours chloride extraction. Electrochemical techniques were also used to estimate the corrosion state of the reinforcing bars, by measuring the corrosion potential and the instantaneous corrosion rate based on the polarization resistance technique. After ECR treatment, a reduction in the corrosion levels is observed, falling short of the depassivation threshold.

  11. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    Directory of Open Access Journals (Sweden)

    Árpád Gyéresi

    2013-02-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; consequently, micelles are formed, and these micelles undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.

  12. Principles of micellar electrokinetic capillary chromatography applied in pharmaceutical analysis.

    Science.gov (United States)

    Hancu, Gabriel; Simon, Brigitta; Rusu, Aura; Mircia, Eleonora; Gyéresi, Arpád

    2013-01-01

    Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution at concentrations above their critical micellar concentration; consequently, micelles are formed, and these micelles undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.

  13. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle the irregular sampling. We compare the linear interpolation technique with different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
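
    A minimal sketch of the kernel idea evaluated in the paper: pairwise products of standardized observations are weighted by a Gaussian kernel of the mismatch between their time separation and the desired lag. The bandwidth default below is an illustrative assumption.

```python
import numpy as np

def gaussian_kernel_acf(t, x, lag, bandwidth=None):
    """Autocorrelation at a given lag for an irregularly sampled series."""
    x = (x - x.mean()) / x.std()                 # standardize observations
    if bandwidth is None:
        bandwidth = 0.25 * np.mean(np.diff(t))   # illustrative default choice
    dt = t[None, :] - t[:, None]                 # all pairwise time separations
    w = np.exp(-0.5 * ((dt - lag) / bandwidth) ** 2)
    np.fill_diagonal(w, 0.0)                     # exclude zero-lag self pairs
    return np.sum(w * np.outer(x, x)) / np.sum(w)

# Irregularly sampled, noisy oscillatory signal (illustrative data).
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 100, 300))
x = np.sin(0.3 * t) + 0.3 * rng.standard_normal(t.size)
print(gaussian_kernel_acf(t, x, lag=np.mean(np.diff(t))))
```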

  14. A Survey on Data Mining Techniques Applied to Electricity-Related Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Francisco Martínez-Álvarez

    2015-11-01

    Data mining has become an essential tool during the last decade for analyzing large sets of data. The variety of techniques it includes, and the successful results obtained in many application fields, make this family of approaches powerful and widely used. In particular, this work explores the application of these techniques to time series forecasting. Although classical statistical-based methods provide reasonably good results, the application of data mining techniques often outperforms them. Hence, this work faces two main challenges: (i) to provide a compact mathematical formulation of the most widely used techniques; (ii) to review the latest works on time series forecasting and, as a case study, those related to electricity price and demand markets.

  15. New Control Technique Applied in Dynamic Voltage Restorer for Voltage Sag Mitigation

    Directory of Open Access Journals (Sweden)

    Rosli Omar

    2010-01-01

    The Dynamic Voltage Restorer (DVR) is a power electronics device able to compensate voltage sags on critical loads dynamically. The DVR consists of a VSC, injection transformers, passive filters and energy storage (lead acid battery). By injecting an appropriate voltage, the DVR restores the voltage waveform and ensures constant load voltage. Many types of control techniques are used in DVRs for mitigating voltage sags; the efficiency of the DVR depends on the efficiency of the control technique involved in switching the inverter. Problem statement: Simulation and experimental investigation toward the development of new algorithms based on SVPWM, and an understanding of the nature of the DVR with performance comparisons between the various controller technologies available. The proposed controller using space vector modulation techniques obtains higher amplitude modulation indexes than conventional SPWM techniques. Moreover, space vector modulation techniques can be easily implemented using digital processors; space vector PWM can produce about 15% higher output voltage than standard sinusoidal PWM. Approach: The purpose of this research was to study the implementation of SVPWM in a DVR. The proposed control algorithm was investigated through computer simulation using PSCAD/EMTDC software. Results: Simulation and experimental results showed the effectiveness and efficiency of the proposed controller based on SVPWM in mitigating voltage sags in low voltage distribution systems. It was concluded that the controller also works well under both balanced and unbalanced voltage conditions. Conclusion/Recommendations: The simulation and experimental results of a DVR using PSCAD/EMTDC software based on the SVPWM technique showed clearly the performance of the DVR in mitigating voltage sags. The DVR operates without any difficulties and injects the appropriate voltage component to correct rapidly any anomaly in the supply voltage to keep the load voltage constant.
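
    The core of an SVPWM modulator of the kind applied here is the sector and dwell-time computation. The sketch below follows the standard textbook formulation, not code from the paper; the operating point is illustrative.

```python
import numpy as np

def svpwm_dwell_times(v_ref, theta, v_dc, t_s):
    """Sector and dwell times (T1, T2, T0) for a reference voltage vector.

    v_ref: reference magnitude (V), theta: vector angle (rad),
    v_dc: DC-link voltage (V), t_s: switching period (s).
    """
    theta = theta % (2 * np.pi)
    sector = int(theta // (np.pi / 3)) + 1        # sectors 1..6
    alpha = theta - (sector - 1) * np.pi / 3      # angle measured inside the sector
    m = np.sqrt(3) * v_ref / v_dc                 # modulation index (1.0 = linear limit)
    t1 = m * t_s * np.sin(np.pi / 3 - alpha)      # dwell on the sector's first active vector
    t2 = m * t_s * np.sin(alpha)                  # dwell on the sector's second active vector
    t0 = t_s - t1 - t2                            # remaining time on the zero vectors
    return sector, t1, t2, t0

# Illustrative operating point: 400 V DC link, 10 kHz switching frequency.
print(svpwm_dwell_times(v_ref=180.0, theta=np.deg2rad(75), v_dc=400.0, t_s=1e-4))
```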

  16. To Apply or Not to Apply: A Survey Analysis of Grant Writing Costs and Benefits

    CERN Document Server

    von Hippel, Ted

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  17. Classification Techniques for Multivariate Data Analysis.

    Science.gov (United States)

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may prefer the term "typology". Other frequently encountered terms are pattern...the determinantal equation: |B - λW| = 0 (42). The solutions λi are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non...Statistical Package for Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor...
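
    The determinantal equation reconstructed above is the generalized eigenproblem of discriminant analysis; a minimal sketch of solving it numerically, with illustrative scatter matrices, follows.

```python
import numpy as np
from scipy.linalg import eig

# Illustrative between-group (B) and within-group (W) scatter matrices.
B = np.array([[4.0, 1.0], [1.0, 2.0]])
W = np.array([[2.0, 0.5], [0.5, 1.0]])

# |B - lambda*W| = 0 is the generalized eigenproblem B v = lambda W v.
eigvals, eigvecs = eig(B, W)
order = np.argsort(eigvals.real)[::-1]   # largest discriminant power first
print(eigvals.real[order])               # the lambda_i of the determinantal equation
print(eigvecs[:, order])                 # corresponding discriminant directions
```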

  18. Applied predictive analytics principles and techniques for the professional data analyst

    CERN Document Server

    Abbott, Dean

    2014-01-01

    Learn the art and science of predictive analytics - techniques that get results Predictive analytics is what translates big data into meaningful, usable business information. Written by a leading expert in the field, this guide examines the science of the underlying algorithms as well as the principles and best practices that govern the art of predictive analytics. It clearly explains the theory behind predictive analytics, teaches the methods, principles, and techniques for conducting predictive analytics projects, and offers tips and tricks that are essential for successful predictive modeling.

  19. Applied Techniques for High Bandwidth Data Transfers across Wide Area Networks

    Institute of Scientific and Technical Information of China (English)

    JasonLee; BillAllcock; 等

    2001-01-01

    Large distributed systems such as Computational/Data Grids require large amounts of data to be co-located with the computing facilities for processing. From our work developing a scalable distributed network cache, we have gained experience with techniques necessary to achieve high data throughput over high bandwidth Wide Area Networks (WAN). In this paper, we discuss several hardware and software design techniques, and then describe their application to an implementation of an enhanced FTP protocol called GridFTP. We describe results from the Supercomputing 2000 conference.

  20. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support.

    Science.gov (United States)

    Anderson, Cynthia M; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis.

  1. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    Energy Technology Data Exchange (ETDEWEB)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights into the development of a new model for estimating CCF probabilities, which is an extension of, and improvement over, the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.
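
    For orientation, here is a minimal sketch of the original Beta Factor bookkeeping that the abstract's new model extends: a component failure probability is split into independent and common cause parts and applied to a one-out-of-three redundant system. The numbers are purely illustrative.

```python
# Beta Factor sketch for a one-out-of-three redundant system (illustrative numbers).
q_component = 1e-3   # total failure probability of one train
beta = 0.1           # assumed fraction of failures that are common cause

q_independent = (1 - beta) * q_component
q_common = beta * q_component

# The system fails if all three trains fail independently,
# or if a single common cause event disables all of them at once.
q_system = q_independent ** 3 + q_common
print(f"1-out-of-3 system failure probability: {q_system:.3e}")  # dominated by the CCF term
```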

  2. X-ray micro-beam techniques and phase contrast tomography applied to biomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Fratini, Michela, E-mail: michela.fratini@gmail.com [Museo Storico della Fisica e Centro Studi e Ricerche Enrico Fermi, 00184 Roma (Italy); Dipartimento di Scienze, Università di Roma Tre, 00144 Roma (Italy); Campi, Gaetano [Institute of Crystallography, CNR, 00015 Monterotondo, Roma (Italy); Bukreeva, Inna [CNR NANOTEC-Institute of Nanotechnology, 00195 Roma (Italy); P.N. Lebedev Physical Institute RAS, 119991 Moscow (Russian Federation); Pelliccia, Daniele [School of Physics, Monash University, Victoria 3800 (Australia); Burghammer, Manfred [ESRF-The European Synchrotron, 3800 Grenoble (France); Tromba, Giuliana [Sincrotrone Trieste SCpA, 34149 Basovizza, Trieste (Italy); Cancedda, Ranieri; Mastrogiacomo, Maddalena [Dipartimento di Medicina Sperimentale dell’Università di Genova & AUO San Martino-IST Istituto Nazionale per la Ricerca sul Cancro, 16132 Genova (Italy); Cedola, Alessia [CNR NANOTEC-Institute of Nanotechnology, 00195 Roma (Italy)

    2015-12-01

    A deeper comprehension of the biomineralization (BM) process underpins developments in tissue engineering and regenerative medicine. Several in-vivo and in-vitro studies have been dedicated to this purpose via the application of 2D and 3D diagnostic techniques. Here, we develop a new methodology, based on different complementary experimental techniques (X-ray phase contrast tomography, micro-X-ray diffraction and micro-X-ray fluorescence scanning technique) coupled to new analytical tools. A qualitative and quantitative structural investigation, from the atomic to the micrometric length scale, is obtained for engineered bone tissues. The high spatial resolution achieved by X-ray scanning techniques allows us to monitor the bone formation at the first-formed mineral deposit at the organic–mineral interface within a porous scaffold. This work aims at providing a full comprehension of the morphology and functionality of the biomineralization process, which is of key importance for developing new drugs for preventing and healing bone diseases and for the development of bio-inspired materials.

  3. Study of Phase Reconstruction Techniques applied to Smith-Purcell Radiation Measurements

    CERN Document Server

    Delerue, Nicolas; Bezshyyko, Oleg; Khodnevych, Vitalii

    2015-01-01

    Measurements of coherent radiation at accelerators typically give the absolute value of the beam profile Fourier transform but not its phase. Phase reconstruction techniques such as the Hilbert transform or Kramers-Kronig reconstruction are used to recover this phase. We report a study of the performance of these methods and how to optimize the reconstructed profiles.
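
    A minimal sketch of the Hilbert-transform variant of such phase recovery: for a minimum-phase profile, the spectral phase is the negative Hilbert transform of the logarithm of the measured magnitude. The Gaussian test profile and time scale are illustrative assumptions.

```python
import numpy as np
from scipy.signal import hilbert

def minimum_phase_from_magnitude(magnitude):
    """Recover a minimum-phase spectral phase from |F| via the Hilbert transform."""
    log_mag = np.log(np.maximum(magnitude, 1e-12))   # guard against log(0)
    # scipy's hilbert returns the analytic signal x + i*H{x}; take H{x}.
    return -np.imag(hilbert(log_mag))

# Illustrative check on a Gaussian bunch profile (its magnitude spectrum is also Gaussian).
t = np.linspace(-5e-12, 5e-12, 1024)                 # seconds, illustrative scale
profile = np.exp(-t**2 / (2 * (0.5e-12) ** 2))
magnitude = np.abs(np.fft.fft(profile))
phase = minimum_phase_from_magnitude(magnitude)
reconstructed = np.real(np.fft.ifft(magnitude * np.exp(1j * phase)))
```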

  4. Study of Phase Reconstruction Techniques applied to Smith-Purcell Radiation Measurements

    CERN Document Server

    Delerue, Nicolas; Vieille-Grosjean, Mélissa; Bezshyyko, Oleg; Khodnevych, Vitalii

    2014-01-01

    Measurements of coherent radiation at accelerators typically give the absolute value of the beam profile Fourier transform but not its phase. Phase reconstruction techniques such as the Hilbert transform or Kramers-Kronig reconstruction are used to recover this phase. We report a study of the performance of these methods and how to optimize the reconstructed profiles.

  5. A Technical Review of Electrochemical Techniques Applied to Microbiologically Influenced Corrosion

    Science.gov (United States)

    1991-01-01

    in the literature for the study of MIC phenomena. Videla [65] has used this technique in a study of the action of Cladosporium resinae growth on the...ROSALES, Corrosion 44, 638 (1988). 65. H. A. VIDELA, The action of Cladosporium resinae growth on the electrochemical behavior of aluminum. Proc. Int. Conf

  6. X-ray micro-beam techniques and phase contrast tomography applied to biomaterials

    Science.gov (United States)

    Fratini, Michela; Campi, Gaetano; Bukreeva, Inna; Pelliccia, Daniele; Burghammer, Manfred; Tromba, Giuliana; Cancedda, Ranieri; Mastrogiacomo, Maddalena; Cedola, Alessia

    2015-12-01

    A deeper comprehension of the biomineralization (BM) process underpins developments in tissue engineering and regenerative medicine. Several in-vivo and in-vitro studies have been dedicated to this purpose via the application of 2D and 3D diagnostic techniques. Here, we develop a new methodology, based on different complementary experimental techniques (X-ray phase contrast tomography, micro-X-ray diffraction and micro-X-ray fluorescence scanning technique) coupled to new analytical tools. A qualitative and quantitative structural investigation, from the atomic to the micrometric length scale, is obtained for engineered bone tissues. The high spatial resolution achieved by X-ray scanning techniques allows us to monitor the bone formation at the first-formed mineral deposit at the organic-mineral interface within a porous scaffold. This work aims at providing a full comprehension of the morphology and functionality of the biomineralization process, which is of key importance for developing new drugs for preventing and healing bone diseases and for the development of bio-inspired materials.

  7. Reverse Time Migration: A Seismic Imaging Technique Applied to Synthetic Ultrasonic Data

    Directory of Open Access Journals (Sweden)

    Sabine Müller

    2012-01-01

    Ultrasonic echo testing is an increasingly used technique in civil engineering to investigate concrete building elements, to measure thickness, and to locate and characterise built-in components or inhomogeneities. Currently the Synthetic Aperture Focusing Technique (SAFT), which is closely related to Kirchhoff migration, is used in most cases for imaging. However, this method is known to have difficulties imaging steeply dipping interfaces as well as lower boundaries of tubes, voids or similar objects. We have transferred a processing technique from geophysics, the Reverse Time Migration (RTM) method, to improve the imaging of complicated geometries. By using the information from wide angle reflections as well as from multiple events, there are fewer limitations compared to SAFT. As a drawback, the required computing power is significantly higher than for the techniques currently used. Synthetic experiments have been performed on polyamide and concrete specimens to show the improvements compared to SAFT. We have been able to image vertical interfaces of step-like structures as well as the lower boundaries of circular objects. It has been shown that RTM is a step forward for ultrasonic testing in civil engineering.
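
    A toy sketch of the RTM idea on an exploding-reflector setup: traces recorded at the surface are re-injected in reverse time, and the back-propagated wavefield focuses at the reflector when the imaging condition is applied at the source time. The grid, velocity and wavelet are illustrative assumptions, far simpler than an engineering-grade implementation.

```python
import numpy as np

def step(u_prev, u, r2):
    """One explicit time step of the 2D acoustic wave equation (rigid borders)."""
    lap = np.zeros_like(u)
    lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] +
                       u[1:-1, 2:] + u[1:-1, :-2] - 4.0 * u[1:-1, 1:-1])
    return 2.0 * u - u_prev + r2 * lap

nz = nx = 120                                   # grid points
dt, dx, c, nt = 4e-4, 5.0, 2000.0, 800          # illustrative discretization
r2 = (c * dt / dx) ** 2                         # squared Courant number (stable here)

# Forward model: an "exploding reflector" at depth emits a Ricker wavelet;
# the wavefield is recorded along a shallow receiver row.
t = np.arange(nt) * dt
f0, t0 = 25.0, 0.04
arg = (np.pi * f0 * (t - t0)) ** 2
src = (1.0 - 2.0 * arg) * np.exp(-arg)

u_prev, u = np.zeros((nz, nx)), np.zeros((nz, nx))
data = np.zeros((nt, nx))
for it in range(nt):
    u[60, 60] += src[it]                        # point "reflector" source at depth
    u_prev, u = u, step(u_prev, u, r2)
    data[it] = u[1]                             # receivers near the surface

# Reverse time migration: re-inject the traces in reverse order, propagate
# backward, and take the wavefield at the source time t0 as the image.
u_prev, u = np.zeros((nz, nx)), np.zeros((nz, nx))
image, i0 = np.zeros((nz, nx)), int(t0 / dt)
for it in reversed(range(nt)):
    u[1] += data[it]
    u_prev, u = u, step(u_prev, u, r2)
    if it == i0:
        image = u.copy()                        # imaging condition at t = t0
print(np.unravel_index(np.abs(image).argmax(), image.shape))  # focuses near (60, 60)
```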

  8. Trends and Techniques in Visual Gaze Analysis

    CERN Document Server

    Stellmach, Sophie; Dachselt, Raimund; Lindley, Craig A

    2010-01-01

    Visualizing gaze data is an effective way of quickly interpreting eye tracking results. This paper presents a study investigating the benefits and limitations of visual gaze analysis among eye tracking professionals and researchers. The results were used to create a tool for visual gaze analysis within a Master's project.

  9. Analysis of Gopher Tortoise Population Estimation Techniques

    Science.gov (United States)

    2005-10-01

    terrestrial reptile that was once found throughout the southeastern United States from North Carolina into Texas. However, due to numerous factors...et al. 2000, Waddle 2000). Solar energy is used for thermoregulation and egg incubation. Also, tortoises are grazers (Garner and Landers 1981..."Evaluation and review of field techniques used to study and manage gopher tortoises." Pages 205-215 in Management of amphibians, reptiles, and small mammals

  10. Do trained practice nurses apply motivational interviewing techniques in primary care consultations?

    NARCIS (Netherlands)

    Noordman, J.; Lee, I. van der; Nielen, M.; Vlek, H.; Weijden, T. van der; Dulmen, S. van

    2012-01-01

    Background: Reducing the prevalence of unhealthy lifestyle behaviour could positively influence health. Motivational interviewing (MI) is used to promote change in unhealthy lifestyle behaviour as part of primary or secondary prevention. Whether MI is actually applied as taught is unknown. Practice

  11. Do trained practice nurses apply motivational interviewing techniques in primary care consultations?

    NARCIS (Netherlands)

    Noordman, J.; Lee, I. van; Nielen, M.; Vlek, H.; Weijden, T. van; Dulmen, S. van

    2012-01-01

    BACKGROUND: Reducing the prevalence of unhealthy lifestyle behaviour could positively influence health. Motivational interviewing (MI) is used to promote change in unhealthy lifestyle behaviour as part of primary or secondary prevention. Whether MI is actually applied as taught is unknown. Practice

  12. Time-lapse motion picture technique applied to the study of geological processes

    Science.gov (United States)

    Miller, R.D.; Crandell, D.R.

    1959-01-01

    Light-weight, battery-operated timers were built and coupled to 16-mm motion-picture cameras having apertures controlled by photoelectric cells. The cameras were placed adjacent to Emmons Glacier on Mount Rainier. The film obtained confirms the view that exterior time-lapse photography can be applied to the study of slow-acting geologic processes.

  13. Determination of Volatile Organic Compounds in the Atmosphere Using Two Complementary Analysis Techniques.

    Science.gov (United States)

    Alonso, L; Durana, N; Navazo, M; García, J A; Ilardia, J L

    1999-08-01

    During a preliminary field campaign of volatile organic compound (VOC) measurements carried out in an urban area, two complementary analysis techniques were applied to establish the technical and scientific bases for a strategy to monitor and control VOCs and photochemical oxidants in the Autonomous Community of the Basque Country. Integrated sampling was conducted using Tenax sorbent tubes with laboratory analysis by gas chromatography, and grab sampling with in situ analysis was also conducted using a portable gas chromatograph. With the first technique, monocyclic aromatic hydrocarbons appeared as the compounds with the highest mean concentrations. The second technique allowed the systematic analysis of eight chlorinated and aromatic hydrocarbons. Results of comparing both techniques, as well as the additional information obtained with the second technique, are included.

  14. Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.

    Science.gov (United States)

    Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic

    2017-02-16

    Recent developments in applied mathematics are bringing new tools capable of synthesizing knowledge across various disciplines and of finding hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques such as principal component analysis (PCA) with a topological point of view applied to the clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sports teams to cancer treatments. For the first time, this technique has been applied in soil science, to show the interaction between physical and chemical soil attributes and the main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database consisting of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, the results obtained were cross-checked against known grouping parameters including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content), but novel, subtle relationships (e.g., the magnitude of the anthropogenic effect in soil formation) were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, giving the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes.
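
    The abstract gives no implementation details; the sketch below is the standard Mapper construction at the heart of many TDA pipelines (a PCA filter, an overlapping interval cover, per-interval clustering, and a graph connecting clusters that share samples), run on synthetic data as a stand-in for the LUCAS attributes.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import DBSCAN

def mapper(X, n_intervals=8, overlap=0.3, eps=0.3):
    """Minimal Mapper: PCA filter, overlapping cover, per-bin clustering, nerve graph."""
    f = PCA(n_components=1).fit_transform(X).ravel()   # filter (lens) function
    lo, hi = f.min(), f.max()
    width = (hi - lo) / n_intervals
    nodes = []
    for i in range(n_intervals):                       # overlapping interval cover
        a = lo + (i - overlap) * width
        b = lo + (i + 1 + overlap) * width
        idx = np.where((f >= a) & (f <= b))[0]
        if idx.size:
            labels = DBSCAN(eps=eps, min_samples=5).fit_predict(X[idx])
            nodes += [set(idx[labels == lab]) for lab in set(labels) - {-1}]
    edges = {(i, j) for i in range(len(nodes)) for j in range(i + 1, len(nodes))
             if nodes[i] & nodes[j]}                   # connect clusters sharing samples
    return nodes, edges

# Synthetic stand-in data: a noisy circle, whose Mapper graph should form a loop.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 400)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((400, 2))
nodes, edges = mapper(X)
print(len(nodes), "clusters,", len(edges), "overlap edges")
```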

  15. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the binary-format raw message content in order to obtain the information it contains. Based on the TCP/IP protocol stack specification, the packets are then restored according to the protocol format and content at each protocol layer, up to the actual data transferred at the application tier.
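
    To make the layer-by-layer restoration concrete, here is a minimal sketch that decodes an IPv4 header from raw bytes with Python's standard struct module; the sample packet is fabricated for illustration.

```python
import struct
import socket

def parse_ipv4_header(packet: bytes):
    """Decode the fixed 20-byte IPv4 header according to the protocol format."""
    version_ihl, tos, total_len, ident, flags_frag, ttl, proto, checksum, src, dst = \
        struct.unpack("!BBHHHBBH4s4s", packet[:20])
    return {
        "version": version_ihl >> 4,
        "header_len": (version_ihl & 0x0F) * 4,   # in bytes
        "total_length": total_len,
        "ttl": ttl,
        "protocol": proto,                        # 6 = TCP, 17 = UDP
        "src": socket.inet_ntoa(src),
        "dst": socket.inet_ntoa(dst),
    }

# Fabricated example: a minimal IPv4/TCP header from 192.168.0.1 to 10.0.0.2.
sample = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 40, 1, 0, 64, 6, 0,
                     socket.inet_aton("192.168.0.1"), socket.inet_aton("10.0.0.2"))
print(parse_ipv4_header(sample))
```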

  16. Identifying Engineering Students' English Sentence Reading Comprehension Errors: Applying a Data Mining Technique

    Science.gov (United States)

    Tsai, Yea-Ru; Ouyang, Chen-Sen; Chang, Yukon

    2016-01-01

    The purpose of this study is to propose a diagnostic approach to identify engineering students' English reading comprehension errors. Student data were collected during the process of reading texts of English for science and technology on a web-based cumulative sentence analysis system. For the analysis, the association-rule, data mining technique…

  17. Modelling laser speckle photographs of decayed teeth by applying a digital image information technique

    Science.gov (United States)

    Ansari, M. Z.; da Silva, L. C.; da Silva, J. V. P.; Deana, A. M.

    2016-09-01

    We report on the application of a digital image model to assess early carious lesions on teeth. While decay was in its early stages, the lesions were illuminated with a laser and laser speckle images were obtained. Due to the differences in the optical properties between healthy and carious tissue, both regions produced different scatter patterns. The digital image information technique allowed us to produce colour-coded 3D surface plots of the intensity information in the speckle images, where the height (on the z-axis) and the colour in the rendering correlate with the intensity of a pixel in the image. The quantitative changes in colour component density enhance the contrast between the decayed and sound tissue, and visualization of the carious lesions becomes significantly clearer. Therefore, the proposed technique may be adopted in the early diagnosis of carious lesions.
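
    A minimal sketch of the described rendering, with matplotlib assumed in place of whatever software the authors used: pixel intensity of a synthetic speckle image is mapped to both height and colour in a 3D surface plot.

```python
import numpy as np
import matplotlib.pyplot as plt

# Synthetic stand-in for a laser speckle image (real data would be a camera frame).
rng = np.random.default_rng(0)
speckle = np.abs(rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))) ** 2

x, y = np.meshgrid(np.arange(64), np.arange(64))
fig = plt.figure()
ax = fig.add_subplot(projection="3d")
# Height (z) and colour both encode pixel intensity, as in the paper's rendering.
ax.plot_surface(x, y, speckle, cmap="viridis")
plt.show()
```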

  18. Wavelet-based Adaptive Techniques Applied to Turbulent Hypersonic Scramjet Intake Flows

    CERN Document Server

    Frauholz, Sarah; Reinartz, Birgit U; Müller, Siegfried; Behr, Marek

    2013-01-01

    The simulation of hypersonic flows is computationally demanding due to large gradients of the flow variables caused by strong shock waves and thick boundary or shear layers. The resolution of those gradients imposes the use of extremely small cells in the respective regions. Taking turbulence into account intensifies the variation in scales even more. Furthermore, hypersonic flows have been shown to be extremely grid sensitive. For the simulation of three-dimensional configurations of engineering applications, this results in a huge number of cells and prohibitive computational time. Therefore, modern adaptive techniques can provide a gain with respect to computational costs and accuracy, allowing the generation of locally highly resolved flow regions where they are needed and retaining an otherwise smooth distribution. An h-adaptive technique based on wavelets is employed for the solution of hypersonic flows. The compressible Reynolds averaged Navier-Stokes equations are solved using a differential Reynolds s...

  19. Magnetic Resonance Techniques Applied to the Diagnosis and Treatment of Parkinson’s Disease

    Science.gov (United States)

    de Celis Alonso, Benito; Hidalgo-Tobón, Silvia S.; Menéndez-González, Manuel; Salas-Pacheco, José; Arias-Carrión, Oscar

    2015-01-01

    Parkinson’s disease (PD) affects at least 10 million people worldwide. It is a neurodegenerative disease, which is currently diagnosed by neurological examination. No neuroimaging investigation or blood biomarker is available to aid diagnosis and prognosis. Most effort toward diagnosis using magnetic resonance (MR) has been focused on the use of structural/anatomical neuroimaging and diffusion tensor imaging (DTI). However, deep brain stimulation, a current strategy for treating PD, is guided by MR imaging (MRI). For clinical prognosis, diagnosis, and follow-up investigations, blood oxygen level-dependent MRI, DTI, spectroscopy, and transcranial magnetic stimulation have been used. These techniques represent the state of the art in the last 5 years. Here, we focus on MR techniques for the diagnosis and treatment of Parkinson’s disease. PMID:26191037

  20. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    Science.gov (United States)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically utilized sub-mm bead immersion techniques extensively, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners on meteorites for measuring bulk volumes. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and long processing times for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
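
    The bulk-volume step of such a scan-based workflow reduces to computing the volume enclosed by the triangulated shape model. A standard signed-tetrahedron sum does this; the cube mesh below is a stand-in for a scanned sample.

```python
import numpy as np

def mesh_volume(vertices, faces):
    """Volume enclosed by a closed triangular mesh via the divergence theorem.

    Each triangle (v0, v1, v2) contributes the signed volume of the
    tetrahedron it forms with the origin: det([v0 v1 v2]) / 6.
    """
    v = vertices[faces]                       # shape (n_faces, 3 vertices, 3 coords)
    return abs(np.einsum("ij,ij->i", v[:, 0], np.cross(v[:, 1], v[:, 2])).sum()) / 6.0

# Unit cube as a stand-in for a scanned shape model (12 triangles, outward winding).
verts = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],
                  [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1]], dtype=float)
faces = np.array([[0, 2, 1], [0, 3, 2], [4, 5, 6], [4, 6, 7], [0, 1, 5], [0, 5, 4],
                  [1, 2, 6], [1, 6, 5], [2, 3, 7], [2, 7, 6], [3, 0, 4], [3, 4, 7]])
print(mesh_volume(verts, faces))              # -> 1.0
```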

  1. Microstrip coupling techniques applied to thin-film Josephson junctions at microwave frequencies

    DEFF Research Database (Denmark)

    Sørensen, O H; Pedersen, Niels Falsig; Mygind, Jesper

    1981-01-01

    Three different schemes for coupling to low impedance Josephson devices have been investigated. They all employ superconducting thin-film microstrip circuit techniques. The schemes are: (i) a quarterwave stepped impedance transformer, (ii) a microstrip resonator, (iii) an adjustable impedance transformer in inverted microstrip. Using single microbridges to probe the performance, we found that the most promising scheme in terms of coupling efficiency and useful bandwidth was the adjustable inverted microstrip transformer.

  2. Improving throughput and user experience for information intensive websites by applying HTTP compression technique.

    Science.gov (United States)

    Malla, Ratnakar

    2008-11-06

    HTTP compression is a technique specified as part of the W3C HTTP 1.0 standard. It allows HTTP servers to take advantage of the GZIP compression technology that is built into the latest browsers. A brief survey of medical informatics websites shows that compression is not enabled. With compression enabled, downloaded file sizes are reduced by more than 50%, and typical transaction time is also reduced from 20 to 8 minutes, thus providing a better user experience.
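
    A minimal sketch of the payoff described, using Python's standard gzip module to compare raw and compressed sizes of a repetitive HTML payload; the sample content is fabricated. In practice the server compresses only when the browser advertises Accept-Encoding: gzip.

```python
import gzip

# Fabricated stand-in for an information-intensive HTML page.
html = ("<tr><td>patient record</td><td>value</td></tr>\n" * 2000).encode("utf-8")

compressed = gzip.compress(html)
ratio = 100 * (1 - len(compressed) / len(html))
print(f"raw: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.0f}% smaller)")
```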

  3. A neuro-evolutive technique applied for predicting the liquid crystalline property of some organic compounds

    Science.gov (United States)

    Drăgoi, Elena-Niculina; Curteanu, Silvia; Lisa, Cătălin

    2012-10-01

    A simple self-adaptive version of the differential evolution algorithm was applied for simultaneous architectural and parametric optimization of feed-forward neural networks, used to classify the liquid crystalline property of a series of organic compounds. The developed optimization methodology was called self-adaptive differential evolution neural network (SADE-NN) and has the following characteristics: the base vector used is chosen as the best individual in the current population, two differential terms participate in the mutation process, the crossover type is binomial, a simple self-adaptive mechanism is employed to determine the near-optimal control parameters of the algorithm, and the integration of the neural network into the differential evolution algorithm is performed using a direct encoding scheme. It was found that a network with one hidden layer is able to make accurate predictions, indicating that the proposed methodology is efficient and, owing to its flexibility, can be applied to a large range of problems.
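
    The abstract pins the variant down precisely: best base vector, two difference terms, binomial crossover, and self-adapted control parameters, i.e., a self-adaptive DE/best/2/bin. A generic sketch of that loop on a toy objective (not the neural-network optimization of the paper) follows; the jDE-style parameter resampling is an illustrative choice of self-adaptation mechanism.

```python
import numpy as np

def sade_best2bin(objective, bounds, pop_size=30, generations=200, seed=0):
    """Self-adaptive DE/best/2/bin: F and CR evolve alongside each individual."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, (pop_size, dim))
    F = np.full(pop_size, 0.5)                  # per-individual control parameters
    CR = np.full(pop_size, 0.9)
    fit = np.array([objective(x) for x in pop])
    for _ in range(generations):
        best = pop[fit.argmin()]
        for i in range(pop_size):
            # jDE-style self-adaptation: occasionally resample F and CR.
            Fi = rng.uniform(0.1, 1.0) if rng.random() < 0.1 else F[i]
            CRi = rng.random() if rng.random() < 0.1 else CR[i]
            r1, r2, r3, r4 = rng.choice([k for k in range(pop_size) if k != i],
                                        4, replace=False)
            # DE/best/2 mutation: best base vector plus two difference terms.
            mutant = best + Fi * (pop[r1] - pop[r2]) + Fi * (pop[r3] - pop[r4])
            mutant = np.clip(mutant, lo, hi)
            # Binomial crossover with at least one mutant component.
            mask = rng.random(dim) < CRi
            mask[rng.integers(dim)] = True
            trial = np.where(mask, mutant, pop[i])
            f_trial = objective(trial)
            if f_trial <= fit[i]:               # greedy selection keeps adapted params
                pop[i], fit[i], F[i], CR[i] = trial, f_trial, Fi, CRi
    return pop[fit.argmin()], fit.min()

# Toy usage: minimize the sphere function in 5 dimensions.
x_best, f_best = sade_best2bin(lambda x: np.sum(x ** 2), [(-5, 5)] * 5)
print(x_best, f_best)
```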

  4. IPR techniques applied to a multimedia environment in the HYPERMEDIA project

    Science.gov (United States)

    Munoz, Alberto; Ribagorda, Arturo; Sierra, Jose M.

    1999-04-01

    Watermarking techniques have proved to be a good method for protecting intellectual property rights in digital formats. However, the ease of processing information on digital platforms also offers many chances to eliminate marks embedded in the data, owing to the wide variety of techniques for modifying information in digital formats. This paper analyzes a selection of the most interesting methods for image watermarking in order to test their qualities. The comparison of these watermarking techniques has shown new interesting lines of work. Some changes and extensions to these methods are proposed to increase their robustness against common attacks and specific watermark attacks. This work has been carried out in order to provide the HYPERMEDIA project with an efficient tool for protecting IPR. The objective of this project is to establish an experimental stage for handling and delivering continuous multimedia material (audiovisuals) in a multimedia service environment, allowing the user to navigate in the hyperspace through databases which belong to actors of the service chain, while protecting the IPR of authors or owners.

  5. An efficient permeability scaling-up technique applied to the discretized flow equations

    Energy Technology Data Exchange (ETDEWEB)

    Urgelli, D.; Ding, Yu [Institut Francais du Petrole, Rueil Malmaison (France)

    1997-08-01

    Grid-block permeability scaling-up for numerical reservoir simulations has been discussed for a long time in the literature. It is now recognized that a full permeability tensor is needed to get an accurate reservoir description at large scale. However, two major difficulties are encountered: (1) grid-block permeability cannot be properly defined because it depends on boundary conditions; (2) discretization of flow equations with a full permeability tensor is not straightforward and little work has been done on this subject. In this paper, we propose a new method, which allows us to get around both difficulties. As the two major problems are closely related, a global approach will preserve the accuracy. So, in the proposed method, the permeability up-scaling technique is integrated in the discretized numerical scheme for flow simulation. The permeability is scaled-up via the transmissibility term, in accordance with the fluid flow calculation in the numerical scheme. A finite-volume scheme is particularly studied, and the transmissibility scaling-up technique for this scheme is presented. Some numerical examples are tested for flow simulation. This new method is compared with some published numerical schemes for full permeability tensor discretization where the full permeability tensor is scaled-up through various techniques. Comparing the results with fine grid simulations shows that the new method is more accurate and more efficient.
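
    As a much-simplified scalar illustration of folding permeability into the transmissibility term of a finite-volume scheme (the paper's full-tensor treatment is considerably more involved): fine-cell permeabilities enter the interblock transmissibility through harmonic averaging, which correctly lets low-permeability cells throttle the connection. The values below are illustrative.

```python
import numpy as np

def interblock_transmissibility(k1, k2, dx1, dx2, area):
    """Two-point flux transmissibility between neighbouring cells (scalar permeability).

    Harmonic averaging honours the series arrangement of the two half-cells.
    """
    half1 = k1 * area / (dx1 / 2)       # half-cell conductances
    half2 = k2 * area / (dx2 / 2)
    return 1.0 / (1.0 / half1 + 1.0 / half2)

# Coarse-block effective permeability from fine cells in series along the flow path.
k_fine = np.array([500.0, 120.0, 5.0, 300.0])    # mD, including a low-permeability streak
k_eff = len(k_fine) / np.sum(1.0 / k_fine)       # harmonic mean dominates
dx, area = 10.0, 100.0
print(k_eff, interblock_transmissibility(k_eff, k_eff, dx, dx, area))
```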

  6. A photoacoustic technique applied to detection of ethylene emissions in edible coated passion fruit

    Energy Technology Data Exchange (ETDEWEB)

    Alves, G V L; Santos, W C dos; Vargas, H; Silva, M G da [Laboratorio de Ciencias FIsicas, Universidade Estadual do Norte Fluminense Darcy Ribeiro, Av. Alberto Lamego 2000, 28013-602, Campos dos Goytacazes, RJ (Brazil); Waldman, W R [Laboratorio de Ciencias QuImicas, Universidade Estadual do Norte Fluminense Darcy Ribeiro (Brazil); Oliveira, J G, E-mail: mgs@uenf.b [Laboratorio de Melhoramento Genetico Vegetal, Universidade Estadual do Norte Fluminense Darcy Ribeiro (Brazil)

    2010-03-01

    Photoacoustic spectroscopy was applied to study the physiological behavior of passion fruit when coated with edible films. The results have shown a reduction of the ethylene emission rate. Weight loss monitoring has not shown any significant differences between the coated and uncoated passion fruit. On the other hand, slower color changes of coated samples suggest a slowdown of the ripening process in coated passion fruit.

  7. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Science.gov (United States)

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  8. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Directory of Open Access Journals (Sweden)

    Ted von Hippel

    We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January, 2009 and November, 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write; although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  9. Correlation network analysis applied to complex biofilm communities.

    Directory of Open Access Journals (Sweden)

    Ana E Duran-Pinedo

    The complexity of the human microbiome makes it difficult to reveal organizational principles of the community, and even more challenging to generate testable hypotheses. It has been suggested that in the gut microbiome, species such as Bacteroides thetaiotaomicron are keystone in maintaining the stability and functional adaptability of the microbial community. In this study, we investigate the interspecies associations in a complex microbial biofilm by applying systems biology principles. Using correlation network analysis, we identified bacterial modules that represent important microbial associations within the oral community. We used dental plaque as a model community because of its high diversity and the well known species-species interactions that are common in the oral biofilm. We analyzed samples from healthy individuals as well as from patients with periodontitis, a polymicrobial disease. Using results obtained by checkerboard hybridization on cultivable bacteria, we identified modules that correlated well with microbial complexes previously described. Furthermore, we extended our analysis using the Human Oral Microbe Identification Microarray (HOMIM), which includes a large number of bacterial species, among them uncultivated organisms present in the mouth. Two distinct microbial communities appeared in healthy individuals, while there was one major type in disease. Bacterial modules in all communities did not overlap, indicating that bacteria were able to effectively re-associate with new partners depending on the environmental conditions. We then identified hubs that could act as keystone species in the bacterial modules. Based on those results, we then cultured a not-yet-cultivated microorganism, Tannerella sp. OT286 (clone BU063). After two rounds of enrichment by a selected helper (Prevotella oris OT311), we obtained colonies of Tannerella sp. OT286 growing on blood agar plates. This system-level approach would open the possibility of

  10. Handbook of Systems Analysis: Volume 1. Overview. Chapter 2. The Genesis of Applied Systems Analysis

    OpenAIRE

    1981-01-01

    The International Institute for Applied Systems Analysis is preparing a Handbook of Systems Analysis, which will appear in three volumes: Volume 1: Overview is aimed at a widely varied audience of producers and users of systems analysis studies. Volume 2: Methods is aimed at systems analysts and other members of systems analysis teams who need basic knowledge of methods in which they are not expert; this volume contains introductory overviews of such methods. Volume 3: Cases co...

  11. Assessment of ground-based monitoring techniques applied to landslide investigations

    Science.gov (United States)

    Uhlemann, S.; Smith, A.; Chambers, J.; Dixon, N.; Dijkstra, T.; Haslam, E.; Meldrum, P.; Merritt, A.; Gunn, D.; Mackay, J.

    2016-01-01

    A landslide complex in the Whitby Mudstone Formation at Hollin Hill, North Yorkshire, UK is periodically re-activated in response to rainfall-induced pore-water pressure fluctuations. This paper compares long-term measurements (i.e., 2009-2014) obtained from a combination of monitoring techniques that have been employed together for the first time on an active landslide. The results highlight the relative performance of the different techniques, and can provide guidance for researchers and practitioners for selecting and installing appropriate monitoring techniques to assess unstable slopes. Particular attention is given to the spatial and temporal resolutions offered by the different approaches that include: Real Time Kinematic-GPS (RTK-GPS) monitoring of a ground surface marker array, conventional inclinometers, Shape Acceleration Arrays (SAA), tilt meters, active waveguides with Acoustic Emission (AE) monitoring, and piezometers. High spatial resolution information has allowed locating areas of stability and instability across a large slope. This has enabled identification of areas where further monitoring efforts should be focused. High temporal resolution information allowed the capture of 'S'-shaped slope displacement-time behaviour (i.e. phases of slope acceleration, deceleration and stability) in response to elevations in pore-water pressures. This study shows that a well-balanced suite of monitoring techniques that provides high temporal and spatial resolutions on both measurement and slope scale is necessary to fully understand failure and movement mechanisms of slopes. In the case of the Hollin Hill landslide it enabled detailed interpretation of the geomorphological processes governing landslide activity. It highlights the benefit of regularly surveying a network of GPS markers to determine areas for installation of movement monitoring techniques that offer higher resolution both temporally and spatially. The small sensitivity of tilt meter measurements

  12. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication based curriculum contexts, they can be used as an analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  13. Applying DNA computation to intractable problems in social network analysis.

    Science.gov (United States)

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields, such as Web 2.0 for Web applications and product development in industries. However, some definitions in SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. The accuracy and feasible time complexity of these approaches, discussed in the paper, demonstrate that DNA computing can be used to facilitate the development of SNA.

  14. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  15. Enhanced nonlinear iterative techniques applied to a non-equilibrium plasma flow

    Energy Technology Data Exchange (ETDEWEB)

    Knoll, D.A.; McHugh, P.R. [Idaho National Engineering Lab., Idaho Falls, ID (United States)]

    1996-12-31

    We study the application of enhanced nonlinear iterative methods to the steady-state solution of a system of two-dimensional convection-diffusion-reaction partial differential equations that describe the partially-ionized plasma flow in the boundary layer of a tokamak fusion reactor. This system of equations is characterized by multiple time and spatial scales, and contains highly anisotropic transport coefficients due to a strong imposed magnetic field. We use Newton's method to linearize the nonlinear system of equations resulting from an implicit, finite volume discretization of the governing partial differential equations, on a staggered Cartesian mesh. The resulting linear systems are neither symmetric nor positive definite, and are poorly conditioned. Preconditioned Krylov iterative techniques are employed to solve these linear systems. We investigate both a modified and a matrix-free Newton-Krylov implementation, with the goal of reducing the CPU cost associated with the numerical formation of the Jacobian. A combination of a damped iteration, one-way multigrid and a pseudo-transient continuation technique is used to enhance global nonlinear convergence and CPU efficiency. GMRES is employed as the Krylov method, with Incomplete Lower-Upper (ILU) factorization preconditioning. The goal is to construct a combination of nonlinear and linear iterative techniques for this complex physical problem that optimizes trade-offs between robustness, CPU time, memory requirements, and code complexity. It is shown that a one-way multigrid implementation provides significant CPU savings for fine grid calculations. Performance comparisons of the modified Newton-Krylov and matrix-free Newton-Krylov algorithms will be presented.
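
    For readers unfamiliar with the approach, a minimal matrix-free Newton-Krylov sketch is given below using SciPy's newton_krylov solver on a toy one-dimensional diffusion-reaction equation; the Jacobian is probed only through finite-difference directional derivatives, as in the matrix-free variant above. The edge-plasma equations, ILU preconditioning and pseudo-transient continuation of the paper are not reproduced.

        # Matrix-free Newton-Krylov on F(u) = u'' - u**2 + 1 = 0 with
        # homogeneous Dirichlet boundaries (toy stand-in for the plasma system).
        import numpy as np
        from scipy.optimize import newton_krylov

        n = 50
        h = 1.0 / (n + 1)

        def residual(u):
            d2u = np.empty_like(u)
            d2u[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
            d2u[0] = (u[1] - 2.0 * u[0]) / h**2        # boundary value u(0) = 0
            d2u[-1] = (u[-2] - 2.0 * u[-1]) / h**2     # boundary value u(1) = 0
            return d2u - u**2 + 1.0

        # 'lgmres' is the inner Krylov solver; no Jacobian matrix is ever formed.
        u = newton_krylov(residual, np.zeros(n), method="lgmres")
        print(abs(residual(u)).max())                  # small residual at the root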

  16. Applying Intelligent Computing Techniques to Modeling Biological Networks from Expression Data

    Institute of Scientific and Technical Information of China (English)

    Wei-Po Lee; Kung-Cheng Yang

    2008-01-01

    Constructing biological networks is one of the most important issues in systems biology. However, constructing a network from data manually takes a considerably large amount of time; therefore, an automated procedure is advocated. To automate the procedure of network construction, in this work we use two intelligent computing techniques, genetic programming and neural computation, to infer two kinds of network models that use continuous variables. To verify the presented approaches, experiments have been conducted, and the preliminary results show that both approaches can be used to infer networks successfully.

  17. Applying machine learning techniques for forecasting flexibility of virtual power plants

    DEFF Research Database (Denmark)

    MacDougall, Pamela; Kosek, Anna Magdalena; Bindner, Henrik W.

    2016-01-01

    Previous and existing evaluations of available flexibility using small-device demand response have typically been done with detailed information about end-user systems. With such large numbers of devices, relying on low-level information has both privacy and computational limitations. We propose a black box ... hidden layer artificial neural network (ANN). Both techniques are used to model a relationship from the aggregator portfolio state and requested ramp power to the longevity of the delivered flexibility. Using validated individual household models, a smart controlled aggregated virtual power plant ...

  18. Applying stakeholder Delphi techniques for planning sustainable use of aquatic resources

    DEFF Research Database (Denmark)

    Lund, Søren; Banta, Gary Thomas; Bunting, Stuart W

    2015-01-01

    and Vietnam. The purpose of this paper is to give an account of how the stakeholder Delphi method was adapted and applied to support the participatory integrated action planning for sustainable use of aquatic resources facilitated within the HighARCS project. An account of the steps taken and results recorded...... of the stakeholder Delphi requires the presence of multidisciplinary and facilitating skills and competences within the implementing teams which should be considered before deciding to include a Stakeholder Delphi as a decision-making tool...

  19. Comparison of Hydrogen Sulfide Analysis Techniques

    Science.gov (United States)

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  20. Fragrance composition of Dendrophylax lindenii (Orchidaceae) using a novel technique applied in situ

    Directory of Open Access Journals (Sweden)

    James J. Sadler

    2012-02-01

    The ghost orchid, Dendrophylax lindenii (Lindley) Bentham ex Rolfe (Orchidaceae), is one of North America's rarest and best-known orchids. Native to Cuba and SW Florida, where it frequents shaded swamps as an epiphyte, the species has experienced steady decline. Little information exists on D. lindenii's biology in situ, raising conservation concerns. During the summer of 2009, at an undisclosed population in Collier County, FL, a substantial number (ca. 13) of plants initiated anthesis, offering a unique opportunity to study this species in situ. We report a new technique aimed at capturing the floral headspace of D. lindenii in situ, and identified volatile compounds using gas chromatography mass spectrometry (GC/MS). All components of the floral scent were identified as terpenoids with the exception of methyl salicylate. The most abundant compound was the sesquiterpene (E,E)-α-farnesene (71%), followed by (E)-β-ocimene (9%) and methyl salicylate (8%). Other compounds were: linalool (5%), sabinene (4%), (E)-α-bergamotene (2%), α-pinene (1%), and 3-carene (1%). Interestingly, (E,E)-α-farnesene has previously been associated with pestiferous insects (e.g., Hemiptera). The other compounds are common floral scent constituents in other angiosperms, suggesting that our in situ technique was effective. Volatile capture was, therefore, possible without imposing physical harm (e.g., inflorescence detachment) to this rare orchid.

  1. Therapeutic techniques applied in the heavy-ion therapy at IMP

    Science.gov (United States)

    Li, Qiang; Sihver, Lembit

    2011-04-01

    Superficially-placed tumors have been treated with carbon ions at the Institute of Modern Physics (IMP), Chinese Academy of Sciences (CAS), since November 2006. Up to now, 103 patients have been irradiated in the therapy terminal of the Heavy Ion Research Facility in Lanzhou (HIRFL) at IMP, where carbon-ion beams with energies up to 100 MeV/u can be supplied and a passive beam delivery system has been developed and commissioned. A number of therapeutic and clinical experiences concerning heavy-ion therapy have been acquired at IMP. To extend the heavy-ion therapy project to deep-seated tumor treatment, a dedicated horizontal beam line has been constructed in the Cooling Storage Ring (CSR), a synchrotron that uses the HIRFL as an injector, and is now in operation. Therapeutic high-energy carbon-ion beams, extracted from the HIRFL-CSR through slow extraction techniques, have been supplied to the deep-seated tumor therapy terminal. After the beam delivery, shaping and monitoring devices installed in the therapy terminal at HIRFL-CSR had been validated through therapeutic beam tests, deep-seated tumor treatment with high-energy carbon ions started in March 2009. The therapeutic techniques used at IMP, in terms of the beam delivery system, conformal irradiation method and treatment planning, are introduced in this paper.

  2. Applying Reflective Middleware Techniques to Optimize a QoS-enabled CORBA Component Model Implementation

    Science.gov (United States)

    Wang, Nanbor; Parameswaran, Kirthika; Kircher, Michael; Schmidt, Douglas

    2003-01-01

    Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality of service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.

  3. Applying Tiab’s direct synthesis technique to dilatant non-Newtonian/Newtonian fluids

    Directory of Open Access Journals (Sweden)

    Javier Andrés Martínez

    2011-08-01

    Non-Newtonian fluids, such as polymer solutions, have been used by the oil industry for many years as fracturing agents and drilling muds. These solutions, which normally include thickened water and jelled fluids, are injected into the formation to enhance oil recovery by improving sweep efficiency. It is worth noting that some heavy oils behave as non-Newtonian fluids. Non-Newtonian fluids lack direct proportionality between applied shear stress and shear rate, and viscosity varies with shear rate depending on whether the fluid is pseudoplastic or dilatant: viscosity decreases as shear rate increases for the former, whilst the reverse takes place for the latter. Mathematical models of conventional fluids thus fail when applied to non-Newtonian fluids. The pressure derivative curve is introduced in this descriptive work for a dilatant fluid and its pattern was observed. Tiab's direct synthesis (TDS) methodology was used as a tool for interpreting pressure transient data to estimate effective permeability, skin factors and non-Newtonian bank radius. The methodology was successfully verified by its application to synthetic examples. Also, compared to pseudoplastic behavior, it was found that the radial flow regime in the Newtonian zone of dilatant fluids took longer to form, regarding both the flow behavior index and the consistency factor.
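
    The pseudoplastic/dilatant distinction follows from the power-law (Ostwald-de Waele) model, in which the apparent viscosity is mu = K * (shear rate)**(n - 1): n > 1 gives a dilatant fluid, n < 1 a pseudoplastic one. A minimal numerical illustration, with made-up consistency factor K and flow behavior index n:

        # Apparent viscosity of a power-law fluid at several shear rates.
        import numpy as np

        def apparent_viscosity(shear_rate, K, n):
            """Ostwald-de Waele model: mu_app = K * shear_rate**(n - 1)."""
            return K * shear_rate ** (n - 1.0)

        shear = np.array([0.1, 1.0, 10.0, 100.0])          # shear rates, 1/s
        print(apparent_viscosity(shear, K=0.5, n=1.4))     # dilatant: rises
        print(apparent_viscosity(shear, K=0.5, n=0.6))     # pseudoplastic: falls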

  4. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Matthew W. [Iowa State Univ., Ames, IA (United States)]

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  5. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  6. Pattern recognition software and techniques for biological image analysis.

    Science.gov (United States)

    Shamir, Lior; Delaney, John D; Orlov, Nikita; Eckley, D Mark; Goldberg, Ilya G

    2010-11-24

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.
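
    A bare-bones illustration of the approach described above is to train a generic classifier on labelled images instead of hand-tuning an image-processing pipeline; here scikit-learn's bundled digit images stand in for microscopy data, and the classifier choice is arbitrary.

        # Train a generic pattern-recognition model on labelled images.
        from sklearn.datasets import load_digits
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        X, y = load_digits(return_X_y=True)           # 8x8 images, flattened
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X_tr, y_tr)
        print(clf.score(X_te, y_te))                  # held-out accuracy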

  7. Acoustic emission partial discharge detection technique applied to fault diagnosis: Case studies of generator transformers

    Directory of Open Access Journals (Sweden)

    Shanker Tangella Bhavani

    2016-01-01

    In power transformers, locating the partial discharge (PD) source is as important as identifying it. Acoustic Emission (AE) sensing offers a good solution for both PD detection and PD source location identification. In this paper, the principle of the AE technique, along with in-situ findings of the online acoustic emission signals captured from partial discharges on a number of Generator Transformers (GT), is discussed. Of the two cases discussed, the first deals with Acoustic Emission Partial Discharge (AEPD) tests on two identical transformers, and the second deals with the AEPD measurement of a transformer carried out on different occasions (years). These transformers are from a hydropower station and a thermal power station in India. Tests conducted on identical transformers provide the opportunity to compare AE signal amplitudes from the two transformers. These case studies also help in comprehending the efficacy of integrating Dissolved Gas Analysis (DGA) data with AEPD test results in detecting and locating the PD source.

  8. An Encoding Technique for Multiobjective Evolutionary Algorithms Applied to Power Distribution System Reconfiguration

    Directory of Open Access Journals (Sweden)

    J. L. Guardado

    2014-01-01

    Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.

  9. GPU peer-to-peer techniques applied to a cluster interconnect

    CERN Document Server

    Ammendola, Roberto; Biagioni, Andrea; Bisson, Mauro; Fatica, Massimiliano; Frezza, Ottorino; Cicero, Francesca Lo; Lonardo, Alessandro; Mastrostefano, Enrico; Paolucci, Pier Stanislao; Rossetti, Davide; Simula, Francesco; Tosoratto, Laura; Vicini, Piero

    2013-01-01

    Modern GPUs support special protocols to exchange data directly across the PCI Express bus. While these protocols could be used to reduce GPU data transmission times, basically by avoiding staging to host memory, they require specific hardware features which are not available on current generation network adapters. In this paper we describe the architectural modifications required to implement peer-to-peer access to NVIDIA Fermi- and Kepler-class GPUs on an FPGA-based cluster interconnect. In addition, the current software implementation, which integrates this feature by minimally extending the RDMA programming model, is discussed, as well as some issues raised while employing it in a higher level API like MPI. Finally, the current limits of the technique are studied by analyzing the performance improvements on low-level benchmarks and on two GPU-accelerated applications, showing when and how they seem to benefit from the GPU peer-to-peer method.

  10. Emerging and Innovative Techniques for Arsenic Removal Applied to a Small Water Supply System

    Directory of Open Access Journals (Sweden)

    António J. Alçada

    2009-12-01

    The impact of arsenic on human health has led its drinking water MCL to be drastically reduced from 50 to 10 ppb. Consequently, arsenic levels in many water supply sources have become critical. This has resulted in technical and operational impacts on many drinking water treatment plants that have required onerous upgrading to meet the new standard. This becomes a very sensitive issue in the context of water scarcity and climate change, given the expected increasing demand on groundwater sources. This work presents a case study that describes the development of low-cost techniques for efficient arsenic control in drinking water. The results obtained at the Manteigas WTP (Portugal) demonstrate the successful implementation of an effective and flexible process of reactive filtration using iron oxide. At real scale, very high removal efficiencies of over 95% were obtained.

  11. Study of different filtering techniques applied to spectra from airborne gamma spectrometry.

    Science.gov (United States)

    Wilhelm, Emilien; Gutierrez, Sébastien; Arbor, Nicolas; Ménard, Stéphanie; Nourreddine, Abdel-Mjid

    2016-11-01

    One of the features of the spectra obtained by airborne gamma spectrometry is the low counting statistics due to a short acquisition time (1 s) and a large source-detector distance (40 m), which leads to large statistical fluctuations. These fluctuations bring large uncertainty into radionuclide identification and the determination of their respective activities with the window method recommended by the IAEA, especially for low-level radioactivity. Different types of filter could be used on the spectra in order to remove these statistical fluctuations. The present work compares the results obtained with such filters, in terms of errors over the whole gamma energy range of the filtered spectra, with the window method. These results are used to determine which filtering technique is the most suitable in combination with some method for total stripping of the spectrum.
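
    As an illustration of the kind of smoothing such studies compare, the sketch below applies a Savitzky-Golay filter to a synthetic low-statistics gamma spectrum (a Gaussian photopeak over an exponential background); the specific filters evaluated in the paper are not reproduced here, and the window length and polynomial order are arbitrary choices that trade smoothing against peak distortion.

        # Smooth a noisy synthetic spectrum with a Savitzky-Golay filter.
        import numpy as np
        from scipy.signal import savgol_filter

        rng = np.random.default_rng(0)
        channels = np.arange(1024)
        peak = 200.0 * np.exp(-0.5 * ((channels - 400) / 8.0) ** 2)
        background = 50.0 * np.exp(-channels / 600.0)
        counts = rng.poisson(peak + background)        # short-acquisition statistics

        smoothed = savgol_filter(counts.astype(float), window_length=21, polyorder=3)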

  12. Synchrotron radiation X-ray powder diffraction techniques applied in hydrogen storage materials - A review

    Directory of Open Access Journals (Sweden)

    Honghui Cheng

    2017-02-01

    Synchrotron radiation is an advanced collimated light source with high intensity. It has particular advantages in the structural characterization of materials on the atomic or molecular scale. Synchrotron radiation X-ray powder diffraction (SR-XRPD) has been successfully applied to various areas of hydrogen storage materials. In the paper, we give a brief introduction to hydrogen storage materials, X-ray powder diffraction (XRPD), and synchrotron radiation light sources. The applications of ex situ and in situ time-resolved SR-XRPD in hydrogen storage materials are reviewed in detail. Future trends and proposals in the applications of the advanced XRPD techniques in hydrogen storage materials are also discussed.

  13. Applying the behaviour change technique (BCT) taxonomy v1: a study of coder training.

    Science.gov (United States)

    Wood, Caroline E; Richardson, Michelle; Johnston, Marie; Abraham, Charles; Francis, Jill; Hardeman, Wendy; Michie, Susan

    2015-06-01

    Behaviour Change Technique Taxonomy v1 (BCTTv1) has been used to detect active ingredients of interventions. The purpose of this study was to evaluate the effectiveness of user training in improving reliable, valid and confident application of BCTTv1 to code BCTs in intervention descriptions. One hundred sixty-one trainees (109 in workshops and 52 in group tutorials) were trained to code frequent BCTs. The following measures were taken before and after training: (i) inter-coder agreement, (ii) trainee agreement with expert consensus, (iii) confidence ratings and (iv) coding competence. Coding was assessed for 12 BCTs (workshops) and for 17 BCTs (tutorials). Trainees completed a course evaluation. Both training methods improved trainee agreement with expert consensus and increased confidence for the BCTs assessed, but did not improve inter-coder agreement (p = .08 and p = .57, respectively); the improvement varied according to BCT.

  14. An encoding technique for multiobjective evolutionary algorithms applied to power distribution system reconfiguration.

    Science.gov (United States)

    Guardado, J L; Rivas-Davalos, F; Torres, J; Maximov, S; Melgoza, E

    2014-01-01

    Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicability of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to optimize.
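
    For orientation, a minimal NSGA-II run on a standard benchmark is sketched below using the pymoo library (assuming a recent pymoo version); the edge window decoder encoding that is the paper's contribution is not reproduced, only the surrounding multiobjective machinery.

        # NSGA-II on the ZDT1 benchmark, whose true Pareto front is known.
        from pymoo.algorithms.moo.nsga2 import NSGA2
        from pymoo.optimize import minimize
        from pymoo.problems import get_problem

        problem = get_problem("zdt1")
        algorithm = NSGA2(pop_size=100)
        result = minimize(problem, algorithm, ("n_gen", 200), seed=1, verbose=False)
        print(result.F[:5])                            # a few nondominated points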

  15. Research on Key Techniques for Video Surveillance System Applied to Shipping Channel Management

    Institute of Scientific and Technical Information of China (English)

    WANG Lin; ZHUANG Yan-bin; ZHENG Cheng-zeng

    2007-01-01

    A video patrol and inspection system is an important part of the government's shipping channel information management. This system is mainly applied to video information gathering and processing as a patrol is carried out. The system described in this paper can preview, edit, and add essential explanatory messages to the collected video data. It then transfers these data and messages to a video server, from which managers and engineering and technical personnel can retrieve, play, chart, download or print them. Each department of the government will use the system's functions according to that department's mission. The system can provide an effective means for managing the shipping enterprise. It also provides a valuable reference for modernizing waterborne shipping.

  16. Applying data mining for the analysis of breast cancer data.

    Science.gov (United States)

    Liou, Der-Ming; Chang, Wei-Pin

    2015-01-01

    Data mining, also known as Knowledge Discovery in Databases (KDD), is the process of automatically searching large volumes of data for patterns. For instance, a clinical pattern might indicate that a female patient with diabetes or hypertension is more likely to suffer a stroke within five years. A physician can then learn valuable knowledge from the data mining processes. Here, we present a study focused on the application of artificial intelligence and data mining techniques to prediction models of breast cancer. An artificial neural network, a decision tree, logistic regression, and a genetic algorithm were used for the comparative studies, and the accuracy and positive predictive value of each algorithm were used as the evaluation indicators. 699 records acquired from breast cancer patients at the University of Wisconsin, nine predictor variables, and one outcome variable were incorporated for the data analysis, followed by tenfold cross-validation. The results revealed that the accuracy of the logistic regression model was 0.9434 (sensitivity 0.9716 and specificity 0.9482), of the decision tree model 0.9434 (sensitivity 0.9615, specificity 0.9105), of the neural network model 0.9502 (sensitivity 0.9628, specificity 0.9273), and of the genetic algorithm model 0.9878 (sensitivity 1, specificity 0.9802). The accuracy of the genetic algorithm was significantly higher than the average predicted accuracy of 0.9612. The predicted outcome of the logistic regression model was higher than that of the neural network model, but no significant difference was observed. The average predicted accuracy of the decision tree model was 0.9435, which was the lowest of all four predictive models. The standard deviation of the tenfold cross-validation was rather unreliable. This study indicated that the genetic algorithm model yielded better results than the other data mining models for the analysis of the data of breast cancer patients in terms of the overall accuracy of
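
    Part of this comparison is straightforward to re-create with scikit-learn, whose bundled copy of the Wisconsin breast cancer data and tenfold cross-validation mirror the setup described; the model settings below are illustrative, the genetic-algorithm variant is omitted, and accuracies will differ from those reported above.

        # Tenfold cross-validated accuracy for three of the four models.
        from sklearn.datasets import load_breast_cancer
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.tree import DecisionTreeClassifier

        X, y = load_breast_cancer(return_X_y=True)
        models = {
            "logistic regression": make_pipeline(StandardScaler(),
                                                 LogisticRegression(max_iter=1000)),
            "decision tree": DecisionTreeClassifier(random_state=0),
            "neural network": make_pipeline(StandardScaler(),
                                            MLPClassifier(max_iter=2000, random_state=0)),
        }
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=10)
            print(f"{name}: mean accuracy = {scores.mean():.4f}")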

  17. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  18. New trends in applied harmonic analysis sparse representations, compressed sensing, and multifractal analysis

    CERN Document Server

    Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula

    2016-01-01

    This volume is a selection of written notes corresponding to courses taught at the CIMPA School: "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been solved through the resolution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas and cover both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working in related topics.

  19. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others]

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  20. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    In the last decades, critical systems have increasingly been developed using computers and software, even in the space area, where the project approach is usually very conservative. In projects of rockets, satellites and their facilities, like ground support systems, simulators, and other critical operations for the space mission, a hazard analysis must be applied. The ELICERE process was created to perform hazard analysis mainly over computer critical systems, in order to define or evaluate their safety and dependability requirements, strongly based on the Hazards and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or understand the potential hazards of existing systems, improving their functions related to functional or non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. At the beginning, the process was created to operate manually in a gradual way. Nowadays, a software tool called PRO-ELICERE has been developed, to facilitate the analysis process and store the results for reuse in another system analysis. To understand how ELICERE and its tool work, a small space study case was analyzed, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  1. System Sensitivity Analysis Applied to the Conceptual Design of a Dual-Fuel Rocket SSTO

    Science.gov (United States)

    Olds, John R.

    1994-01-01

    This paper reports the results of initial efforts to apply the System Sensitivity Analysis (SSA) optimization method to the conceptual design of a single-stage-to-orbit (SSTO) launch vehicle. SSA is an efficient, calculus-based MDO technique for generating sensitivity derivatives in a highly multidisciplinary design environment. The method has been successfully applied to conceptual aircraft design and has been proven to have advantages over traditional direct optimization methods. The method is applied to the optimization of an advanced, piloted SSTO design similar to vehicles currently being analyzed by NASA as possible replacements for the Space Shuttle. Powered by a derivative of the Russian RD-701 rocket engine, the vehicle employs a combination of hydrocarbon, hydrogen, and oxygen propellants. Three primary disciplines are included in the design - propulsion, performance, and weights & sizing. A complete, converged vehicle analysis depends on the use of three standalone conceptual analysis computer codes. Efforts to minimize vehicle dry (empty) weight are reported in this paper. The problem consists of six system-level design variables and one system-level constraint. Using SSA in a 'manual' fashion to generate gradient information, six system-level iterations were performed from each of two different starting points. The results showed a good pattern of convergence for both starting points. A discussion of the advantages and disadvantages of the method, possible areas of improvement, and future work is included.

  2. X-ray fluorescence spectrometry applied to soil analysis; Espectrometria de fluorescencia de raios X aplicada as analises de solo

    Energy Technology Data Exchange (ETDEWEB)

    Salvador, Vera Lucia Ribeiro; Sato, Ivone Mulako; Scapin Junior, Wilson Santo; Scapin, Marcos Antonio; Imakima, Kengo [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil). E-mail: vsalvado@baitaca.ipen.br; imsato@net.ipen.br; kengo@sup.ipen.br]

    1997-07-01

    This paper studies X-ray fluorescence spectrometry applied to soil analysis. A comparative study of the WD-XRFS and ED-XRFS techniques was carried out using the following soil samples: SL-1, SOIL-7 and marine sediment SD-M-2/TM from the IAEA, and clay JG-1a from the Geological Survey of Japan (GSJ)

  3. New techniques for emulsion analysis in a hybrid experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodama, K. (Aichi University of Education, Kariya 448 (Japan)); Ushida, N. (Aichi University of Education, Kariya 448 (Japan)); Mokhtarani, A. (University of California (Davis), Davis, CA 95616 (United States)); Paolone, V.S. (University of California (Davis), Davis, CA 95616 (United States)); Volk, J.T. (University of California (Davis), Davis, CA 95616 (United States)); Wilcox, J.O. (University of California (Davis), Davis, CA 95616 (United States)); Yager, P.M. (University of California (Davis), Davis, CA 95616 (United States)); Edelstein, R.M. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Freyberger, A.P. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Gibaut, D.B. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Lipton, R.J. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Nichols, W.R. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Potter, D.M. (Carnegie-Mellon Univers

    1994-08-01

    A new method, called graphic scanning, was developed by the Nagoya University Group for emulsion analysis in a hybrid experiment. This method enhances both speed and reliability of emulsion analysis. Details of the application of this technique to the analysis of Fermilab experiment E653 are described. (orig.)

  4. Comparison Study of Different Lossy Compression Techniques Applied on Digital Mammogram Images

    Directory of Open Access Journals (Sweden)

    Ayman AbuBaker

    2016-12-01

    The huge growth in the usage of the internet increases the need to transfer and save multimedia files. Mammogram images are among these files, having large image sizes with high resolution. Compression of these images is used to reduce file size without degrading the quality, especially in the suspicious regions of the mammogram images. Reducing the size of these images gives more room to store more images and minimizes the cost of transmission when exchanging information between radiologists. Many techniques exist in the literature to address the loss of information in images. In this paper, two types of compression transformations are used: Singular Value Decomposition (SVD), which transforms the image into a series of eigenvectors that depend on the dimensions of the image, and the Discrete Cosine Transform (DCT), which converts the image from the spatial domain into the frequency domain. In this paper, a Computer Aided Diagnosis (CAD) system is implemented to evaluate the microcalcification appearance in mammogram images after applying the two compression transformations. The performance of both transformations, SVD and DCT, is subjectively compared by a radiologist. As a result, the DCT algorithm can effectively reduce the size of the mammogram images by 65% with high-quality appearance of the microcalcification regions.
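
    The SVD half of the comparison keeps only the k largest singular values of the image, reducing storage while preserving most of the image energy. A minimal sketch, with a synthetic grayscale array standing in for a real mammogram:

        # Rank-k reconstruction of an image from its truncated SVD.
        import numpy as np

        rng = np.random.default_rng(0)
        image = rng.random((256, 256))                 # placeholder image

        U, s, Vt = np.linalg.svd(image, full_matrices=False)
        k = 40                                         # singular values kept
        approx = (U[:, :k] * s[:k]) @ Vt[:k, :]

        # Storage ratio: k*(m + n + 1) numbers versus m*n for the original.
        m, n = image.shape
        print(k * (m + n + 1) / (m * n))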

  5. Experimental studies of active and passive flow control techniques applied in a twin air-intake.

    Science.gov (United States)

    Paul, Akshoy Ranjan; Joshi, Shrey; Jindal, Aman; Maurya, Shivam P; Jain, Anuj

    2013-01-01

    The flow control in twin air-intakes is necessary to improve the performance characteristics, since the flow traveling through curved and diffused paths becomes complex, especially after merging. The paper presents a comparison between two well-known techniques of flow control: active and passive. It presents an effective design of a vortex generator jet (VGJ) and a vane-type passive vortex generator (VG) and uses them in a twin air-intake duct in different combinations to establish their effectiveness in improving the performance characteristics. The VGJ is designed to insert flow from the side wall at pitch angles of 90 degrees and 45 degrees. Corotating (parallel) and counterrotating (V-shape) are the configurations of the vane-type VG. It is observed that the VGJ has the potential to change the flow pattern drastically as compared to the vane-type VG. When the VGJ is directed perpendicular to the side walls of the air-intake at a pitch angle of 90 degrees, static pressure recovery is increased by 7.8% and total pressure loss is reduced by 40.7%, which is the best among all other cases tested for the VGJ. For the bigger-sized VG attached to the side walls of the air-intake, static pressure recovery is increased by 5.3%, but total pressure loss is reduced by only 4.5% as compared to all other cases of the VG.

  6. Modern Chemistry Techniques Applied to Metal Behavior and Chelation in Medical and Environmental Systems - Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Sutton, M; Andresen, B; Burastero, S R; Chiarappa-Zucca, M L; Chinn, S C; Coronado, P R; Gash, A E; Perkins, J; Sawvel, A M; Szechenyi, S C

    2005-02-03

    This report details the research and findings generated over the course of a 3-year research project funded by Lawrence Livermore National Laboratory (LLNL) Laboratory Directed Research and Development (LDRD). Originally tasked with studying beryllium chemistry and chelation for the treatment of Chronic Beryllium Disease and the environmental remediation of beryllium-contaminated environments, this work has yielded results on beryllium and uranium solubility and speciation associated with toxicology; specific and effective chelation agents for beryllium, capable of lowering beryllium tissue burden and increasing urinary excretion in mice, and dissolution of beryllium contamination at LLNL Site 300; 9Be NMR studies not previously performed at LLNL; secondary ion mass spectrometry (SIMS) imaging of beryllium in spleen and lung tissue; and beryllium interactions with aerogel/GAC material for environmental cleanup. The results show that chelator development using modern chemical techniques, such as chemical thermodynamic modeling, was successful in identifying and utilizing tried and tested beryllium chelators for use in medical and environmental scenarios. Additionally, a study of uranium speciation in simulated biological fluids identified uranium species present in urine, gastric juice, pancreatic fluid, airway surface fluid, simulated lung fluid, bile, saliva, plasma, interstitial fluid and intracellular fluid.

  7. Machine Learning Techniques Applied to Sensor Data Correction in Building Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Matt K [ORNL]; Castello, Charles C [ORNL]; New, Joshua Ryan [ORNL]

    2013-01-01

    Since commercial and residential buildings account for nearly half of the United States' energy consumption, making them more energy-efficient is a vital part of the nation's overall energy strategy. Sensors play an important role in this research by collecting data needed to analyze the performance of components, systems, and whole buildings. Given this reliance on sensors, ensuring that sensor data are valid is a crucial problem. The solutions being researched are machine learning techniques, namely artificial neural networks and Bayesian networks. The types of data investigated in this study are: (1) temperature; (2) humidity; (3) refrigerator energy consumption; (4) heat pump liquid pressure; and (5) water flow. These data are taken from Oak Ridge National Laboratory's (ORNL) ZEBRAlliance research project, which is composed of four single-family homes in Oak Ridge, TN. Results show that for the temperature, humidity, pressure, and flow sensors, data can mostly be predicted with a root-mean-square error (RMSE) of less than 10% of the respective sensor's mean value. Results for the energy sensor are not as good; RMSE values are centered about 100% of the mean value and are often well above 200%. Bayesian networks have RMSE of less than 5% of the respective sensor's mean value, but took substantially longer to train.
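
    A minimal sketch of the neural-network half of the study on synthetic stand-in data: predict one sensor from correlated neighbours and report the RMSE as a fraction of the sensor's mean value, the metric quoted above. The linear relationship and noise level below are invented, not the ZEBRAlliance measurements.

        # Predict a sensor from neighbouring sensors with a small MLP.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        neighbours = rng.normal(size=(2000, 3))        # e.g. nearby temperatures
        target = 20.0 + neighbours @ np.array([0.5, 0.3, 0.2]) \
                 + rng.normal(0.0, 0.05, 2000)         # sensor to be validated

        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        model.fit(neighbours[:1500], target[:1500])

        pred = model.predict(neighbours[1500:])
        rmse = np.sqrt(np.mean((pred - target[1500:]) ** 2))
        print(rmse / target.mean())                    # RMSE as fraction of mean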

  8. Applying stereotactic injection technique to study genetic effects on animal behaviors.

    Science.gov (United States)

    McSweeney, Colleen; Mao, Yingwei

    2015-05-10

    Stereotactic injection is a useful technique to deliver high titer lentiviruses to targeted brain areas in mice. Lentiviruses can either overexpress or knock down gene expression in a relatively focused region without significant damage to the brain tissue. After recovery, the injected mouse can be tested on various behavioral tasks such as the Open Field Test (OFT) and the Forced Swim Test (FST). The OFT is designed to assess locomotion and the anxious phenotype in mice by measuring the amount of time that a mouse spends in the center of a novel open field. A more anxious mouse will spend significantly less time in the center of the novel field compared to controls. The FST assesses the anti-depressive phenotype by quantifying the amount of time that mice spend immobile when placed into a bucket of water. A mouse with an anti-depressive phenotype will spend significantly less time immobile compared to control animals. The goal of this protocol is to use the stereotactic injection of a lentivirus in conjunction with behavioral tests to assess how genetic factors modulate animal behaviors.

  9. Experimental Studies of Active and Passive Flow Control Techniques Applied in a Twin Air-Intake

    Directory of Open Access Journals (Sweden)

    Akshoy Ranjan Paul

    2013-01-01

    The flow control in twin air-intakes is necessary to improve the performance characteristics, since the flow traveling through curved and diffused paths becomes complex, especially after merging. The paper presents a comparison between two well-known techniques of flow control: active and passive. It presents an effective design of a vortex generator jet (VGJ) and a vane-type passive vortex generator (VG) and uses them in a twin air-intake duct in different combinations to establish their effectiveness in improving the performance characteristics. The VGJ is designed to insert flow from the side wall at pitch angles of 90 degrees and 45 degrees. Corotating (parallel) and counterrotating (V-shape) are the configurations of the vane-type VG. It is observed that the VGJ has the potential to change the flow pattern drastically as compared to the vane-type VG. When the VGJ is directed perpendicular to the side walls of the air-intake at a pitch angle of 90 degrees, static pressure recovery is increased by 7.8% and total pressure loss is reduced by 40.7%, which is the best among all other cases tested for the VGJ. For the bigger-sized VG attached to the side walls of the air-intake, static pressure recovery is increased by 5.3%, but total pressure loss is reduced by only 4.5% as compared to all other cases of the VG.

  10. Blade Displacement Measurement Technique Applied to a Full-Scale Rotor Test

    Science.gov (United States)

    Abrego, Anita I.; Olson, Lawrence E.; Romander, Ethan A.; Barrows, Danny A.; Burner, Alpheus W.

    2012-01-01

    Blade displacement measurements using multi-camera photogrammetry were acquired during the full-scale wind tunnel test of the UH-60A Airloads rotor, conducted in the National Full-Scale Aerodynamics Complex 40- by 80-Foot Wind Tunnel. The objectives were to measure the blade displacement and deformation of the four rotor blades as they rotated through the entire rotor azimuth. These measurements are expected to provide a unique dataset to aid in the development and validation of rotorcraft prediction techniques. They are used to resolve the blade shape and position, including pitch, flap, lag and elastic deformation. Photogrammetric data encompass advance ratios from 0.15 to slowed rotor simulations of 1.0, thrust coefficient to rotor solidity ratios from 0.01 to 0.13, and rotor shaft angles from -10.0 to 8.0 degrees. An overview of the blade displacement measurement methodology and system development, descriptions of image processing, uncertainty considerations, preliminary results covering static and moderate advance ratio test conditions and future considerations are presented. Comparisons of experimental and computational results for a moderate advance ratio forward flight condition show good trend agreements, but also indicate significant mean discrepancies in lag and elastic twist. Blade displacement pitch measurements agree well with both the wind tunnel commanded and measured values.

  11. Applying satellite remote sensing technique in disastrous rainfall systems around Taiwan

    Science.gov (United States)

    Liu, Gin-Rong; Chen, Kwan-Ru; Kuo, Tsung-Hua; Liu, Chian-Yi; Lin, Tang-Huang; Chen, Liang-De

    2016-05-01

    Many people in Asian regions suffer from disastrous rainfall year after year. The rainfall from typhoons or tropical cyclones (TCs) is one of their key water supply sources, but from another perspective such TCs may also bring unexpected heavy rainfall, thereby causing flash floods, mudslides or other disasters. So far we cannot stop or change a TC's route or intensity with present techniques. However, we could significantly mitigate the possible heavy casualties and economic losses if we could detect a TC's formation earlier and estimate its rainfall amount and distribution more accurately before its landfall. In light of these problems, this short article presents methods to detect a TC's formation earlier and to delineate its rainfall potential pattern more accurately in advance. For the first part, satellite-retrieved air-sea parameters are obtained and used to estimate the thermal and dynamic energy fields and their variation over open oceans, in order to delineate the ocean areas and cloud clusters with a high possibility of typhoon occurrence. For the second part, an improved tropical rainfall potential (TRaP) model is proposed, with better assumptions than the original TRaP for TC rainfall band rotations, rainfall amount estimation, and topographic effect correction, to obtain more accurate TC rainfall distributions, especially for hilly and mountainous areas such as Taiwan.

  12. Dosimetric properties of bio minerals applied to high-dose dosimetry using the TSEE technique

    Energy Technology Data Exchange (ETDEWEB)

    Vila, G. B.; Caldas, L. V. E., E-mail: gbvila@ipen.br [Instituto de Pesquisas Energeticas e Nucleares / CNEN, Av. Lineu Prestes 2242, Cidade Universitaria, 05508-000 Sao Paulo (Brazil)]

    2014-08-15

    The study of dosimetric properties such as reproducibility, residual signal, lower detection dose, dose-response curve and fading of the thermally stimulated exoelectron emission (TSEE) signal of Brazilian bio minerals has shown that these materials present potential for use as radiation dosimeters. The reproducibility within ± 10% for oyster shell, mother-of-pearl and coral reef samples showed that the signal dispersion is small when compared with the mean value of the measurements. The study showed that the residual signal can be eliminated with a thermal treatment at 300 °C/1 h. The lower detection dose of 9.8 Gy determined for the oyster shell samples when exposed to beta radiation, and of 1.6 Gy for oyster shell and mother-of-pearl samples when exposed to gamma radiation, can be considered good, taking into account the high doses of this study. The materials presented linearity in the dose-response curves in some ranges, but the lack of linearity in other cases presents no problem, since a good mathematical description is possible. The fading study showed that the loss of TSEE signal can be minimized if the samples are protected from interferences such as light, heat and humidity. Taking into account the useful linearity range as the main dosimetric characteristic, the tiger shell and oyster shell samples are the most suitable for high-dose dosimetry using the TSEE technique. (Author)

  13. From birds to bees: applying video observation techniques to invertebrate pollinators

    Directory of Open Access Journals (Sweden)

    C J Lortie

    2012-01-01

    Observation is a critical element of behavioural ecology and ethology. Here, we propose a similar set of techniques to enhance the study of the diversity patterns of invertebrate pollinators and associated plant species. In a body of avian research, cameras are set up on nests in blinds to examine chick and parent interactions. This avoids observer bias, minimizes interference, and provides numerous other benefits, including timestamps, the capacity to record frequency and duration of activities, and a permanent archive of activity for later analyses. Hence, we propose that small video cameras in blinds can also be used to continuously monitor pollinator activity on plants, thereby capitalizing on those same benefits. This method was proofed in 2010 in the alpine in BC, Canada, on target focal plant species and on open mixed assemblages of plant species. Apple iPod nanos successfully recorded activity for an entire day at a time, totalling 450 hours, and provided sufficient resolution and field of view to both identify pollinators to recognizable taxonomic units and monitor movement and visitation rates at a scale of view of approximately 50 cm². This method is not a replacement for pan traps or sweep nets but an opportunity to enhance these datasets with more detailed, finer-resolution data. Importantly, the test of this specific method also indicates that far more hours of observation - using any method - are likely required than in most current ecological studies published, to accurately estimate pollinator diversity.

  14. A non-intrusive measurement technique applying CARS for concentration measurement in a gas mixing flow

    CERN Document Server

    Yamamoto, Ken; Moriya, Madoka; Kuriyama, Reiko; Sato, Yohei

    2015-01-01

    A coherent anti-Stokes Raman scattering (CARS) microscope system was built and applied to a non-intrusive gas concentration measurement of a mixing flow in a millimeter-scale channel. Carbon dioxide and nitrogen were chosen as test fluids, and CARS signals from the fluids were generated by adjusting the wavelengths of the Pump and the Stokes beams. The generated CARS signals, whose wavelengths differ from those of the Pump and the Stokes beams, were captured by an EM-CCD camera after filtering out the excitation beams. A calibration experiment was performed in order to confirm the applicability of the built-up CARS system by measuring the intensity of the CARS signal from known concentrations of the samples. After confirming that the measured CARS intensity was proportional to the second power of the concentrations, as was theoretically predicted, the CARS intensities in the gas mixing flow channel were measured. Ten different measurement points were set, and concentrations of both carbon dioxide and nitrog...
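
    The calibration step described above amounts to fitting a quadratic between known concentrations and measured CARS intensity, then inverting the fit for unknown samples. A sketch with invented numbers:

        # Quadratic CARS calibration: intensity ~ concentration**2.
        import numpy as np

        concentration = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])  # mole fraction
        intensity = 950.0 * concentration**2 + np.array([0, 2, -3, 4, -2, 1])

        coeffs = np.polyfit(concentration, intensity, deg=2)
        print(coeffs[0])                               # quadratic term, ~950

        # Read an unknown concentration off a new intensity measurement.
        c_grid = np.linspace(0.0, 1.0, 501)
        measured = 430.0
        print(c_grid[np.argmin(np.abs(np.polyval(coeffs, c_grid) - measured))])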

  15. Muscle stiffness estimation using a system identification technique applied to evoked mechanomyogram during cycling exercise.

    Science.gov (United States)

    Uchiyama, Takanori; Saito, Kaito; Shinjo, Katsuya

    2015-12-01

    The aims of this study were to develop a method to extract the evoked mechanomyogram (MMG) during cycling exercise and to clarify muscle stiffness at various cadences, workloads, and power. Ten young healthy male participants were instructed to pedal a cycle ergometer at cadences of 40 and 60 rpm. The loads were 4.9, 9.8, 14.7, and 19.6 N, respectively. One electrical stimulus per two pedal rotations was applied to the vastus lateralis muscle at a knee angle of 80° in the down phase. MMGs were measured using a capacitor microphone, and the MMGs were divided into stimulated and non-stimulated sequences. Each sequence was synchronously averaged. The synchronously averaged non-stimulated MMG was subtracted from the synchronously averaged stimulated MMG to extract an evoked MMG. The evoked MMG system was identified and the poles of the transfer function were calculated. The poles and mass of the vastus lateralis muscle were used to estimate muscle stiffness. Results showed that muscle stiffness was 186-626 N/m and proportional to the workloads and power. In conclusion, our method can be used to assess muscle stiffness proportional to the workload and power.
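
    The stiffness estimate follows from identifying a second-order muscle model m*x'' + c*x' + k*x = F: a complex pole pair p of the identified transfer function satisfies |p|**2 = k/m, so k = m*|p|**2. The pole value and effective muscle mass below are invented, chosen only to land in the reported range.

        # Muscle stiffness from an identified continuous-time pole pair.
        pole = complex(-12.0, 45.0)    # identified pole (rad/s), assumption
        muscle_mass = 0.25             # effective muscle mass (kg), assumption

        stiffness = muscle_mass * abs(pole) ** 2       # k = m * |p|**2
        print(stiffness)               # ~542 N/m, within the reported 186-626 N/m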

  16. Can Artificial Neural Networks be Applied in Seismic Prediction? Preliminary Analysis Applying Radial Topology. Case: Mexico

    CERN Document Server

    Mota-Hernandez, Cinthya; Alvarado-Corona, Rafael

    2014-01-01

    Tectonic earthquakes of high magnitude can cause considerable losses in terms of human lives, economy and infrastructure, among others. According to an evaluation published by the U.S. Geological Survey, 30 earthquakes have greatly impacted Mexico from the end of the XIX century to this one. Based upon data from the National Seismological Service, in the period between January 1, 2006 and May 1, 2013 there occurred 5,826 earthquakes whose magnitude was greater than 4.0 on the Richter scale (25.54% of the total of earthquakes registered in the national territory), with the Pacific Plate and the Cocos Plate being the most important ones. This document describes the development of an Artificial Neural Network (ANN) based on radial topology which seeks to generate a prediction with an error margin lower than 20% that can inform about the probability of a future earthquake. One of the main questions is: can artificial neural networks be applied in seismic forecast...

  17. A synchronized particle image velocimetry and infrared thermography technique applied to an acoustic streaming flow

    Energy Technology Data Exchange (ETDEWEB)

    Sou, In Mei; Ray, Chittaranjan [University of Hawaii at Manoa, Department of Civil and Environmental Engineering, Honolulu, HI (United States); Allen, John S.; Layman, Christopher N. [University of Hawaii at Manoa, Department of Mechanical Engineering, Honolulu, HI (United States)

    2011-11-15

    Subsurface coherent structures and surface temperatures are investigated using simultaneous measurements of particle image velocimetry (PIV) and infrared (IR) thermography. Results for coherent structures from acoustic streaming and the associated heat transfer in a rectangular tank with an acoustic horn mounted horizontally at the sidewall are presented. An observed vortex pair develops and propagates in the direction along the centerline of the horn. From the PIV velocity field data, distinct kinematic regions are found with the Lagrangian coherent structure (LCS) method. The implications of this analysis with respect to heat transfer and related sonochemical applications are discussed. (orig.)

  18. Shopping For Danger: E-commerce techniques applied to collaboration in cyber security

    Energy Technology Data Exchange (ETDEWEB)

    Bruce, Joseph R.; Fink, Glenn A.

    2012-05-24

    Collaboration among cyber security analysts is essential to a successful protection strategy on the Internet today, but it is uncommonly practiced or encouraged in operating environments. Barriers to productive collaboration often include data sensitivity, the time and effort to communicate, institutional policy, and protection of domain knowledge. We propose an ambient collaboration framework, Vulcan, designed to remove the barriers of time and effort and to mitigate the others. Vulcan automates data collection, collaborative filtering, and asynchronous dissemination, eliminating the effort implied by explicit collaboration among peers. We instrumented two analytic applications and performed a mock analysis session to build a dataset and test the output of the system.

  19. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication
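
    The spectrographic measures described (duration, amplitude, energy across the 0 to 8000 Hz band) can be reproduced with standard signal-processing tools. A sketch on a synthetic noise burst standing in for a chew sound; the sampling rate, window sizes and threshold are assumptions:

        import numpy as np
        from scipy.signal import spectrogram

        fs = 16000  # Hz, assumed; gives the 0-8000 Hz band reported
        t = np.arange(0, 0.5, 1 / fs)

        # Synthetic stand-in for one chew: a decaying noise burst,
        # since the original recordings are not available.
        rng = np.random.default_rng(0)
        burst = rng.normal(size=t.size) * np.exp(-((t - 0.1) / 0.02) ** 2)

        f, seg_t, Sxx = spectrogram(burst, fs=fs, nperseg=256, noverlap=192)

        # Two of the measures used in the study: duration and amplitude.
        energy = Sxx.sum(axis=0)
        active = seg_t[energy > 0.1 * energy.max()]
        print("burst duration (s):", active.ptp())
        print("peak band energy:", energy.max())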

  20. An evaluation of directional analysis techniques for multidirectional, partially reflected waves .1. numerical investigations

    DEFF Research Database (Denmark)

    Ilic, C; Chadwick, A; Helm-Petersen, Jacob

    2000-01-01

    Recent studies of advanced directional analysis techniques have mainly centred on incident wave fields. In the study of coastal structures, however, partially reflective wave fields are commonly present. In the near structure field, phase locked methods can be successfully applied. In the far fie...

  1. [THE COMPARATIVE ANALYSIS OF TECHNIQUES OF IDENTIFICATION OF CORYNEBACTERIUM NON DIPHTHERIAE].

    Science.gov (United States)

    Kharseeva, G G; Voronina, N A; Mironov, A Yu; Alutina, E L

    2015-12-01

    A comparative analysis was carried out of the effectiveness of three techniques for the identification of Corynebacterium non diphtheriae: bacteriological, molecular genetic (16S rRNA sequencing) and mass-spectrometric (MALDI-ToF MS). The analysis covered 49 strains of Corynebacterium non diphtheriae (C. pseudodiphtheriticum, C. amycolatum, C. propinquum, C. falsenii) and 2 strains of Corynebacterium diphtheriae isolated under various pathologies from the urogenital tract and upper respiratory ways. The corynebacteria were identified using the bacteriological technique, 16S rRNA sequencing and the mass-spectrometric technique (MALDI-ToF MS). Full concordance of species identification across the three techniques was observed for 26 (51%) strains of Corynebacterium non diphtheriae; concordance was observed for 43 (84.3%) strains when comparing the bacteriological technique with 16S rRNA sequencing, and for 29 (57%) strains when comparing mass-spectrometric analysis with 16S rRNA sequencing. The bacteriological technique is effective for identification of Corynebacterium diphtheriae. For precise establishment of the species of corynebacteria with variable biochemical characteristics, the molecular genetic technique should be applied. The mass-spectrometric technique (MALDI-ToF MS) requires further updating of its databases to identify a larger spectrum of representatives of the genus Corynebacterium.

  2. A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems

    Science.gov (United States)

    Abtahi, Amir-Reza; Bijari, Afsane

    2016-09-01

    In this paper, a hybrid meta-heuristic algorithm, based on imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The proposed hybrid algorithm inherits the advantages of the process of harmony creation in HS algorithm to improve the exploitation phase of the ICA algorithm. In addition, the proposed hybrid algorithm uses SA to make a balance between exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including genetic algorithm (GA), HS, and ICA on several well-known benchmark instances. The comprehensive experiments and statistical analysis on standard benchmark functions certify the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising and can be used in several real-life engineering and management problems.

  3. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences in cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems-level.
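
    The core PoDA hypothesis, greater within-class than across-class similarity on a pathway's SNPs, can be expressed in a few lines. A toy sketch on random genotypes; the mismatch distance and the data are illustrative simplifications of the published statistic (self-pairs are not excluded, which a careful implementation would do):

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy genotype matrix: rows = samples, columns = SNPs mapped to
        # one pathway, coded 0/1/2; labels: 1 = case, 0 = control.
        # Random data, so no real signal is expected here.
        G = rng.integers(0, 3, size=(100, 40))
        labels = rng.integers(0, 2, size=100)

        def mean_distance(A, B):
            # Mean fraction of mismatched genotypes between two groups.
            return np.mean([np.mean(a != b) for a in A for b in B])

        cases, controls = G[labels == 1], G[labels == 0]
        within = 0.5 * (mean_distance(cases, cases)
                        + mean_distance(controls, controls))
        across = mean_distance(cases, controls)

        # A pathway "distinguishes" when samples sit closer to their own
        # class than to the other, i.e. across - within > 0.
        print("within:", within, "across:", across, "signal:", across - within)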

  4. APPLIED PHYTO-REMEDIATION TECHNIQUES USING HALOPHYTES FOR OIL AND BRINE SPILL SCARS

    Energy Technology Data Exchange (ETDEWEB)

    M.L. Korphage; Bruce G. Langhus; Scott Campbell

    2003-03-01

    Produced salt water from historical oil and gas production was often managed with inadequate care and unfortunate consequences. In Kansas, the production practices in the 1930's and 1940's--before statewide anti-pollution laws--were such that fluids were often produced to surface impoundments where the oil would segregate from the salt water. The oil was pumped off the pits and the salt water was able to infiltrate into the subsurface soil zones and underlying bedrock. Over the years, oil producing practices were changed so that segregation of fluids was accomplished in steel tanks and salt water was isolated from the natural environment. But before that could happen, significant areas of the state were scarred by salt water. These areas are now in need of economical remediation. Remediation of salt scarred land can be facilitated with soil amendments, land management, and selection of appropriate salt tolerant plants. Current research on the salt scars around the old Leon Waterflood, in Butler County, Kansas show the relative efficiency of remediation options. Based upon these research findings, it is possible to recommend cost efficient remediation techniques for slight, medium, and heavy salt water damaged soil. Slight salt damage includes soils with Electrical Conductivity (EC) values of 4.0 mS/cm or less. Operators can treat these soils with sufficient amounts of gypsum, install irrigation systems, and till the soil. Appropriate plants can be introduced via transplants or seeded. Medium salt damage includes soils with EC values between 4.0 and 16 mS/cm. Operators will add amendments of gypsum, till the soil, and arrange for irrigation. Some particularly salt tolerant plants can be added but most planting ought to be reserved until the second season of remediation. Severe salt damage includes soil with EC values in excess of 16 mS/cm. Operators will add at least part of the gypsum required, till the soil, and arrange for irrigation. The following
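
    The tiered recommendations above map naturally onto a small decision helper. A sketch following the quoted EC thresholds; the action strings abridge the text and are not a complete protocol:

        def remediation_plan(ec_ms_per_cm):
            """Map soil electrical conductivity (mS/cm) to the treatment
            tiers described above; action strings abridge the text."""
            if ec_ms_per_cm <= 4.0:
                return "slight: gypsum, irrigation, tillage; plant right away"
            elif ec_ms_per_cm <= 16.0:
                return "medium: gypsum, tillage, irrigation; defer most planting"
            else:
                return "severe: partial gypsum, tillage, irrigation; staged work"

        for ec in (2.5, 8.0, 20.0):
            print(ec, "mS/cm ->", remediation_plan(ec))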

  5. Statistical Mechanics Ideas and Techniques Applied to Selected Problems in Ecology

    Directory of Open Access Journals (Sweden)

    Hugo Fort

    2013-11-01

    Full Text Available Ecosystem dynamics provides an interesting arena for the application of a plethora of concepts and techniques from statistical mechanics. Here I review three examples, each corresponding to an important problem in ecology. First, I start with an analytical derivation of the clumpy patterns for species relative abundances (SRA) empirically observed in several ecological communities involving a high number n of species, a phenomenon which has puzzled ecologists for decades. An interesting point is that this derivation uses results obtained from a statistical mechanics model for ferromagnets. Second, going beyond the mean-field approximation, I study the spatial version of a popular ecological model involving just one species representing vegetation. The goal is to address the phenomenon of catastrophic shifts, in which gradual cumulative variations in some control parameter suddenly lead to an abrupt change in the system, illustrating it by means of the process of desertification of arid lands. The focus is on the aggregation processes and the effects of diffusion that, combined, lead to the formation of nontrivial spatial vegetation patterns. It is shown that different quantities, like the variance, the two-point correlation function and the patchiness, may serve as early warnings for the desertification of arid lands. Remarkably, at the onset of a desertification transition the distribution of vegetation patches exhibits the scale invariance typical of many physical systems in the vicinity of a phase transition. I comment on similarities of and differences between these catastrophic shifts and paradigmatic thermodynamic phase transitions like the liquid-vapor change of state of a fluid. Third, I analyze the case of many species interacting in space. I choose tropical forests, which are mega-diverse ecosystems that exhibit remarkable dynamics. Therefore these ecosystems represent a research paradigm both for studies of complex systems dynamics as well as to

  6. Lipase immobilized by different techniques on various support materials applied in oil hydrolysis

    Directory of Open Access Journals (Sweden)

    VILMA MINOVSKA

    2005-04-01

    Full Text Available Batch hydrolysis of olive oil was performed by Candida rugosa lipase immobilized on Amberlite IRC-50 and Al2O3. These two supports were selected out of 16 carriers: inorganic materials (sand, silica gel, infusorial earth, Al2O3), inorganic salts (CaCO3, CaSO4), ion-exchange resins (Amberlite IRC-50 and IR-4B, Dowex 2X8), a natural resin (colophony), a natural biopolymer (sodium alginate), synthetic polymers (polypropylene, polyethylene) and zeolites. Lipase immobilization was carried out by simple adsorption, adsorption followed by cross-linking, adsorption on ion-exchange resins, combined adsorption and precipitation, pure precipitation and gel entrapment. The suitability of the supports and techniques for the immobilization of lipase was evaluated by estimating the enzyme activity, protein loading, immobilization efficiency and reusability of the immobilizates. Most of the immobilizates exhibited either a low enzyme activity or difficulties during the hydrolytic reaction. Only those prepared by ionic adsorption on Amberlite IRC-50 and by combined adsorption and precipitation on Al2O3 showed better activity, 2000 and 430 U/g support, respectively, and demonstrated satisfactory behavior when used repeatedly. The hydrolysis was studied as a function of several parameters: surfactant concentration, enzyme concentration, pH and temperature. The immobilized preparation with Amberlite IRC-50 was stable and active in the whole range of pH (4 to 9) and temperature (20 to 50 °C), demonstrating a 99% degree of hydrolysis. In repeated usage, it was stable and active, having a half-life of 16 batches, which corresponds to an operation time of 384 h. Its storage stability was remarkable too, since after 9 months it had lost only 25% of the initial activity. The immobilizate with Al2O3 was less stable and less active. At optimal environmental conditions, the degree of hydrolysis did not exceed 79%. In repeated usage, after the fourth batch, the degree of

  7. Coding technique with progressive reconstruction based on VQ and entropy coding applied to medical images

    Science.gov (United States)

    Martin-Fernandez, Marcos; Alberola-Lopez, Carlos; Guerrero-Rodriguez, David; Ruiz-Alzola, Juan

    2000-12-01

    In this paper we propose a novel lossless coding scheme for medical images that allows the final user to switch between a lossy and a lossless mode. This is done by means of a progressive reconstruction philosophy (which can be interrupted at will), so we believe that our scheme gives a way to trade off between the accuracy needed for medical diagnosis and the information reduction needed for storage and transmission. We combine vector quantization, run-length bit plane and entropy coding. Specifically, the first step is a vector quantization procedure; the centroid codes are Huffman-coded making use of a set of probabilities that are calculated in the learning phase. The image is reconstructed at the coder in order to obtain the error image; this second image is divided in bit planes, which are then run-length and Huffman coded. A second statistical analysis is performed during the learning phase to obtain the parameters needed in this final stage. Our coder is currently trained for hand radiographs and fetal echographies. We compare our results for these two types of images to classical results on bit plane coding and the JPEG standard. Our coder turns out to outperform both of them.
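
    The error-image stage, splitting the residual into bit planes and run-length coding each plane, is compact to express. A sketch of those two steps only; the Huffman stage and the trained codebooks are omitted, and a random residual stands in for a real error image:

        import numpy as np

        def bit_planes(err_img):
            """Split an 8-bit error image into 8 binary planes (LSB first)."""
            return [(err_img >> b) & 1 for b in range(8)]

        def run_lengths(plane):
            """Run-length code one bit plane scanned in raster order."""
            flat = plane.ravel()
            change = np.flatnonzero(np.diff(flat)) + 1
            starts = np.concatenate(([0], change))
            ends = np.concatenate((change, [flat.size]))
            return list(zip(flat[starts], ends - starts))  # (bit, run length)

        err = np.random.default_rng(2).integers(0, 8, (16, 16), dtype=np.uint8)
        planes = bit_planes(err)
        print(len(run_lengths(planes[0])), "runs in the LSB plane")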

  8. Moving beyond regression techniques in cardiovascular risk prediction: applying machine learning to address analytic challenges.

    Science.gov (United States)

    Goldstein, Benjamin A; Navar, Ann Marie; Carter, Rickey E

    2016-07-19

    Risk prediction plays an important role in clinical cardiology research. Traditionally, most risk models have been based on regression models. While useful and robust, these statistical methods are limited to using a small number of predictors which operate in the same way on everyone, and uniformly throughout their range. The purpose of this review is to illustrate the use of machine-learning methods for the development of risk prediction models. Typically presented as black box approaches, most machine-learning methods are aimed at solving particular challenges that arise in data analysis that are not well addressed by typical regression approaches. To illustrate these challenges, as well as how different methods can address them, we consider predicting mortality after diagnosis of acute myocardial infarction. We use data derived from our institution's electronic health record and abstract data on 13 regularly measured laboratory markers. We walk through different challenges that arise in modelling these data and then introduce different machine-learning approaches. Finally, we discuss general issues in the application of machine-learning methods including tuning parameters, loss functions, variable importance, and missing data. Overall, this review serves as an introduction for those working on risk modelling to approach the diffuse field of machine learning.
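
    As a concrete instance of the kind of model the review discusses, the sketch below fits a random forest to synthetic stand-ins for the 13 laboratory markers; the data, labels and hyperparameters are assumptions, not the study's:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)

        # Synthetic stand-in for 13 laboratory markers and a mortality
        # label; the real EHR-derived data are not reproduced here.
        X = rng.normal(size=(1000, 13))
        y = (X[:, 0] - 0.8 * X[:, 5]
             + rng.normal(scale=0.5, size=1000) > 1).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # Random forests relax the assumptions the review criticises:
        # predictors may interact and act non-uniformly over their range.
        model = RandomForestClassifier(n_estimators=200,
                                       random_state=0).fit(X_tr, y_tr)
        print("held-out accuracy:", model.score(X_te, y_te))
        print("top importances:", model.feature_importances_.round(2)[:5])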

  9. Multivariate Cross-Classification: Applying machine learning techniques to characterize abstraction in neural representations

    Directory of Open Access Journals (Sweden)

    Jonas eKaplan

    2015-03-01

    Full Text Available Here we highlight an emerging trend in the use of machine learning classifiers to test for abstraction across patterns of neural activity. When a classifier algorithm is trained on data from one cognitive context, and tested on data from another, conclusions can be drawn about the role of a given brain region in representing information that abstracts across those cognitive contexts. We call this kind of analysis Multivariate Cross-Classification (MVCC), and review several domains where it has recently made an impact. MVCC has been important in establishing correspondences among neural patterns across cognitive domains, including motor-perception matching and cross-sensory matching. It has been used to test for similarity between neural patterns evoked by perception and those generated from memory. Other work has used MVCC to investigate the similarity of representations for semantic categories across different kinds of stimulus presentation, and in the presence of different cognitive demands. We use these examples to demonstrate the power of MVCC as a tool for investigating neural abstraction and discuss some important methodological issues related to its application.
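
    The MVCC logic, train a classifier in one cognitive context and test it in another, reduces to a few lines. A sketch on synthetic voxel patterns that share an embedded signal across contexts; the dimensions, classifier and effect size are illustrative assumptions:

        import numpy as np
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(4)

        # Shared "abstract" signal embedded in two cognitive contexts,
        # e.g. perception and memory; all data here are synthetic.
        signal = rng.normal(size=50)

        def make_context(n):
            labels = rng.integers(0, 2, n)
            X = rng.normal(size=(n, 50)) + np.outer(2 * labels - 1, signal)
            return X, labels

        X_train, y_train = make_context(80)  # training context
        X_test, y_test = make_context(80)    # held-out context

        # Train in one context, test in the other: above-chance transfer
        # is the MVCC evidence for an abstract representation.
        clf = LinearSVC(C=0.1).fit(X_train, y_train)
        print("cross-context accuracy:", clf.score(X_test, y_test))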

  10. Maximizing setup accuracy using portal images as applied to a conformal boost technique for prostatic cancer

    Energy Technology Data Exchange (ETDEWEB)

    Bijhold, J.; Lebesque, J.V.; Hart, A.A.M.; Vijlbrief, R.E. (Nederlands Kanker Inst. ' Antoni van Leeuwenhoekhuis' , Amsterdam (Netherlands))

    1992-08-01

    A design procedure for a patient setup verification protocol based upon frequent digital acquisition of portal images is demonstrated with an application to conformal prostatic boost fields. The protocol aims at the elimination of large systematic deviations in the patient setup and includes decision rules which indicate when correction of the patient setup is needed. The decision rules were derived from the results of a theoretical and quantitative analysis of patient setup variations measured in three pelvic fields (one anterior-posterior and two lateral fields) of 105 fractions for nine patients. Deviations in patient positioning, derived from one field, were quantified as two-dimensional (2-D) displacement vectors in the plane perpendicular to the beam axis by alignment of anatomical features in the portal and the simulator image. The magnitude of the overall setup variations along the anterior-posterior, superior-inferior and lateral directions varied between 2.6 and 3 mm (1 S.D.). In addition, intra-treatment variations appeared to be predictable, which was a prerequisite for the development of the decision rules. The 2-D setup deviations measured in the three fields of one fraction were strongly correlated, and a 3-D displacement vector was calculated. Utilization of this 3-D vector in a setup verification protocol may lead to early detection of systematic setup deviations. (author). 19 refs., 5 figs., 3 tabs.
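
    A decision rule of the kind derived in the paper can be sketched as a running mean of the 3-D setup deviations checked against an action level. Both the deviations and the 4 mm threshold below are invented for illustration, not the paper's values:

        import numpy as np

        rng = np.random.default_rng(5)

        # Hypothetical per-fraction 3-D setup deviations in mm (AP, SI,
        # LAT), as derived from the three portal fields of each fraction.
        deviations = rng.normal(loc=[2.5, -1.0, 0.5], scale=2.8, size=(10, 3))

        ACTION_LEVEL_MM = 4.0  # assumed action level, not the paper's rule

        # Sketch of a decision rule: flag a systematic deviation when the
        # running mean over the first k fractions leaves the action sphere.
        for k in range(2, len(deviations) + 1):
            mean_dev = deviations[:k].mean(axis=0)
            if np.linalg.norm(mean_dev) > ACTION_LEVEL_MM:
                print(f"correct setup after fraction {k}:",
                      mean_dev.round(1), "mm")
                break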

  11. Fielding the magnetically applied pressure-shear technique on the Z accelerator (completion report for MRT 4519).

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, C. Scott; Haill, Thomas A.; Dalton, Devon Gardner; Rovang, Dean Curtis; Lamppa, Derek C.

    2013-09-01

    The recently developed Magnetically Applied Pressure-Shear (MAPS) experimental technique to measure material shear strength at high pressures on magneto-hydrodynamic (MHD) drive pulsed power platforms was fielded on August 16, 2013 on shot Z2544 utilizing hardware set A0283A. Several technical and engineering challenges were overcome in the process leading to the attempt to measure the dynamic strength of NNSA Ta at 50 GPa. The MAPS technique relies on the ability to apply an external magnetic field properly aligned and time correlated with the MHD pulse. The load design had to be modified to accommodate the external field coils and additional support was required to manage stresses from the pulsed magnets. Further, this represents the first time transverse velocity interferometry has been applied to diagnose a shot at Z. All subsystems performed well with only minor issues related to the new feed design which can be easily addressed by modifying the current pulse shape. Despite the success of each new component, the experiment failed to measure strength in the samples due to spallation failure, most likely in the diamond anvils. To address this issue, hydrocode simulations are being used to evaluate a modified design using LiF windows to minimize tension in the diamond and prevent spall. Another option to eliminate the diamond material from the experiment is also being investigated.

  12. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  13. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result over that if conventional source-sampling methods are used. However, this gain in reliability is substantially less than that observed in the model-problem results.
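
    The essence of stratified source-sampling is to guarantee each loosely coupled unit its share of source particles instead of leaving that to multinomial chance. A toy sketch of one such stratification; the weights and history count are illustrative, not the configurations studied:

        import numpy as np

        rng = np.random.default_rng(6)

        # Illustrative fission-source weights for five loosely coupled
        # units; neither the weights nor the counts are from the study.
        weights = np.array([0.55, 0.25, 0.10, 0.06, 0.04])
        n_hist = 1003  # odd count so the per-unit shares have fractions

        # Conventional sampling may leave a weak unit with no source
        # particles in some generation, the seed of the k-eff anomalies.
        conventional = rng.choice(len(weights), n_hist, p=weights)

        # Stratified variant: give each unit its integer share outright,
        # then sample only the fractional remainder.
        base = np.floor(weights * n_hist).astype(int)
        frac = weights * n_hist - base
        extra = rng.choice(len(weights), n_hist - base.sum(),
                           p=frac / frac.sum())
        stratified = np.concatenate(
            [np.repeat(np.arange(len(weights)), base), extra])
        print(np.bincount(stratified, minlength=len(weights)))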

  14. An Analysis of the Economy Principle Applied in Cyber Language

    Institute of Scientific and Technical Information of China (English)

    肖钰敏

    2015-01-01

    With the development of network technology, cyber language, a new social dialect, is widely used in our life. The author analyzes how the economy principle is applied in cyber language from three aspects: word-formation, syntax and non-linguistic symbols. The author also collects, summarizes and analyzes relevant language material to demonstrate the economy principle's real existence in chat rooms and the reason why it is applied so widely in cyberspace.

  15. A Review of Temporal Aspects of Hand Gesture Analysis Applied to Discourse Analysis and Natural Conversation

    Directory of Open Access Journals (Sweden)

    Renata C. B. Madeo

    2013-08-01

    Full Text Available Lately, there has been an increasing interest in hand gesture analysis systems. Recent works have employed pattern recognition techniques and have focused on the development of systems with more natural user interfaces. These systems may use gestures to control interfaces or recognize sign language gestures, which can provide systems with multimodal interaction; or consist in multimodal tools to help psycholinguists to understand new aspects of discourse analysis and to automate laborious tasks. Gestures are characterized by several aspects, mainly by movements and sequences of postures. Since data referring to movements or sequences carry temporal information, this paper presents a literature review about temporal aspects of hand gesture analysis, focusing on applications related to natural conversation and psycholinguistic analysis, using the Systematic Literature Review methodology. In our results, we organized works according to type of analysis, methods, highlighting the use of Machine Learning techniques, and applications.

  16. An Appraisal of Social Network Theory and Analysis as Applied to Public Health: Challenges and Opportunities.

    Science.gov (United States)

    Valente, Thomas W; Pitts, Stephanie R

    2017-03-20

    The use of social network theory and analysis methods as applied to public health has expanded greatly in the past decade, yielding a significant academic literature that spans almost every conceivable health issue. This review identifies several important theoretical challenges that confront the field but also provides opportunities for new research. These challenges include (a) measuring network influences, (b) identifying appropriate influence mechanisms, (c) the impact of social media and computerized communications, (d) the role of networks in evaluating public health interventions, and (e) ethics. Next steps for the field are outlined and the need for funding is emphasized. Recently developed network analysis techniques, technological innovations in communication, and changes in theoretical perspectives to include a focus on social and environmental behavioral influences have created opportunities for new theory and ever broader application of social networks to public health topics.

  17. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few which are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, combinational method, and simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  18. An applied general equilibrium model for Dutch agribusiness policy analysis.

    NARCIS (Netherlands)

    Peerlings, J.H.M.

    1993-01-01

    The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of interest.The model is fairly

  19. Applying aerial digital photography as a spectral remote sensing technique for macrophytic cover assessment in small rural streams

    Science.gov (United States)

    Anker, Y.; Hershkovitz, Y.; Gasith, A.; Ben-Dor, E.

    2011-12-01

    Although remote sensing of fluvial ecosystems is well developed, the tradeoff between spectral and spatial resolutions prevents its application in small streams (cognitive color) and high spatial resolution of aerial photography provides noise filtration and better sub-water detection capabilities than the HSR technique. C. Only the SRGB method applies for habitat and section scales; hence, its application together with in-situ grid transects for validation, may be optimal for use in similar scenarios. The HSR dataset was first degraded to 17 bands with the same spectral range as the RGB dataset and also to a dataset with 3 equivalent bands

  20. System Analysis Applying to Talent Resource Development Research

    Institute of Scientific and Technical Information of China (English)

    WANG Peng-tao; ZHENG Gang

    2001-01-01

    In talent resource development research, the most important aspects of talent resource forecasting and optimization are the structure of the talent resource, the required numbers and the talent quality. The article establishes a factor reconstruction analysis forecast and a talent quality model using the methods of system reconstruction analysis and of determining the most effective factor levels in a system, as presented by G. J. Klir and B. Jonesque, and performs a dynamic analysis of an example.

  1. Analysis of the Effect of Bedside Blood Perfusion Technique Applied to the Treatment of 100 Cases with Paraquat Poisoning

    Institute of Scientific and Technical Information of China (English)

    施开泰; 程贤军; 杨亚萍; 罗娟; 闵毅; 马思思

    2015-01-01

    Objective To analyze the value of the bedside blood perfusion technique in the treatment of patients with paraquat poisoning. Methods 100 patients with paraquat poisoning (common cases) treated in our hospital from August 2013 to March 2015 were selected and treated with the bedside blood perfusion technique, and their clinical data were reviewed. Results Of the 100 patients, the treatment was markedly effective in 72% and ineffective (including deaths) in 28%. Two hours after blood perfusion, the paraquat concentration in the urine was significantly lower than before perfusion (P<0.05), whereas the concentration measured 4 h after perfusion did not differ significantly from the 2 h value (P>0.05). Conclusion For patients with paraquat poisoning, the bedside blood perfusion technique can effectively remove paraquat from the body, relieve clinical symptoms, reduce toxic damage to body functions and improve the success rate of rescue.

  2. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  3. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of a SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  4. Modern Molecular Techniques Applied in Microbial Diversity and Quantitative Analysis of Anaerobic Fungi in the Rumen: A Review

    Institute of Scientific and Technical Information of China (English)

    沈博通; 曹阳春; 杨红建

    2011-01-01

    Anaerobic fungi play a significant role in promoting the degradation of fibrous feed in the rumen. Research advances of recent decades are reviewed concerning the biological taxonomy, classification and life cycle of rumen fungi. The rapid development and application of molecular biology methods, including rDNA sequence analysis, restriction fragment length polymorphism (RFLP) and automated ribosomal intergenic spacer analysis (ARISA), have opened a new pathway for the study of rumen fungal diversity. Owing to their high sensitivity and specificity, one or several of these techniques have been integrated into the taxonomic classification and molecular phylogenetic analysis of rumen fungi, and quantitative analysis of fungal biomass by the real-time polymerase chain reaction (PCR) technique has been successfully used to monitor anaerobic fungi in the rumen micro-ecological environment.

  5. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has proved that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté; its execution rests on precise technique during the performer's rotation. Performing this element requires not only good physical condition of the dancer, but also mastery of correct technique. On the basis of the corresponding kinematic theory, this study provides a qualitative analysis and quantitative assessment of fouettés at 720° performed by top Chinese dancers, using the method of stereoscopic imaging together with theoretical analysis.

  6. Different spectrophotometric methods applied for the analysis of binary mixture of flucloxacillin and amoxicillin: A comparative study

    Science.gov (United States)

    Attia, Khalid A. M.; Nassar, Mohammed W. I.; El-Zeiny, Mohamed B.; Serag, Ahmed

    2016-05-01

    Three different spectrophotometric methods were applied for the quantitative analysis of flucloxacillin and amoxicillin in their binary mixture, namely, ratio subtraction, absorbance subtraction and amplitude modulation. A comparative study was done listing the advantages and the disadvantages of each method. All the methods were validated according to the ICH guidelines and the obtained accuracy, precision and repeatability were found to be within the acceptable limits. The selectivity of the proposed methods was tested using laboratory prepared mixtures and assessed by applying the standard addition technique. So, they can be used for the routine analysis of flucloxacillin and amoxicillin in their binary mixtures.

  7. Analysis of OFDM Applied to Powerline High Speed Digital Communication

    Institute of Scientific and Technical Information of China (English)

    ZHUANG Jian; YANG Gong-xu

    2003-01-01

    The low-voltage powerline is becoming a powerful solution for home networking, building automation, and Internet access as a result of its wide distribution, easy access and low maintenance. The characteristics of the powerline channel are very complicated because it is an open network. This article analyses the characteristics of the powerline channel, introduces the basics of OFDM (Orthogonal Frequency Division Multiplexing), and studies OFDM applied to powerline high-speed digital communication.
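
    The heart of OFDM, an IFFT that places symbols on orthogonal subcarriers plus a cyclic prefix to absorb the powerline channel's echoes, fits in a few lines. A loopback sketch; the subcarrier count, prefix length and QPSK mapping are generic choices, not taken from the article:

        import numpy as np

        rng = np.random.default_rng(7)

        N_SUB = 64    # subcarriers
        CP_LEN = 16   # cyclic prefix, absorbs multipath echoes

        # Random QPSK symbols, one per subcarrier.
        bits = rng.integers(0, 2, (N_SUB, 2))
        qpsk = (2 * bits[:, 0] - 1 + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)

        # OFDM modulation: the IFFT puts each symbol on an orthogonal
        # carrier; the cyclic prefix makes channel convolution circular.
        time_sym = np.fft.ifft(qpsk)
        tx = np.concatenate([time_sym[-CP_LEN:], time_sym])

        # Receiver: drop the prefix and FFT back to the subcarriers.
        rx = np.fft.fft(tx[CP_LEN:])
        print("max demodulation error:", np.max(np.abs(rx - qpsk)))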

  8. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selective criteria to determine the applicability of applying engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and / or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  9. Applying a nonlinear, pitch-catch, ultrasonic technique for the detection of kissing bonds in friction stir welds.

    Science.gov (United States)

    Delrue, Steven; Tabatabaeipour, Morteza; Hettler, Jan; Van Den Abeele, Koen

    2016-05-01

    Friction stir welding (FSW) is a promising technology for the joining of aluminum alloys and other metallic admixtures that are hard to weld by conventional fusion welding. Although FSW generally provides better fatigue properties than traditional fusion welding methods, fatigue properties are still significantly lower than for the base material. Apart from voids, kissing bonds, for instance in the form of closed cracks propagating along the interface of the stirred and heat-affected zone, are inherent features of the weld and can be considered one of the main causes of the reduced fatigue life of FSW in comparison to the base material. The main problem with kissing bond defects in FSW is that they are currently very difficult to detect using existing NDT methods. Besides, in most cases, the defects are not directly accessible from the exposed surface. Therefore, new techniques capable of detecting small kissing bond flaws need to be introduced. In the present paper, a novel and practical approach is introduced based on a nonlinear, single-sided, ultrasonic technique. The proposed inspection technique uses two single-element transducers, with the first transducer transmitting an ultrasonic signal that focuses the ultrasonic waves at the bottom side of the sample where cracks are most likely to occur. The large amount of energy at the focus activates the kissing bond, resulting in the generation of nonlinear features in the wave propagation. These nonlinear features are then captured by the second transducer operating in pitch-catch mode, and are analyzed, using pulse inversion, to reveal the presence of a defect. The performance of the proposed nonlinear, pitch-catch technique is first illustrated using a numerical study of an aluminum sample containing simple, vertically oriented, incipient cracks. Later, the proposed technique is also applied experimentally on a real-life friction stir welded butt joint containing a kissing bond flaw.
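
    Pulse inversion, the nonlinear workhorse of the proposed technique, cancels the linear response of two phase-inverted excitations and leaves only even-order distortion such as that generated at a kissing bond. A toy numerical sketch; the quadratic "propagation" is a deliberate simplification, not the paper's model:

        import numpy as np

        t = np.linspace(0, 1e-5, 2000)
        f0 = 1e6  # Hz, assumed centre frequency
        pulse = np.sin(2 * np.pi * f0 * t) * np.hanning(t.size)

        def propagate(p, beta=0.0):
            """Toy propagation: a quadratic term mimics contact-acoustic
            nonlinearity at a kissing bond (beta = 0 for an intact weld)."""
            return p + beta * p**2

        for beta, label in [(0.0, "intact"), (0.2, "kissing bond")]:
            resp_pos = propagate(pulse, beta)
            resp_neg = propagate(-pulse, beta)
            # Pulse inversion: linear parts cancel, even-order terms add.
            residual = resp_pos + resp_neg
            print(label, "residual energy:", np.sum(residual**2))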

  10. Design, data analysis and sampling techniques for clinical research

    OpenAIRE

    Karthik Suresh; Sanjeev V Thomas; Geetha Suresh

    2011-01-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inference from their data analysis. Improper application of study design and data analysis may render insufficient and improper results and conclusion. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design and then back-translating the statistical results into relevant medical knowledge is a real challenge. This article explains...

  11. Some aspects of analytical chemistry as applied to water quality assurance techniques for reclaimed water: The potential use of X-ray fluorescence spectrometry for automated on-line fast real-time simultaneous multi-component analysis of inorganic pollutants in reclaimed water

    Science.gov (United States)

    Ling, A. C.; Macpherson, L. H.; Rey, M.

    1981-01-01

    The potential use of isotopically excited energy dispersive X-ray fluorescence (XRF) spectrometry for automated on-line fast real-time (5 to 15 minutes) simultaneous multicomponent (up to 20) trace (1 to 10 parts per billion) analysis of inorganic pollutants in reclaimed water was examined. Three anionic elements (chromium(VI), arsenic and selenium) were studied. The inherent lack of sensitivity of XRF spectrometry for these elements mandates use of a preconcentration technique and various methods were examined, including: several direct and indirect evaporation methods; ion exchange membranes; selective and nonselective precipitation; and complexation processes. It is shown that XRF spectrometry itself is well suited for automated on-line quality assurance, and can provide a nondestructive (and thus sample storage and repeat analysis capabilities) and particularly convenient analytical method. Further, the use of an isotopically excited energy dispersive unit (50 mCi Cd-109 source) coupled with a suitable preconcentration process can provide sufficient sensitivity to achieve the currently mandated minimum levels of detection without the need for high-power X-ray generating tubes.

  12. Factorial kriging analysis applied to geological data from petroleum exploration

    Energy Technology Data Exchange (ETDEWEB)

    Jaquet, O.

    1989-10-01

    A regionalized variable, thickness of the reservoir layer, from a gas field is decomposed by factorial kriging analysis. Maps of the obtained components may be associated with depositional environments that are favorable for petroleum exploration.

  13. Applying Galois compliance for data analysis in information systems

    Directory of Open Access Journals (Sweden)

    Kozlov Sergey

    2016-03-01

    Full Text Available The article deals with data analysis in information systems. The author discloses the possibility of using Galois compliance to identify characteristics of the information system structure, and reveals the specifics of applying Galois compliance to the analysis of information system content with the use of invariants of graph theory. Aspects of introducing the mathematical apparatus of Galois compliance for research on the interrelations between elements of an adaptive training information system for individual testing are analyzed.

  14. Analysis of a Reflectarray by Using an Iterative Domain Decomposition Technique

    Directory of Open Access Journals (Sweden)

    Carlos Delgado

    2012-01-01

    Full Text Available We present an efficient method for the analysis of different objects that may contain a complex feeding system and a reflector structure. The approach is based on a domain decomposition technique that divides the geometry into several parts to minimize the vast computational resources required when applying a full wave method. This technique is also parallelized by using the Message Passing Interface to minimize the memory and time requirements of the simulation. A reflectarray analysis serves as an example of the proposed approach.

  15. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
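
    The information distribution step, spreading each crisp observation over a set of monitoring points so that a small sample yields a smooth risk estimate, can be sketched with a Gaussian diffusion kernel. The sample values, monitoring points and diffusion coefficient below are illustrative, not the Shanghai data:

        import numpy as np

        # Small sample of observed annual fire counts (invented values).
        x = np.array([3.0, 5.0, 6.0, 9.0])
        u = np.arange(0, 13)  # monitoring points (possible counts)
        h = 1.5               # diffusion coefficient, assumed

        # Diffuse each crisp observation into a fuzzy set on u, each
        # observation spreading unit mass over the monitoring points.
        kernel = np.exp(-((x[:, None] - u[None, :]) ** 2) / (2 * h**2))
        kernel /= kernel.sum(axis=1, keepdims=True)

        # Aggregate and normalise into an estimated risk distribution.
        p = kernel.sum(axis=0)
        p /= p.sum()
        print("P(8 or more fires) ~", round(p[u >= 8].sum(), 3))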

  16. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behavior have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, the data mining method has disadvantages as well as advantages, so it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the customer's psychological condition at the time of purchase, and to show how suitable data mining methods can improve on conventional methods. Moreover, in an experiment, association rules are employed to mine rules for trusted customers using sales data in the supermarket industry.
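
    Association-rule mining of the kind used in the experiment reduces to counting support and confidence over transactions. A toy sketch on invented baskets; the thresholds and data are illustrative, and a real miner would use Apriori or FP-Growth rather than brute force:

        from itertools import combinations

        # Toy market-basket data standing in for supermarket sales records.
        baskets = [
            {"milk", "bread", "butter"},
            {"milk", "bread"},
            {"bread", "butter"},
            {"milk", "butter"},
            {"milk", "bread", "butter", "jam"},
        ]

        def support(itemset):
            return sum(itemset <= b for b in baskets) / len(baskets)

        # Enumerate simple one-to-one rules above support/confidence cuts.
        items = set().union(*baskets)
        for a, b in combinations(sorted(items), 2):
            for lhs, rhs in ((a, b), (b, a)):
                supp = support({lhs, rhs})
                conf = supp / support({lhs})
                if supp >= 0.4 and conf >= 0.7:
                    print(f"{lhs} -> {rhs} "
                          f"(support={supp:.2f}, confidence={conf:.2f})")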

  17. Applied network security monitoring collection, detection, and analysis

    CERN Document Server

    Sanders, Chris

    2013-01-01

    Applied Network Security Monitoring is the essential guide to becoming an NSM analyst from the ground up. This book takes a fundamental approach to NSM, complete with dozens of real-world examples that teach you the key concepts of NSM. Network security monitoring is based on the principle that prevention eventually fails. In the current threat landscape, no matter how much you try, motivated attackers will eventually find their way into your network. At that point, it is your ability to detect and respond to that intrusion that can be the difference between a small incident and a major di

  18. Evaluation of bond strength and thickness of adhesive layer according to the techniques of applying adhesives in composite resin restorations.

    Science.gov (United States)

    de Menezes, Fernando Carlos Hueb; da Silva, Stella Borges; Valentino, Thiago Assunção; Oliveira, Maria Angélica Hueb de Menezes; Rastelli, Alessandra Nara de Souza; Conçalves, Luciano de Souza

    2013-01-01

    Adhesive restorations have increasingly been used in dentistry, and the adhesive system application technique may determine the success of the restorative procedure. The aim of this study was to evaluate the influence of the application technique of two adhesive systems (Clearfil SE Bond and Adper Scotchbond MultiPurpose) on the bond strength and adhesive layer of composite resin restorations. Eight human third molars were selected and prepared with Class I occlusal cavities. The teeth were restored with composite using various application techniques for both adhesives, according to the following groups (n = 10): group 1 (control), systems were applied and the adhesive was immediately light activated for 20 seconds without removing excesses; group 2, excess adhesive was removed with a gentle jet of air for 5 seconds; group 3, excess was removed with a dry microbrush-type device; and group 4, a gentle jet of air was applied after the microbrush and then light activation was performed. After this, the teeth were submitted to microtensile testing. For the two systems tested, no statistical differences were observed between groups 1 and 2. Groups 3 and 4 presented higher bond strength values compared with the other studied groups, allowing the conclusion that excess adhesive removal with a dry microbrush could improve bond strength in composite restorations. Predominance of adhesive fracture and a thicker adhesive layer were observed via scanning electron microscopy (SEM) in groups 1 and 2. For groups 3 and 4, a mixed failure pattern and a thinner adhesive layer were verified. Clinicians should be aware that excess adhesive may negatively affect bond strength, whereas a thin, uniform adhesive layer appears to be favorable.

  19. Memory Forensics: Review of Acquisition and Analysis Techniques

    Science.gov (United States)

    2013-11-01

    This document presents an overview of the most common memory forensics techniques used in the acquisition and analysis of a system's volatile memory (Grant Osborne, Cyber and Electronic Warfare Division, Defence Science and Technology Organisation, DSTO–GD–0770). The types of digital evidence investigated include images, text, video and audio files [1]. To date, digital forensic investigations have focused on the...

  20. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, together with a brief account of other techniques developed world over for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  1. Analysis On Classification Techniques In Mammographic Mass Data Set

    OpenAIRE

    K.K.Kavitha; Dr.A.Kangaiammal

    2015-01-01

    Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques deal with determining to which group each data instance is associated. They can deal with a wide variety of data, so that large amounts of data can be involved in processing. This paper deals with analysis of various data mining classification techniques such a...

  2. A new technique to measure the neutralizer cell gas line density applied to a DIII-D neutral beamline

    Energy Technology Data Exchange (ETDEWEB)

    Kessler, D.N.; Hong, R.M.; Riggs, S.P.

    1995-10-01

    The DIII-D tokamak employs eight ion sources for plasma heating. In order to obtain the maximum neutralization of energetic ions (providing maximum neutral beam power) and reduce the heat load on beamline internal components caused by residual energetic ions, sufficient neutral gas must be injected into the beamline neutralizer cell. The neutral gas flow rate must be optimized, however, since excessive gas will increase power losses due to neutral beam scattering and reionization. It is important, therefore, to be able to determine the neutralizer cell gas line density. A new technique which uses the ion source suppressor grid current to obtain the neutralizer cell gas line density has been developed. The technique uses the fact that slow ions produced by beam-gas interactions in the neutralizer cell during beam extraction are attracted to the negative potential applied to the suppressor grid, inducing current flow in the grid. By removing the dependence on beam energy and beam current a normalized suppressor grid current function can be formed which is dependent only on the gas line density. With this technique it is possible to infer the gas line density on a shot by shot basis.
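
    The shot-by-shot inference described, normalize the suppressor grid current to strip its beam-current and beam-energy dependence and then read the line density off a calibration curve, can be sketched as follows. The normalization function and every number are assumptions standing in for the paper's actual function and DIII-D data:

        import numpy as np

        # Illustrative calibration: normalized suppressor-grid current
        # versus known neutralizer line density (arbitrary units).
        line_density_cal = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
        norm_current_cal = np.array([0.12, 0.23, 0.33, 0.41, 0.48])

        def normalized_grid_current(i_supp, i_beam, v_beam):
            """Remove beam-current and beam-energy dependence; this exact
            form is an assumed stand-in for the paper's function."""
            return i_supp / (i_beam * np.sqrt(v_beam))

        # Shot-by-shot inference: invert the calibration by interpolation.
        i_norm = normalized_grid_current(i_supp=150.0, i_beam=60.0, v_beam=75.0)
        density = np.interp(i_norm, norm_current_cal, line_density_cal)
        print("inferred line density:", density)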

  3. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular compounds, like polyphenols (including flavonoids), pigments, vitamins and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented, as are the methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of CE application in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  4. Structured Analysis and Supervision Applied on Heavy Fuel Oil Tanks

    Directory of Open Access Journals (Sweden)

    LAKHOUA Mohamed Najeh

    2016-05-01

    Full Text Available This paper introduces the need for the structured analysis and real-time (SA-RT) method for control-command applications in a thermal power plant (TPP) using a supervisory control and data acquisition (SCADA) system. The architecture of a SCADA system in a TPP is then presented, followed by a significant example of a control-command application: the heavy fuel oil tanks of a TPP. An application of a structured analysis method generally used in industry, on the basis of the SA-RT formalism, is then presented; the different modules are represented and described: Context Diagram, Data Flow Diagrams, Control Flow Diagrams, State Transition Diagram, Timing Specifications and Requirements Dictionary. Finally, this functional and operational analysis allows us to assist the different steps of the specification, programming and configuration of a new tabular display in a SCADA system.

  5. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
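
    The age equation the abstract refers to can be written, in the conventional notation of luminescence dating (the symbols are standard usage, not taken from this record):

```latex
\mathrm{Age} \;=\; \frac{D_e}{\dot{D}}
```

    where $D_e$ is the equivalent (accumulated) dose in Gy and $\dot{D}$ is the annual dose rate in Gy/a; the nuclear analyses of minor and trace elements enter through the estimate of $\dot{D}$.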

  6. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting product and the mold at the same time, considering the thermal contraction of the casting and the thermal expansion of the mold. An analysis that considers contact between the casting and the mold enables precise prediction of the stress distribution and of defects such as hot tearing. However, it is difficult to generate an FEM mesh for the interface of the casting and the mold, and the mesh for the mold domain consumes a great deal of computational time and memory owing to its large number of elements. Consequently, we propose the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to model the contact between the casting and the mold. In general, the volume of the mold is much larger than that of the casting, so the proposed technique greatly decreases the number of elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and it gave satisfactory results.
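
    The contact treatment can be pictured with a one-dimensional toy version of the spring bar element (a minimal sketch under our own assumptions about stiffness and sign conventions, not the authors' implementation):

```python
# Minimal 1-D sketch of the "virtual mold" idea: instead of meshing the
# mold, each casting surface node is tied to a virtual rigid mold wall
# through a spring bar element that acts only in compression, i.e. only
# when the node is pressed against the wall.

def spring_bar_force(u_node, gap, k=1.0e9):
    """Contact force (N) on a casting surface node.

    u_node : outward normal displacement of the node (m)
    gap    : initial clearance between node and mold wall (m)
    k      : assumed spring stiffness (N/m), an illustrative value
    """
    penetration = u_node - gap
    # The spring resists penetration; a separated node feels no force.
    return -k * penetration if penetration > 0.0 else 0.0

# Example: a node displaces 12 um outward against a 10 um clearance.
print(spring_bar_force(12e-6, 10e-6))   # -> -2000.0 N, pushing the node back
```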

  7. Applying Content Analysis to Web-based Content

    OpenAIRE

    Kim, Inhwa; Kuljis, Jasna

    2010-01-01

    Using Content Analysis on Web-based content, in particular the content available on Web 2.0 sites, is investigated. The relative strengths and limitations of the method are described. To illustrate how content analysis may be used, we provide a brief overview of a case study that investigates cultural impacts on the use of design features with regard to self-disclosure on the blogs of South Korean and United Kingdom users. In this study we took a standard approach to conducting the content an...

  8. Applying Adult Learning Theory through a Character Analysis

    Science.gov (United States)

    Baskas, Richard S.

    2011-01-01

    The purpose of this study is to analyze the behavior of a character, Celie, in the movie "The Color Purple" through the lens of two adult learning theorists to determine the relationships the character has with each theory. The development and portrayal of characters in movies can be explained and understood by the analysis of adult learning…

  9. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    Science.gov (United States)

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility in teaching language to children with autism and various other disorders. However, learned language can be forgotten, as is the case for many elderly people suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  10. Applying an Activity System to Online Collaborative Group Work Analysis

    Science.gov (United States)

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  11. Action, Content and Identity in Applied Genre Analysis for ESP

    Science.gov (United States)

    Flowerdew, John

    2011-01-01

    Genres are staged, structured, communicative events, motivated by various communicative purposes, and performed by members of specific discourse communities (Swales 1990; Bhatia 1993, 2004; Berkenkotter & Huckin 1995). Since its inception, with the two seminal works on the topic by Swales (1990) and Bhatia (1993), genre analysis has taken pride of…

  12. Structural dynamic responses analysis applying differential quadrature method

    Institute of Scientific and Technical Information of China (English)

    PU Jun-ping; ZHENG Jian-jun

    2006-01-01

    Unconditionally stable, higher-order accurate time step integration algorithms based on the differential quadrature method (DQM) for second-order initial value problems were applied, and the quadrature rules of the DQM, the computation of the weighting coefficients and the choice of sampling grid points were discussed. Several numerical examples were computed, dealing with the heat transfer problem, the second-order differential equations of imposed vibration of linear single-degree-of-freedom and double-degree-of-freedom systems, a nonlinear equation of motion, and a beam forced by a changing load. The results indicated that the algorithm can produce highly accurate solutions with minimal time consumption, and that the total energy of the system remains conserved in the numerical computation.
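
    The core of the method is the weighting-coefficient construction, which a short sketch makes concrete (our generic implementation of the standard DQ product formula, not the authors' code; the grid size and test function are illustrative):

```python
import numpy as np

def dqm_weights(x):
    """First-order DQ weighting matrix A on grid points x, using the
    product formula a_ij = M(x_i) / ((x_i - x_j) M(x_j)), where
    M(x_i) = prod_{k != i} (x_i - x_k)."""
    n = len(x)
    M = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                A[i, j] = M[i] / ((x[i] - x[j]) * M[j])
        A[i, i] = -A[i].sum()   # rows must annihilate constants
    return A

# Chebyshev-Gauss-Lobatto sampling points on [0, 1], common in DQM.
n = 15
x = 0.5 * (1 - np.cos(np.pi * np.arange(n) / (n - 1)))

A = dqm_weights(x)       # first-order derivative operator
B = A @ A                # second-order weighting matrix

f = np.sin(2 * np.pi * x)
err1 = np.max(np.abs(A @ f - 2 * np.pi * np.cos(2 * np.pi * x)))
err2 = np.max(np.abs(B @ f + (2 * np.pi) ** 2 * np.sin(2 * np.pi * x)))
print(f"max error, 1st derivative: {err1:.2e}")
print(f"max error, 2nd derivative: {err2:.2e}")
```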

  13. A Comparative Analysis of Techniques for PAPR Reduction of OFDM Signals

    Directory of Open Access Journals (Sweden)

    M. Janjić

    2014-06-01

    Full Text Available In this paper the problem of high Peak-to-Average Power Ratio (PAPR) in Orthogonal Frequency-Division Multiplexing (OFDM) signals is studied. Besides describing three techniques for PAPR reduction, SeLective Mapping (SLM), Partial Transmit Sequence (PTS) and Interleaving, a detailed analysis of the performance of these techniques is carried out for various values of the relevant parameters (number of phase sequences, number of interleavers, number of phase factors, and number of subblocks, depending on the applied technique). Simulation of these techniques is run in Matlab. Results are presented in the form of Complementary Cumulative Distribution Function (CCDF) curves for the PAPR of 30000 randomly generated OFDM symbols. Simulations are performed for OFDM signals with 32 and 256 subcarriers, oversampled by a factor of 4. A detailed comparison of the techniques is made on the basis of the Matlab simulation results.
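
    The flavor of the experiment can be reproduced in a few lines (a sketch in Python rather than the paper's Matlab; the QPSK mapping and the number of SLM candidates are our choices):

```python
import numpy as np

# Sketch of the Monte Carlo experiment described above: PAPR statistics
# of random QPSK-modulated OFDM symbols, with and without SLM.

rng = np.random.default_rng(0)
N, L, U, n_sym = 32, 4, 8, 10000   # subcarriers, oversampling, SLM maps, symbols

def papr_db(X):
    """PAPR (dB) of one frequency-domain OFDM symbol X, oversampled by
    factor L via zero-padding in the middle of the spectrum."""
    x = np.fft.ifft(np.concatenate([X[:N // 2],
                                    np.zeros((L - 1) * N),
                                    X[N // 2:]]))
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

papr_plain, papr_slm = [], []
qpsk = np.array([1, -1, 1j, -1j])
for _ in range(n_sym):
    X = rng.choice(qpsk, size=N)
    papr_plain.append(papr_db(X))
    # SLM: modulate with U random phase sequences, transmit the best one.
    phases = rng.choice(qpsk, size=(U, N))
    papr_slm.append(min(papr_db(X * ph) for ph in phases))

# CCDF points: probability that the PAPR exceeds a threshold.
for thr in (6, 8, 10):
    p0 = np.mean(np.asarray(papr_plain) > thr)
    p1 = np.mean(np.asarray(papr_slm) > thr)
    print(f"P(PAPR > {thr:2d} dB): plain {p0:.4f}, SLM {p1:.4f}")
```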

  14. Analysis of Far-Field Radiation from Apertures Using Monte Carlo Integration Technique

    Directory of Open Access Journals (Sweden)

    Mohammad Mehdi Fakharian

    2014-12-01

    Full Text Available An integration technique based on Monte Carlo Integration (MCI) is proposed for the analysis of electromagnetic radiation from apertures. The technique applied to the calculation of aperture antenna radiation patterns is the equivalence principle followed by physical optics, which can then be used to compute far-field antenna radiation patterns. However, this technique is often mathematically complex because it requires integration over a closed surface. This paper presents an extremely simple formulation to calculate the far fields of some types of aperture radiators using the MCI technique. The accuracy and effectiveness of the technique are demonstrated in three cases of radiation from apertures, and the results are compared with solutions obtained using FE simulation and Gaussian quadrature rules.
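
    For a concrete picture of the MCI idea, consider a uniform aperture cut where the radiation integral has a closed form to check against (a toy case under our own assumptions, not the paper's geometries):

```python
import numpy as np

# Toy illustration of Monte Carlo Integration (MCI) for aperture
# radiation: the far-field integral over a uniform aperture cut is
# estimated from random sample points and checked against the
# closed-form sinc pattern.

rng = np.random.default_rng(1)
lam = 1.0                    # wavelength
k = 2 * np.pi / lam
a = 4 * lam                  # aperture width
n_samples = 20000

theta = np.radians(np.arange(0, 91))
x = rng.uniform(-a / 2, a / 2, n_samples)   # random points on the aperture

# MC estimate of F(theta) = (1/a) * int_{-a/2}^{a/2} exp(j k x sin(theta)) dx
F_mc = np.array([np.mean(np.exp(1j * k * x * np.sin(t))) for t in theta])

# Closed form for the same cut: sin(u)/u with u = k a sin(theta) / 2
u = k * a * np.sin(theta) / 2
F_exact = np.sinc(u / np.pi)     # np.sinc(z) = sin(pi z) / (pi z)

print("max |MC - exact| over the cut:", np.max(np.abs(F_mc - F_exact)))
```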

  15. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM{sup TM}) staged approach allowed the company to visualize the value and risk associated with the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. The final insights and results were used to populate the final strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)

  16. Orthogonal projection (OP) technique applied to pattern recognition of fingerprints of the herbal medicine houttuynia cordata Thunb. and its final injection products.

    Science.gov (United States)

    Zeng, Zhong-Da; Liang, Yi-Zeng; Zhang, Ting; Chau, Foo-Tim; Wang, Ya-Li

    2006-05-01

    It is a crucial issue to determine the origins of herbal medicinal materials and to identify the quality grades and fakes among their final products collected from different pharmaceutical corporations. Pattern recognition techniques may assist manufacturers to achieve this purpose and to effectively control the quality of their products. In this work, a method widely used in chemometrics, the orthogonal projection (OP) technique, was applied to discrimination analysis and identification of fingerprints of the herbal medicine houttuynia cordata Thunb. (HCT) and its final injection products. The advantages of the OP technique are clearly shown by comparison with conventional methods such as principal component analysis (PCA), Mahalanobis distance (MD), and the similarity comparison method (SCM). Three different sources of the medicinal material HCT and final injection products from six different manufacturers were studied under 'sixfold', 'threefold' and 'threefold-bis' cross-validation procedures. The good performance of the proposed method in the determination and identification of unknown samples shows it could be a powerful tool for quality control in herbal medicine production and other related research fields.

  17. A Spatial Lattice Model Applied for Meteorological Visualization and Analysis

    Directory of Open Access Journals (Sweden)

    Mingyue Lu

    2017-03-01

    Full Text Available Meteorological information has obvious spatial-temporal characteristics. Although it is meaningful to employ a geographic information system (GIS) to visualize and analyze meteorological information for better identification and forecasting of meteorological weather, so as to reduce meteorological disaster losses, modeling meteorological information based on a GIS is still difficult because meteorological elements generally have no stable shape or clear boundary. To date, there are still few GIS models that can satisfy the requirements of both meteorological visualization and analysis. In this article, a spatial lattice model based on sampling particles is proposed to support both the representation and the analysis of meteorological information. In this model, a spatial sampling particle is regarded as the basic element: it carries the meteorological information together with the location where the particle is placed and a time mark. The location information is generally represented using a point. As these points can be extended to a surface in two dimensions and a voxel in three dimensions, if these surfaces and voxels occupy a certain space, then this space can be represented using spatial sampling particles with their point locations and meteorological information. The full meteorological space can then be represented by arranging numerous particles with their point locations in a certain structure and resolution, i.e., the spatial lattice model, extended at a higher resolution when necessary. For practical use, the meteorological space is logically classified into three types of spaces, namely the projection surface space, the curved surface space, and the stereoscopic space, and application-oriented spatial lattice models with different organization forms of spatial sampling particles are designed to support the representation, inquiry, and analysis of meteorological information within the three types of surfaces. Cases
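
    A minimal data-structure sketch of such a lattice of sampling particles might look as follows (field names and the indexing scheme are our assumptions, not the authors' schema):

```python
from dataclasses import dataclass
from typing import Dict, Tuple

# Sketch of the "spatial sampling particle" idea: each particle stores
# a point location, a time mark and the meteorological values; the
# lattice is a dictionary keyed by grid indices so it can be refined
# locally at a higher resolution when necessary.

@dataclass
class SamplingParticle:
    lon: float
    lat: float
    height: float              # 0 for projection/curved-surface spaces
    time: str                  # time mark, e.g. ISO-8601
    values: Dict[str, float]   # e.g. {"T": 290.4, "RH": 0.71}

Lattice = Dict[Tuple[int, int, int], SamplingParticle]

def make_lattice(res_deg: float, field) -> Lattice:
    """Fill a coarse lon/lat lattice from a callable field(lon, lat)."""
    lattice: Lattice = {}
    ni, nj = int(360 / res_deg), int(180 / res_deg)
    for i in range(ni):
        for j in range(nj):
            lon, lat = -180 + i * res_deg, -90 + j * res_deg
            lattice[(i, j, 0)] = SamplingParticle(
                lon, lat, 0.0, "2007-08-01T00:00Z", {"T": field(lon, lat)})
    return lattice

lat0 = make_lattice(10.0, lambda lon, lat: 300 - 0.5 * abs(lat))
print(len(lat0), lat0[(0, 9, 0)])
```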

  18. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  19. Applying Cognitive Work Analysis to Time Critical Targeting Functionality

    Science.gov (United States)

    2004-10-01

    [The record's abstract consists of extracted report fragments.] The recoverable content concerns the design of the Dynamic Target List/Dynamic Target Queue (DTL/DTQ) table, a critical component of the TCTF CUI, including task-step figures for the stated goals and a GUI working group convened to brainstorm the order of columns in the DTL/DTQ table, with successful results. A partial acronym glossary survives: CWA Cognitive Work Analysis; DTD Display Task Description; DTL/DTQ Dynamic Target List/Dynamic Target Queue; FDO Fighter Duty Officer; FEBA Forward Edge

  20. Flow analysis techniques as effective tools for the improved environmental analysis of organic compounds expressed as total indices.

    Science.gov (United States)

    Maya, Fernando; Estela, José Manuel; Cerdà, Víctor

    2010-04-15

    The scope of this work is an overview of the current state of the art in flow analysis techniques applied to the environmental determination of organic compounds expressed as total indices. Flow analysis techniques are proposed as effective tools for quickly obtaining preliminary chemical information about the occurrence of organic compounds in the environment prior to the use of more complex, time-consuming and expensive instrumental techniques. Recently improved flow-based methodologies for the determination of chemical oxygen demand, halogenated organic compounds and phenols are presented and discussed in detail. The aim of the present work is to highlight flow-based techniques as vanguard tools for the determination of organic compounds in environmental water samples.

  1. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600-750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis, and a fluorometric discrimination technique for determining phytoplankton populations was developed. For laboratory mixed samples prepared from the 43 algal species (with the algae of one division accounting for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the division level were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively. For samples mixed from 32 red tide algal species (with the dominant species accounting for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the genus level were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples in which the dominant species accounted for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the division and genus levels, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the developing of an in situ algae fluorescence auto-analyzer for
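
    The discrimination pipeline can be sketched in miniature (our simplified stand-in using PyWavelets and synthetic spectra; this is not the authors' two-rank database, and the species and peak shapes are invented for illustration):

```python
import numpy as np
import pywt   # PyWavelets

# Sketch of the fluorometric discrimination idea: each reference
# spectrum is compressed to low-frequency wavelet coefficients, and an
# unknown sample is assigned to the reference with the highest
# correlation of those coefficients.

rng = np.random.default_rng(2)
wavelengths = np.linspace(600, 750, 128)

def gaussian_peak(center, width):
    return np.exp(-((wavelengths - center) / width) ** 2)

# Hypothetical reference emission spectra for three "divisions".
references = {
    "Bacillariophyta": gaussian_peak(680, 12),
    "Dinophyta":       gaussian_peak(675, 9) + 0.3 * gaussian_peak(720, 15),
    "Chlorophyta":     gaussian_peak(685, 14) + 0.2 * gaussian_peak(650, 10),
}

def features(spectrum, wavelet="db4", level=3):
    """Normalized low-frequency wavelet approximation coefficients."""
    approx = pywt.wavedec(spectrum, wavelet, level=level)[0]
    return approx / np.linalg.norm(approx)

ref_feats = {name: features(s) for name, s in references.items()}

# Simulated mixture: 75% Dinophyta plus a minor contribution and noise.
mix = 0.75 * references["Dinophyta"] + 0.25 * references["Chlorophyta"]
mix += rng.normal(0, 0.02, mix.size)

scores = {name: float(f @ features(mix)) for name, f in ref_feats.items()}
print(max(scores, key=scores.get), scores)
```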

  2. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for the identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of the obtained peptide-spectrum matches (PSMs) needs to be evaluated, also by algorithms, as manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem in which sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance of retrieving more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. Conclusion Our approach not only enhances the computational performance, and
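
    The gist of replacing a linear decision boundary with a nonlinear one can be illustrated on synthetic PSM-like features (this is not the MUMAL implementation; the features, data and classifier choice below are our assumptions):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Correct and decoy peptide-spectrum matches are simulated with
# overlapping score distributions; a small neural network then draws
# the nonlinear boundary that a single linear cutoff cannot.

rng = np.random.default_rng(3)
n = 4000
# Two toy features, e.g. search-engine score and mass-error magnitude.
correct = np.column_stack([rng.normal(3.0, 1.0, n), rng.normal(0.5, 0.3, n)])
decoy   = np.column_stack([rng.normal(1.0, 1.2, n), rng.normal(1.2, 0.6, n)])
X = np.vstack([correct, decoy])
y = np.r_[np.ones(n), np.zeros(n)]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)

# Accept PSMs above a probability threshold chosen for a target error rate.
p = clf.predict_proba(X_te)[:, 1]
accepted = p > 0.9
fdr = 1 - y_te[accepted].mean()
print(f"accepted {accepted.sum()} PSMs, empirical FDR ~ {fdr:.3f}")
```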

  3. Geological-geophysical techniques applied to urban planning in karst hazardous areas. Case study of Zaragoza, NE Spain

    Science.gov (United States)

    Pueyo Anchuela, O.; Soriano, A.; Casas Sainz, A.; Pocoví Juan, A.

    2009-12-01

    Industrial and urban growth must, in some settings, deal with geological hazards. In the last 50 years, the city of Zaragoza (NE Spain) has expanded its urbanized area at a rate several orders higher than expected from its population increase. This fast growth has affected several areas around the city that were not previously used for construction. Maps of the Zaragoza city area from the end of the XIXth century and the beginning of the XXth reveal the presence of karst hazards in several zones that can also be observed in more modern data, such as aerial photographs taken from 1927 to the present. Urban and industrial development has covered many of these hazardous zones, even though the potential risks were known. The origin of the karst problems is the dissolution of evaporites (mainly gypsum, glauberite and halite) that constitute the Miocene substratum of the Zaragoza area, underlying the Quaternary terraces and pediments related to the Ebro River and its tributaries. Historical data show the persistence of subsidence foci over long periods of time, whereas recently urbanized areas do not share this stability, with increases in activity and/or affection radius observed within short periods after building over. These problems can be related to two factors: i) urban development over hazardous areas can increase karst activity, and ii) the affection radius is not properly established with the commonly applied methods. One way to develop such detailed maps is the geophysical approach. The applied geophysical routine, dependent on the characteristics of the surveyed area, is based on potential-field geophysical techniques (magnetometry and gravimetry) and others related to the application of induced fields (EM and GPR). The obtained results can be related to straightforward criteria, such as the detection of cavities in the subsoil, and to indirect indicators of the long-term activity of the subsidence areas

  4. Chemical analysis applied to the radiation sterilization of solid ketoprofen

    Science.gov (United States)

    Colak, S.; Maquille, A.; Tilquin, B.

    2006-01-01

    The aim of this work is to investigate the feasibility of radiation sterilization of ketoprofen from a chemical point of view. Although irradiated ketoprofen has already been studied in the literature [Katusin-Razem et al., Radiat. Phys. Chem. 73 111-116 (2005)], new results are obtained on the basis of electron spin resonance (ESR) measurements and the use of hyphenated techniques (GC-MS and LC-MS). The ESR spectrum of irradiated ketoprofen consists of four unresolved resonance peaks, and the mean G-value of ketoprofen is found to be 4 +/- 0.9 nmoles/J, which is very small. HPLC-UV analyses indicate that no significant loss of ketoprofen is detected after irradiation. LC-MS-MS analyses show that the structures of the non-volatile final products are similar to ketoprofen. Benzaldehyde is detected in the irradiated samples after dynamic-extraction GC-MS. The analyses show that ketoprofen is radioresistant and therefore might be radiosterilized.

  5. Nondestructive methods of analysis applied to oriental swords

    Directory of Open Access Journals (Sweden)

    Edge, David

    2015-12-01

    Full Text Available Various neutron techniques were employed at the Budapest Nuclear Centre in an attempt to find the most useful method for analysing the high-carbon steels found in Oriental arms and armour, such as those in the Wallace Collection, London. Neutron diffraction was found to be the most useful in terms of identifying such steels and also indicating the presence of hidden patterns.

  6. Applying Cuckoo Search for analysis of LFSR based cryptosystem

    Directory of Open Access Journals (Sweden)

    Maiya Din

    2016-09-01

    Full Text Available Cryptographic techniques are employed to minimize security hazards to sensitive information. To make systems more robust, the cyphers or crypts being used need to be analysed, for which cryptanalysts require ways to automate the process so that cryptographic systems can be tested more efficiently. Evolutionary algorithms provide one such resort, as they are capable of searching for globally optimal solutions very quickly. The Cuckoo Search (CS) algorithm has been used effectively in the cryptanalysis of conventional systems such as Vigenere and transposition cyphers. The Linear Feedback Shift Register (LFSR) is a crypto primitive used extensively in the design of cryptosystems. In this paper, we analyse an LFSR-based cryptosystem using Cuckoo Search to find the correct initial states of the LFSR used. Primitive polynomials of degree 11, 13, 17 and 19 are considered to analyse text crypts of length 200, 300 and 400 characters. Optimal solutions were obtained for the following CS parameters: Levy distribution parameter (β) = 1.5 and alien egg discovery probability (pa) = 0.25.
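
    A rough sketch of the attack setup follows (heavily simplified: the fitness below assumes a known-keystream setting, and the polynomial, nest count and update rule are our illustrative choices, not the paper's exact procedure):

```python
import math
import numpy as np

# An LFSR with a degree-11 primitive polynomial generates a keystream;
# Cuckoo Search then looks for the initial state whose keystream best
# matches the observed one.

rng = np.random.default_rng(4)
DEG = 11
TAPS = (11, 2)               # x^11 + x^2 + 1, a primitive trinomial

def keystream(state, n):
    """Generate n output bits from a Fibonacci LFSR initial state."""
    bits = np.empty(n, dtype=int)
    for i in range(n):
        bits[i] = state & 1
        fb = ((state >> (TAPS[0] - 1)) ^ (state >> (TAPS[1] - 1))) & 1
        state = (state >> 1) | (fb << (DEG - 1))
    return bits

true_state = int(rng.integers(1, 2 ** DEG))
observed = keystream(true_state, 200)

def fitness(state):
    return np.mean(keystream(int(state), 200) == observed)

# Cuckoo Search over the 2^11-point state space (beta = 1.5, pa = 0.25).
n_nests, pa, beta = 25, 0.25, 1.5
nests = rng.integers(1, 2 ** DEG, n_nests)
sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
         (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)

for _ in range(100):
    scores = np.array([fitness(s) for s in nests])
    best = nests[scores.argmax()]
    # Levy flights (Mantegna's algorithm) generate new candidate states.
    step = rng.normal(0, sigma, n_nests) / np.abs(rng.normal(0, 1, n_nests)) ** (1 / beta)
    cand = (best + (step * (nests - best)).astype(np.int64)) % (2 ** DEG)
    improve = np.array([fitness(c) for c in cand]) > scores
    nests[improve] = cand[improve]
    # Abandon a fraction pa of the worst nests.
    worst = np.argsort([fitness(s) for s in nests])[: int(pa * n_nests)]
    nests[worst] = rng.integers(1, 2 ** DEG, worst.size)

best = max(nests, key=fitness)
print(f"true state recovered: {int(best) == true_state}, fitness {fitness(best):.3f}")
```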

  7. A Multifactorial Analysis of Reconstruction Methods Applied After Total Gastrectomy

    Directory of Open Access Journals (Sweden)

    Oktay Büyükaşık

    2010-12-01

    Full Text Available Aim: The aim of this study was to evaluate the reconstruction methods applied after total gastrectomy in terms of postoperative symptomatology and nutrition. Methods: This retrospective study was conducted on 31 patients who underwent total gastrectomy for gastric cancer in the 2nd Clinic of General Surgery, SSK Ankara Training Hospital. Six different reconstruction methods were used and analyzed in terms of age, sex and postoperative complications. One biopsy specimen from the esophagus and two from the jejunum were taken through upper gastrointestinal endoscopy in all cases, and late-period morphological and microbiological changes were examined. Postoperative weight change, dumping symptoms, reflux esophagitis, solid/liquid dysphagia, early satiety, postprandial pain, diarrhea and anorexia were assessed. Results: Of the 31 patients, 18 were male and 13 female; the youngest was 33 years old and the oldest 69. Reconstruction without a pouch was performed in 22 cases and with a pouch in 9 cases. Early satiety, postprandial pain, dumping symptoms, diarrhea and anemia were found most commonly in cases reconstructed without a pouch. The rate of bacterial colonization of the jejunal mucosa was identical in both groups. Reflux esophagitis was seen most commonly with omega esophagojejunostomy (EJ) and least with Roux-en-Y, Tooley and Tanner 19 EJ. Conclusion: Reconstruction with a pouch after total gastrectomy remains a preferable method. (The Medical Bulletin of Haseki 2010; 48:126-31)

  8. Bayesian Information-Gap Decision Analysis Applied to a CO2 Leakage Problem

    Science.gov (United States)

    O'Malley, D.; Vesselinov, V. V.

    2014-12-01

    We describe a decision analysis in the presence of uncertainty that combines a non-probabilistic approach (information-gap decision theory) with a probabilistic approach (Bayes' theorem). Bayes' theorem is one of the most popular techniques for probabilistic uncertainty quantification (UQ). It is effective in many situations because it updates our understanding of the uncertainties by conditioning on real data using a mathematically rigorous technique. However, the application of Bayes' theorem in science and engineering is not always rigorous, for two reasons: (1) we can enumerate the possible outcomes of dice-rolling, but not the possible outcomes of real-world contamination remediation; (2) we can precisely determine conditional probabilities for coin-tossing, but substantial uncertainty surrounds the conditional probabilities for real-world contamination remediation. Of course, Bayes' theorem is rigorously applicable beyond dice-rolling and coin-tossing, but even in cases that are constructed to be simple, with ostensibly good probabilistic models, applying Bayes' theorem to the real world may not work as well as one might expect. Bayes' theorem is rigorously applicable only if all possible events can be described and their conditional probabilities derived rigorously. Outside of this domain it may still be useful, but its use lacks at least some rigor. The information-gap approach allows us to circumvent some of these shortcomings of Bayes' theorem. In particular, it provides a way to account for possibilities beyond those described by our models, and a way to deal with uncertainty in the conditional distribution that forms the core of Bayesian analysis. We have developed a three-tiered technique that enables one to make scientifically defensible decisions in the face of the severe uncertainty found in many geologic problems. To demonstrate its applicability, we apply the technique to a CO2 leakage problem. The goal is to
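
    A toy version of the robustness calculation at the heart of information-gap theory can make the idea concrete (the leakage model and numbers are invented for illustration, not the authors' CO2 model):

```python
import numpy as np

# A decision q is judged by how much uncertainty in a best-estimate
# parameter it can tolerate before performance becomes unacceptable.

def leakage(q, u):
    """Hypothetical leakage rate for remediation effort q and an
    uncertain site parameter u (e.g. an effective permeability)."""
    return u * np.exp(-2.0 * q)

u_est = 1.0          # best estimate of the uncertain parameter
crit = 0.05          # maximum acceptable leakage

def robustness(q, h_max=10.0, n=2000):
    """Largest uncertainty horizon h such that the worst-case leakage
    over the interval |u - u_est| <= h stays below the critical value."""
    for h in np.linspace(0.0, h_max, n):
        worst = leakage(q, u_est + h)   # leakage grows with u
        if worst > crit:
            return h
    return h_max

# More remediation effort buys a larger tolerable uncertainty horizon.
for q in (1.0, 1.5, 2.0):
    print(f"effort q = {q}: robustness h = {robustness(q):.3f}")
```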

  9. Applying temporal network analysis to the venture capital market

    Science.gov (United States)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.

  10. Methods of analysis applied on the e-shop Arsta

    OpenAIRE

    Flégl, Jan

    2013-01-01

    This bachelor thesis is focused on summarizing methods of e-shop analysis. The first chapter summarizes and describes the basics of e-commerce and e-shops in general. The second chapter deals with search engines, how they function, and the ways in which it is possible to influence the order of search results. Special attention is paid to search engine optimization and search engine marketing. The third chapter summarizes the basic tools of Google Analytics. The fourth chapter uses the findings of all the previous cha...

  11. Changes in elongation of falx cerebri during craniosacral therapy techniques applied on the skull of an embalmed cadaver.

    Science.gov (United States)

    Kostopoulos, D C; Keramidas, G

    1992-01-01

    Craniosacral therapy holds that light forces applied to the skull may be transmitted to the dura membrane, having a therapeutic effect on the cranial system. This study examines the changes in elongation of the falx cerebri during the application of some craniosacral therapy techniques to the skull of an embalmed cadaver. The study demonstrates that the relative elongation of the falx cerebri changes as follows: for the frontal lift, 1.44 mm; for the parietal lift, 1.08 mm; for the sphenobasilar compression, -0.33 mm; for the sphenobasilar decompression, 0.28 mm; and for the ear pull, inconclusive results. The present study offers validation for the scientific basis of craniosacral therapy and for the contention of cranial suture mobility.

  12. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.
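
    As a flavor of the computations underlying such qualitative analyses, a generic ratio-dependent predator-prey model can be integrated numerically (parameter values are illustrative, not those of the paper; continuation packages such as AUTO or XPPAUT would then trace the bifurcations):

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic ratio-dependent predator-prey model (illustrative parameters):
#   x' = r x (1 - x/K) - a x y / (x + h y)
#   y' = y (c a x / (x + h y) - d)

def rhs(t, z, r=1.0, K=2.0, a=1.2, h=1.0, c=0.5, d=0.25):
    x, y = z
    denom = x + h * y
    ratio = x / denom if denom > 0 else 0.0
    return [r * x * (1 - x / K) - a * y * ratio,
            y * (c * a * ratio - d)]

sol = solve_ivp(rhs, (0, 200), [0.5, 0.2], rtol=1e-8, atol=1e-10)
x_end, y_end = sol.y[:, -1]
print(f"state at t = 200: x = {x_end:.4f}, y = {y_end:.4f}")

# Repeating the run from several initial conditions (or continuing the
# equilibria) exposes limit cycles and structural instabilities that
# isocline sketches alone can miss.
```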

  13. Advanced spatio-temporal filtering techniques for photogrammetric image sequence analysis in civil engineering material testing

    Science.gov (United States)

    Liebold, F.; Maas, H.-G.

    2016-01-01

    The paper shows advanced spatial, temporal and spatio-temporal filtering techniques which may be used to reduce noise effects in photogrammetric image sequence analysis tasks and tools. As a practical example, the techniques are validated in a photogrammetric spatio-temporal crack detection and analysis tool applied in load tests in civil engineering material testing. The load test technique is based on monocular image sequences of a test object under varying load conditions. The first image of a sequence is defined as a reference image under zero load, wherein interest points are determined and connected in a triangular irregular network structure. For each epoch, these triangles are compared to the reference image triangles to search for deformations. The result of the feature point tracking and triangle comparison process is a spatio-temporally resolved strain value field, wherein cracks can be detected, located and measured via local discrepancies. The strains can be visualized as a color-coded map. In order to improve the measuring system and to reduce noise, the strain values of each triangle must be treated in a filtering process. The paper shows the results of various filter techniques in the spatial and in the temporal domain as well as spatio-temporal filtering techniques applied to these data. The best results were obtained by a bilateral filter in the spatial domain and by a spatio-temporal EOF (empirical orthogonal function) filtering technique.
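
    The bilateral filter that performed best in the spatial domain can be sketched generically (our implementation on a toy strain map, not the authors' code; window size and filter parameters are illustrative):

```python
import numpy as np

# Bilateral filter on a toy strain field: weights fall off both with
# spatial distance and with difference in strain value, smoothing noise
# while preserving crack-induced discontinuities.

def bilateral_filter(img, radius=3, sigma_s=2.0, sigma_r=0.02):
    out = np.empty_like(img)
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    w_spatial = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma_s ** 2))
    pad = np.pad(img, radius, mode="edge")
    ny, nx = img.shape
    for i in range(ny):
        for j in range(nx):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            w = w_spatial * np.exp(-(win - img[i, j]) ** 2 / (2 * sigma_r ** 2))
            out[i, j] = (w * win).sum() / w.sum()
    return out

# Toy strain field: smooth gradient, a crack-like jump, additive noise.
rng = np.random.default_rng(5)
strain = np.fromfunction(lambda i, j: 0.001 * j, (64, 64))
strain[:, 32:] += 0.05
noisy = strain + rng.normal(0, 0.01, strain.shape)

rms = lambda e: float(np.sqrt(np.mean(e ** 2)))
print("rms error before:", round(rms(noisy - strain), 4),
      "after:", round(rms(bilateral_filter(noisy) - strain), 4))
```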

  14. Application of a sensitivity analysis technique to high-order digital flight control systems

    Science.gov (United States)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. The technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system, and the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed and then extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example, the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. The results show that linear models of real systems can be analyzed by this sensitivity technique if it is applied with care. A computer program called SVA was written to implement the singular-value sensitivity analysis techniques, so computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. SVA is a fully public-domain program running on the NASA/Dryden Elxsi computer.
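
    The core computation is easy to sketch on a generic loop (a toy 2x2 transfer matrix of our own invention, not the X-29 control laws, with a finite difference standing in for the analytic gradient equations):

```python
import numpy as np

# The minimum singular value of the return difference matrix I + L(jw)
# measures relative stability; its gradient with respect to a
# controller gain indicates sensitivity.

def loop_tf(w, kp):
    """Hypothetical 2x2 loop transfer matrix L(jw) with gain kp."""
    s = 1j * w
    g = 1.0 / (s ** 2 + 0.8 * s + 1.0)
    return np.array([[kp * g, 0.1 * g],
                     [0.05 * g, kp * 2.0 * g / (s + 2.0)]])

def min_sv_return_difference(w, kp):
    L = loop_tf(w, kp)
    return np.linalg.svd(np.eye(2) + L, compute_uv=False).min()

w_grid = np.logspace(-1, 1, 200)
kp = 1.0
sv = np.array([min_sv_return_difference(w, kp) for w in w_grid])
w_crit = w_grid[sv.argmin()]          # frequency of closest approach

# Finite-difference gradient of the critical singular value w.r.t. kp.
eps = 1e-6
grad = (min_sv_return_difference(w_crit, kp + eps)
        - min_sv_return_difference(w_crit, kp - eps)) / (2 * eps)
print(f"min sigma(I+L) = {sv.min():.4f} at w = {w_crit:.3f} rad/s, "
      f"d(sigma)/d(kp) = {grad:+.4f}")
```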

  15. Dynamical Systems Analysis Applied to Working Memory Data

    Directory of Open Access Journals (Sweden)

    Fidan eGasimova

    2014-07-01

    Full Text Available In the present paper we investigate weekly fluctuations in the working memory capacity (WMC assessed over a period of two years. We use dynamical system analysis, specifically a second order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement at the MU task is associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure’s performance using different numbers of B-spline knots and values of time delay embedding dimensions. Results show that the number of knots in the B-spline imputation influence accuracy more than the number of embedding dimensions.

  16. Operational modal analysis applied to the concert harp

    Science.gov (United States)

    Chomette, B.; Le Carrou, J.-L.

    2015-05-01

    Operational modal analysis (OMA) methods are useful for extracting the modal parameters of operating systems. These methods seem particularly interesting for investigating the modal basis of string instruments during operation, since they avoid certain disadvantages of conventional methods. However, the excitation in the case of string instruments is not optimal for OMA, due to the presence of damped harmonic components and low noise in the disturbance signal. The present study therefore investigates the least-square complex exponential (LSCE) method and a modified least-square complex exponential method in the case of a string instrument, to identify the modal parameters of the instrument while it is played. The efficiency of the approach is demonstrated experimentally on a concert harp excited by some of its strings, and the two methods are compared to a conventional modal analysis. The results show that OMA allows us to identify the modes particularly present in the instrument's response with a good estimation, especially if they are close to the excitation frequency, with the modified LSCE method.

  17. Data analysis techniques for nuclear and particle physicists

    CERN Document Server

    Pruneau, Claude

    2017-01-01

    This is an advanced data analysis textbook for scientists specializing in the areas of particle physics, nuclear physics, and related subfields. As a practical guide for robust, comprehensive data analysis, it focuses on realistic techniques to explain instrumental effects. The topics are relevant for engineers, scientists, and astroscientists working in the fields of geophysics, chemistry, and the physical sciences. The book serves as a reference for more senior scientists while being eminently accessible to advanced undergraduate and graduate students.

  18. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    Science.gov (United States)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  19. Sensitivity Analysis Applied in Design of Low Energy Office Building

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik

    2008-01-01

    Building performance can be expressed by different indicators such as primary energy use, environmental load and/or indoor environmental quality, and a building performance simulation can provide the decision maker with a quantitative measure of the extent to which an integrated design solution satisfies the design requirements and objectives. In the design of sustainable buildings it is beneficial to identify the most important design parameters in order to develop alternative design solutions more efficiently or to reach optimized design solutions. A sensitivity analysis makes it possible to identify the most important parameters in relation to building performance and to focus the design and optimization of sustainable buildings on these fewer, but most important, parameters. The sensitivity analyses will typically be performed at a reasonably early stage of the building design process, where...

  20. Downside Risk analysis applied to the Hedge Funds universe

    Science.gov (United States)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the portfolio management sectors that has shown the fastest growth over the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is, returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures on the Credit Suisse/Tremont Investable Hedge Fund Index data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
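
    The basic Downside Risk quantities are easy to state in code (generic formulas evaluated on simulated returns; the empirical study itself used the Credit Suisse/Tremont index data):

```python
import numpy as np

# Only returns below the investor's target contribute to Downside Risk,
# unlike the symmetric standard deviation behind the Sharpe ratio.

rng = np.random.default_rng(6)
# Simulated monthly returns with fat tails and negative skew, loosely
# mimicking hedge-fund statistics.
returns = (0.006 + 0.02 * rng.standard_t(df=4, size=240)
           - 0.005 * rng.exponential(1.0, 240))
target = 0.0   # investor's goal (minimum acceptable return)

downside = np.minimum(returns - target, 0.0)
downside_dev = np.sqrt(np.mean(downside ** 2))       # downside deviation
sortino = (returns.mean() - target) / downside_dev   # Sortino ratio
sharpe = (returns.mean() - target) / returns.std()   # Gaussian benchmark

print(f"downside deviation: {downside_dev:.4f}")
print(f"Sortino ratio     : {sortino:.3f}   Sharpe ratio: {sharpe:.3f}")
```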