WorldWideScience

Sample records for source analysis technique

  1. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
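
    To make the contrast concrete, the sketch below compares conventional multinomial source sampling with a simple stratified scheme in which each spatial stratum of the fission source is guaranteed close to its expected share of the source particles. This is a minimal illustration of the general idea, not the implementation used in the paper; the two-unit configuration and all numbers are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def conventional_sample(weights, n_particles):
        """Multinomial sampling: all source sites drawn from the global
        fission-source distribution; stratum populations fluctuate freely."""
        return rng.multinomial(n_particles, weights / weights.sum())

    def stratified_sample(weights, n_particles):
        """Stratified sampling: each stratum receives the integer part of its
        expected particle count; only the fractional remainders are sampled."""
        expected = n_particles * weights / weights.sum()
        counts = np.floor(expected).astype(int)
        remainder = int(n_particles - counts.sum())
        if remainder > 0:
            frac = expected - counts  # residual fractional weights
            counts += rng.multinomial(remainder, frac / frac.sum())
        return counts

    # two loosely coupled units, one holding 99% of the fission source
    weights = np.array([0.99, 0.01])
    print(conventional_sample(weights, 1000))  # small stratum fluctuates run to run
    print(stratified_sample(weights, 1000))    # small stratum pinned at its share
    ```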

  2. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yonggang

    2018-05-07

    In the implementation of nuclear safeguards, many different techniques are used to monitor the operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers and digital seals to open-source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have at most loose correlations with one another, it could be beneficial to analyze them together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning-based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential value in nuclear safeguards.

  3. Source-jerk analysis using a semi-explicit inverse kinetic technique

    International Nuclear Information System (INIS)

    Spriggs, G.D.; Pederson, R.A.

    1985-01-01

    A method is proposed for measuring the effective reproduction factor, k, in subcritical systems. The method uses the transient response of a subcritical system to the sudden removal of an extraneous neutron source (i.e., a source jerk). The response is analyzed using an inverse kinetic technique that least-squares fits the exact analytical solution corresponding to a source-jerk transient as derived from the point-reactor model. It has been found that the technique can provide an accurate means of measuring k in systems that are close to critical (i.e., 0.95 < k < 1.0). As a system becomes more subcritical (i.e., k << 1.0) spatial effects can introduce significant biases depending on the source and detector positions. However, methods are available that can correct for these biases and, hence, can allow measuring subcriticality in systems with k as low as 0.5. 12 refs., 3 figs
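
    As a toy illustration of the fitting step (not the authors' code), the sketch below least-squares fits a one-delayed-group point-kinetics form, a fast prompt decay plus a slow delayed-precursor decay, to a simulated source-jerk count-rate transient; in the point-reactor model the fitted constants would then yield the reactivity and hence k. All parameter values are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def source_jerk(t, a_prompt, alpha, a_delayed, lam, n_final):
        """Assumed one-delayed-group response after removal of the source:
        prompt decay + delayed-precursor decay + asymptotic level."""
        return a_prompt * np.exp(-alpha * t) + a_delayed * np.exp(-lam * t) + n_final

    # simulated detector data with Poisson counting noise (hypothetical values)
    t = np.linspace(0.0, 50.0, 500)
    rng = np.random.default_rng(1)
    data = rng.poisson(source_jerk(t, 800.0, 5.0, 200.0, 0.08, 0.0)).astype(float)

    p0 = [500.0, 1.0, 100.0, 0.1, 10.0]        # initial guess
    popt, pcov = curve_fit(source_jerk, t, data, p0=p0, maxfev=10000)
    print("fitted prompt/delayed decay constants:", popt[1], popt[3])
    ```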

  4. Point-source inversion techniques

    Science.gov (United States)

    Langston, Charles A.; Barker, Jeffrey S.; Pavlin, Gregory B.

    1982-11-01

    A variety of approaches for obtaining source parameters from waveform data using moment-tensor or dislocation point source models have been investigated and applied to long-period body and surface waves from several earthquakes. Generalized inversion techniques have been applied to data for long-period teleseismic body waves to obtain the orientation, time function and depth of the 1978 Thessaloniki, Greece, event, of the 1971 San Fernando event, and of several events associated with the 1963 induced seismicity sequence at Kariba, Africa. The generalized inversion technique and a systematic grid testing technique have also been used to place meaningful constraints on mechanisms determined from very sparse data sets; a single station with high-quality three-component waveform data is often sufficient to discriminate faulting type (e.g., strike-slip). Sparse data sets for several recent California earthquakes, for a small regional event associated with the Koyna, India, reservoir, and for several events at the Kariba reservoir have been investigated in this way. Although linearized inversion techniques using the moment-tensor model are often robust, even for sparse data sets, there are instances where the simplifying assumption of a single point source is inadequate to model the data successfully. Numerical experiments utilizing synthetic data and actual data for the 1971 San Fernando earthquake graphically demonstrate that severe problems may be encountered if source finiteness effects are ignored. These techniques are generally applicable to on-line processing of high-quality digital data, but source complexity and inadequacy of the assumed Green's functions are major problems which are yet to be fully addressed.
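
    The linearized inversion referred to here reduces to a least-squares problem d = Gm, where m holds the six independent moment-tensor components and the columns of G are the corresponding Green's-function responses. The sketch below solves it with synthetic G and d; it is a generic illustration, not the authors' processing chain.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # synthetic Green's functions: waveform samples x 6 moment-tensor
    # components (Mxx, Myy, Mzz, Mxy, Mxz, Myz) -- purely illustrative
    n_samples = 400
    G = rng.standard_normal((n_samples, 6))
    m_true = np.array([1.0, -0.5, -0.5, 0.3, 0.0, 0.2])     # hypothetical source
    d = G @ m_true + 0.05 * rng.standard_normal(n_samples)  # noisy "data"

    # generalized (least-squares) inversion for the moment tensor
    m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
    print("estimated moment tensor:", np.round(m_est, 3))
    ```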

  5. Analysis of jet-airfoil interaction noise sources by using a microphone array technique

    Science.gov (United States)

    Fleury, Vincent; Davy, Renaud

    2016-03-01

    The paper is concerned with the characterization of jet noise sources and jet-airfoil interaction sources using microphone array data. The measurements were carried out in Cepra19, the anechoic open test section wind tunnel of Onera. The microphone array technique relies on the convected acoustic analogy equation of Lighthill and of Ffowcs Williams and Hawkings. The cross-spectrum of the source term of the analogy equation is sought, defined as the optimal solution to a minimal error equation using the measured microphone cross-spectra as reference. This inverse problem is, however, ill-posed. A penalty term based on a localization operator is therefore added to improve the recovery of jet noise sources. The analysis of isolated jet noise data in the subsonic regime shows the contribution of the conventional mixing noise source in the low frequency range, as expected, and of uniformly distributed, uncorrelated noise sources in the jet flow at higher frequencies. In the underexpanded supersonic regime, a shock-associated noise source is clearly identified as well. An additional source is detected in the vicinity of the nozzle exit in both supersonic and subsonic regimes. In the presence of the airfoil, the distribution of the noise sources is deeply modified. In particular, a strong noise source is localized on the flap. For Strouhal numbers higher than about 2 (based on the jet mixing velocity and diameter), a significant contribution from the shear layer near the flap is also observed. Indications of acoustic reflections on the airfoil are also discerned.
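
    Stripped of the aeroacoustic detail, the inverse step described above is a penalized least-squares solve: measured microphone data are matched by propagated source strengths, with a regularization term added because the problem is ill-posed. The sketch below shows a Tikhonov-penalized version on a random transfer matrix; the paper's penalty is instead built from a localization operator, and all dimensions here are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    n_mics, n_grid = 32, 60
    # hypothetical frequency-domain transfer matrix (microphone <- grid point)
    A = rng.standard_normal((n_mics, n_grid)) + 1j * rng.standard_normal((n_mics, n_grid))
    q_true = np.zeros(n_grid, complex)
    q_true[[10, 45]] = [1.0, 0.5j]                 # two compact sources
    p = A @ q_true + 0.01 * rng.standard_normal(n_mics)

    # Tikhonov-regularized solution of the ill-posed inverse problem
    mu = 1e-2 * np.linalg.norm(A, 2) ** 2
    q_est = np.linalg.solve(A.conj().T @ A + mu * np.eye(n_grid), A.conj().T @ p)
    print(np.round(np.abs(q_est[[10, 45]]), 3))    # recovered source amplitudes
    ```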

  6. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levi for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. The NAA set-ups at CERT are briefly described. Finally, NAA is compared with other leading analytical techniques.
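
    The quantitative basis of NAA is the textbook activation equation: the induced activity of an irradiated sample is A = N·φ·σ·(1 - exp(-λ·t_irr))·exp(-λ·t_decay). The sketch below evaluates it for an illustrative 165Ho-like target; the flux, cross-section and timing values are assumed for the example, not taken from the paper.

    ```python
    import numpy as np

    # Activation equation used in NAA (all numbers illustrative)
    N_A = 6.022e23            # Avogadro's number, atoms/mol
    mass_g = 1e-6             # 1 microgram of analyte element (hypothetical)
    molar_mass = 164.93       # g/mol, a 165Ho-like mononuclidic element
    sigma_cm2 = 60e-24        # (n,gamma) cross-section, 60 barns (assumed)
    phi = 1e13                # thermal neutron flux, n cm^-2 s^-1 (typical reactor)
    lam = np.log(2) / (26.8 * 3600)   # decay constant for a 26.8 h product

    N = mass_g / molar_mass * N_A     # number of target atoms
    t_irr, t_decay = 3600.0, 600.0    # irradiation and cooling times, s
    A_bq = N * phi * sigma_cm2 * (1 - np.exp(-lam * t_irr)) * np.exp(-lam * t_decay)
    print(f"induced activity: {A_bq:.3e} Bq")
    ```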

  7. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have yet to reach a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis, electronic bombardment of atoms or molecules (gas ion source) and the thermal effect (thermionic source), are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermal-ionization spectrometer, while the sample consumption is between 10^5 and 10^6 times greater. This shows that almost the entire sample is not needed for the measurement itself; it is only required because of the introduction system of the gas spectrometer. The new analysis technique, referred to as "microfluorination", corrects this anomaly and exploits the advantages of the electron bombardment method of ionization.

  8. Blind source separation dependent component analysis

    CERN Document Server

    Xiang, Yong; Yang, Zuyuan

    2015-01-01

    This book provides readers with a complete and self-contained body of knowledge about dependent source separation, including the latest developments in this field. The book gives an overview of blind source separation, in which three promising blind separation techniques that can tackle mutually correlated sources are presented. It then focuses on the non-negativity-based methods, the time-frequency-analysis-based methods, and the pre-coding-based methods, respectively.

  9. A Systematic Review of Techniques and Sources of Big Data in the Healthcare Sector.

    Science.gov (United States)

    Alonso, Susel Góngora; de la Torre Díez, Isabel; Rodrigues, Joel J P C; Hamrioui, Sofiane; López-Coronado, Miguel

    2017-10-14

    The main objective of this paper is to present a review of existing research in the literature referring to Big Data sources and techniques in the health sector, and to identify which of these techniques are the most used in the prediction of chronic diseases. Academic databases and systems such as IEEE Xplore, Scopus, PubMed and Science Direct were searched, considering the date of publication from 2006 until the present time. Several search criteria were established, such as 'techniques' OR 'sources' AND 'Big Data' AND 'medicine' OR 'health', 'techniques' AND 'Big Data' AND 'chronic diseases', etc., and the papers considered of interest regarding the description of the techniques and sources of Big Data in healthcare were selected. A total of 110 articles on techniques and sources of Big Data in health was found, of which only 32 were identified as relevant work. Many of the articles describe Big Data platforms, sources and databases used, and identify the techniques most used in the prediction of chronic diseases. From the review of the analyzed research articles, it can be noticed that the sources and techniques of Big Data used in the health sector represent a relevant factor in terms of effectiveness, since they allow the application of predictive analysis techniques in tasks such as identification of patients at risk of readmission, prevention of hospital infections or chronic diseases, and obtaining predictive models of quality.

  10. Crime analysis using open source information

    DEFF Research Database (Denmark)

    Nizamani, Sarwat; Memon, Nasrullah; Shah, Azhar Ali

    2015-01-01

    In this paper, we present a method of crime analysis from open source information. We employed unsupervised data mining methods to explore the facts regarding the crimes of an area of interest. The analysis is based on well-known clustering and association techniques. The results show...

  11. Apportionment of urban aerosol sources in Cork (Ireland) by synergistic measurement techniques.

    Science.gov (United States)

    Dall'Osto, Manuel; Hellebust, Stig; Healy, Robert M; O'Connor, Ian P; Kourtchev, Ivan; Sodeau, John R; Ovadnevaite, Jurgita; Ceburnis, Darius; O'Dowd, Colin D; Wenger, John C

    2014-09-15

    The sources of ambient fine particulate matter (PM2.5) during wintertime at a background urban location in Cork city (Ireland) have been determined. Aerosol chemical analyses were performed by multiple techniques including on-line high resolution aerosol time-of-flight mass spectrometry (Aerodyne HR-ToF-AMS), on-line single particle aerosol time-of-flight mass spectrometry (TSI ATOFMS), on-line elemental carbon-organic carbon analysis (Sunset_EC-OC), and off-line gas chromatography/mass spectrometry and ion chromatography analysis of filter samples collected at 6-h resolution. Positive matrix factorization (PMF) has been carried out to better elucidate aerosol sources not clearly identified when analyzing results from individual aerosol techniques on their own. Two datasets have been considered: on-line measurements averaged over 2-h periods, and both on-line and off-line measurements averaged over 6-h periods. Five aerosol sources were identified by PMF in both datasets, with excellent agreement between the two solutions: (1) regional domestic solid fuel burning--"DSF_Regional," 24-27%; (2) local urban domestic solid fuel burning--"DSF_Urban," 22-23%; (3) road vehicle emissions--"Traffic," 15-20%; (4) secondary aerosols from regional anthropogenic sources--"SA_Regional" 9-13%; and (5) secondary aged/processed aerosols related to urban anthropogenic sources--"SA_Urban," 21-26%. The results indicate that, despite regulations for restricting the use of smoky fuels, solid fuel burning is the major source (46-50%) of PM2.5 in wintertime in Cork, and also likely other areas of Ireland. Whilst wood combustion is strongly associated with OC and EC, it was found that peat and coal combustion is linked mainly with OC and the aerosol from these latter sources appears to be more volatile than that produced by wood combustion. Ship emissions from the nearby port were found to be mixed with the SA_Regional factor. The PMF analysis allowed us to link the AMS cooking organic
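
    PMF factorizes the samples-by-species concentration matrix X into non-negative contributions G and profiles F, X ≈ GF. True PMF weights each residual by its measurement uncertainty; plain non-negative matrix factorization does not, so the scikit-learn sketch below (on synthetic data) is only an unweighted stand-in for the method used in the paper.

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(4)

    # synthetic dataset: 200 samples x 8 chemical species from 3 hidden sources
    F_true = rng.uniform(0, 1, (3, 8))        # source profiles
    G_true = rng.uniform(0, 5, (200, 3))      # time-varying contributions
    X = G_true @ F_true + rng.uniform(0, 0.05, (200, 8))

    model = NMF(n_components=3, init="nndsvda", max_iter=1000, random_state=0)
    G = model.fit_transform(X)                # factor contributions per sample
    F = model.components_                     # factor profiles ("fingerprints")
    print("mean contribution of each factor:", np.round(G.mean(axis=0), 2))
    ```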

  12. Review of Sealed Source Designs and Manufacturing Techniques Affecting Disused Source Management

    International Nuclear Information System (INIS)

    2012-10-01

    This publication presents an investigation of the influence of the design and technical features of sealed radioactive sources (SRSs) on predisposal and disposal activities when the sources become disused. The publication also addresses whether design modifications could contribute to safer and/or more efficient management of disused sources without compromising the benefits provided by the use of the sealed sources. This technical publication aims to collect information on the most typical design features and manufacturing techniques of sealed radioactive sources and examines how they affect the safe management of disused sealed radioactive sources (DSRSs). It also aims to assist source designers and manufacturers by discussing design features that are important from the waste management point of view. It has been identified that most SRS manufacturers use similar geometries and materials for their designs and apply improved and reliable manufacturing techniques, e.g. double-encapsulation. These designs and manufacturing techniques have been proven over time to reduce contamination levels in fabrication and handling, and to improve source integrity and longevity. The current source designs and materials ensure as well as possible that SRSs will maintain their integrity in use and when they become disused. No significant improvement options to current designs have been identified. However, some design considerations were identified as important to facilitate source retrieval, to increase the possibility of re-use and to ensure minimal contamination risk and radioactive waste generation at recycling. It was also concluded that legible identifying markings on a source are critical for DSRS management. The publication emphasizes the need for a common understanding of a radioactive source's recommended working life (RWL) among manufacturers and regulators. The conditions of use (COU) are important for the determination of the RWL. A formal system for specification

  13. VLBI FOR GRAVITY PROBE B. IV. A NEW ASTROMETRIC ANALYSIS TECHNIQUE AND A COMPARISON WITH RESULTS FROM OTHER TECHNIQUES

    International Nuclear Information System (INIS)

    Lebach, D. E.; Ratner, M. I.; Shapiro, I. I.; Bartel, N.; Bietenholz, M. F.; Lederman, J. I.; Ransom, R. R.; Campbell, R. M.; Gordon, D.; Lestrade, J.-F.

    2012-01-01

    When very long baseline interferometry (VLBI) observations are used to determine the position or motion of a radio source relative to reference sources nearby on the sky, the astrometric information is usually obtained via (1) phase-referenced maps or (2) parametric model fits to measured fringe phases or multiband delays. In this paper, we describe a 'merged' analysis technique which combines some of the most important advantages of these other two approaches. In particular, our merged technique combines the superior model-correction capabilities of parametric model fits with the ability of phase-referenced maps to yield astrometric measurements of sources that are too weak to be used in parametric model fits. We compare the results from this merged technique with the results from phase-referenced maps and from parametric model fits in the analysis of astrometric VLBI observations of the radio-bright star IM Pegasi (HR 8703) and the radio source B2252+172 nearby on the sky. In these studies we use central-core components of radio sources 3C 454.3 and B2250+194 as our positional references. We obtain astrometric results for IM Peg with our merged technique even when the source is too weak to be used in parametric model fits, and we find that our merged technique yields astrometric results superior to the phase-referenced mapping technique. We used our merged technique to estimate the proper motion and other astrometric parameters of IM Peg in support of the NASA/Stanford Gravity Probe B mission.

  14. Application of energy dispersive x-ray techniques for water analysis

    International Nuclear Information System (INIS)

    Funtua, I. I.

    2000-07-01

    Energy dispersive x-ray fluorescence (EDXRF) is a class of emission spectroscopic techniques that depends upon the emission of characteristic x-rays following excitation of the atomic electron energy levels by tube or isotopic source x-rays. The technique has found a wide range of applications that include determination of chemical elements in water and water pollutants. Three EDXRF systems, the isotopic source, secondary target and total reflection (TXRF), are available at the Centre for Energy Research and Training. These systems have been applied to the analysis of sediments, suspensions, ground water, river and rainwater. The isotopic source system is based on 55Fe, 109Cd and 241Am excitations, while the secondary target and total reflection systems utilize a Mo x-ray tube. Sample preparation requirements for water analysis range from physical and chemical pre-concentration steps to direct analysis, and elements from Al to U can be determined with these systems. The EDXRF techniques, TXRF in particular with its multielement capability, low detection limit and possibility of direct analysis of water, have a competitive edge over the traditional methods of atomic absorption and flame photometry.

  15. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also given. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified according to the instrumental detection technique used, with the aim of facilitating their study and providing an overall view. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrices, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soil leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soil extracts and cyanobacterial biofilms, is tabulated.

  16. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize an RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method

  17. Laser sources and techniques for spectroscopy and dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Kung, A.H. [Lawrence Berkeley Laboratory, CA (United States)

    1993-12-01

    This program focuses on the development of novel laser and spectroscopic techniques in the IR, UV, and VUV regions for studying combustion-related molecular dynamics at the microscopic level. Laser spectroscopic techniques have proven to be extremely powerful in the investigation of molecular processes which require very high sensitivity and selectivity. The authors' approach is to use quantum electronic and non-linear optical techniques to extend the spectral coverage and to enhance the optical power of ultrahigh resolution laser sources, so as to obtain and analyze photoionization, fluorescence, and photoelectron spectra of jet-cooled free radicals and of reaction products resulting from unimolecular and bimolecular dissociations. New spectroscopic techniques are developed with these sources for the detection of optically thin and often short-lived species. Recent activities center on regenerative amplification of high resolution solid-state lasers, development of tunable high power mid-IR lasers and short-pulse UV/VUV tunable lasers, and development of a multipurpose high-order suppressor crossed molecular beam apparatus for use with synchrotron radiation sources. This program also provides scientific and technical support within the Chemical Sciences Division to the development of LBL's Combustion Dynamics Initiative.

  18. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Shukri Mohd

    2013-01-01

    Full-text: Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed Wavelet Transform analysis and Modal Location (WTML), based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and delta-T location. The results of the study show that the WTML method produces more accurate location results compared with TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the delta-T location method but requires no initial acoustic calibration of the structure. (author)
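
    For reference, the time-of-arrival baseline the authors compare against has a compact form: with a known group velocity and measured arrival-time differences at several sensors, the source position follows from a nonlinear least-squares fit. The sketch below illustrates that TOA baseline on an invented planar geometry; it is not the WTML algorithm itself.

    ```python
    import numpy as np
    from scipy.optimize import least_squares

    c = 3000.0                                   # assumed group velocity, m/s
    sensors = np.array([[0.0, 0.0], [1.2, 0.0], [1.2, 0.6], [0.0, 0.6]])  # m
    src_true = np.array([0.8, 0.25])             # hypothetical source position

    # synthetic arrival-time differences relative to sensor 0, with jitter
    t_abs = np.linalg.norm(sensors - src_true, axis=1) / c
    dt_meas = (t_abs - t_abs[0]) + 1e-7 * np.random.default_rng(5).standard_normal(4)

    def residual(xy):
        t = np.linalg.norm(sensors - xy, axis=1) / c
        return (t - t[0]) - dt_meas

    sol = least_squares(residual, x0=np.array([0.6, 0.3]))
    print("estimated source position (m):", np.round(sol.x, 3))
    ```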

  19. Time domain localization technique with sparsity constraint for imaging acoustic sources

    Science.gov (United States)

    Padois, Thomas; Doutres, Olivier; Sgard, Franck; Berry, Alain

    2017-09-01

    This paper addresses a source localization technique in the time domain for broadband acoustic sources. The objective is to accurately and quickly detect the position and amplitude of noise sources in workplaces in order to propose adequate noise control options and prevent workers' hearing loss or safety risks. First, the generalized cross-correlation associated with a spherical microphone array is used to generate an initial noise source map. Then a linear inverse problem is defined to improve this initial map. Commonly, the linear inverse problem is solved with an l2-regularization. In this study, two sparsity constraints are used to solve the inverse problem: orthogonal matching pursuit and the truncated Newton interior-point method. Synthetic data are used to highlight the performance of the technique. High resolution imaging is achieved for various acoustic source configurations. Moreover, the amplitudes of the acoustic sources are correctly estimated. A comparison of computation times shows that the technique is compatible with quasi real-time generation of noise source maps. Finally, the technique is tested with real data.
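
    Of the two sparsity solvers named, orthogonal matching pursuit is the simpler to demonstrate: the linear inverse problem p = Dq is solved greedily for a few active grid points. The sketch below applies scikit-learn's OMP to a random dictionary; the dimensions and source positions are invented.

    ```python
    import numpy as np
    from sklearn.linear_model import OrthogonalMatchingPursuit

    rng = np.random.default_rng(6)

    # dictionary: each column is the array response of one candidate grid point
    n_obs, n_grid = 128, 400
    D = rng.standard_normal((n_obs, n_grid))
    D /= np.linalg.norm(D, axis=0)

    q_true = np.zeros(n_grid)
    q_true[[37, 251]] = [1.0, 0.6]               # two active acoustic sources
    p = D @ q_true + 0.01 * rng.standard_normal(n_obs)

    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=2).fit(D, p)
    idx = np.flatnonzero(omp.coef_)
    print("recovered grid indices:", idx, "amplitudes:", np.round(omp.coef_[idx], 2))
    ```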

  1. Absolute calibration technique for spontaneous fission sources

    International Nuclear Information System (INIS)

    Zucker, M.S.; Karpf, E.

    1984-01-01

    An absolute calibration technique for a spontaneously fissioning nuclide (which involves no arbitrary parameters) allows unique determination of the detector efficiency for that nuclide, hence of the fission source strength

  2. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis including texture, phase identification and strain measurements. X-ray micro-diffraction primarily is used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, therefore allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using EBSD and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  3. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for detecting most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were obtained for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  4. Application of source biasing technique for energy efficient DECODER circuit design: memory array application

    Science.gov (United States)

    Gupta, Neha; Parihar, Priyanka; Neema, Vaibhav

    2018-04-01

    Researchers have proposed many circuit techniques to reduce leakage power dissipation in memory cells. To reduce the overall power of a memory system, however, the input circuitry of the memory architecture, i.e. the row and column decoders, must also be addressed. In this research work, low-leakage, high-speed row and column decoders for memory array applications are designed and four new techniques are proposed. Cluster DECODER, body-bias DECODER, source-bias DECODER, and source-coupling DECODER designs are analyzed and compared for memory array application. Simulation is performed for the comparative analysis of the different DECODER design parameters at the 180 nm GPDK technology node using the CADENCE tool. Simulation results show that the proposed source-bias DECODER circuit technique decreases leakage current by 99.92% and static energy by 99.92% at a supply voltage of 1.2 V. The proposed circuit also improves dynamic power dissipation by 5.69%, dynamic PDP/EDP by 65.03%, and delay by 57.25% at 1.2 V supply voltage.

  5. Efficacy of Blood Sources and Artificial Blood Feeding Methods in Rearing of Aedes aegypti (Diptera: Culicidae) for Sterile Insect Technique and Incompatible Insect Technique Approaches in Sri Lanka

    Directory of Open Access Journals (Sweden)

    Nayana Gunathilaka

    2017-01-01

    Full Text Available Introduction. Selection of the artificial membrane feeding technique and blood meal source has been recognized as a key consideration in mass rearing of vectors. Methodology. Artificial membrane feeding techniques, namely, glass plate, metal plate, and Hemotek membrane feeding methods, and three blood sources (human, cattle, and chicken) were evaluated based on feeding rates, fecundity, and hatching rates of Aedes aegypti. Significance in the variations among blood feeding was investigated by one-way ANOVA, cluster analysis of variance (ANOSIM), and principal coordinates (PCO) analysis. Results. Feeding rates of Ae. aegypti significantly differed among the membrane feeding techniques as suggested by one-way ANOVA (p < 0.05), while no significant difference was observed between mosquitoes fed with cattle and human blood in terms of fecundity, oviposition rate, and fertility (p > 0.05). Conclusions. The metal plate method could be recommended as the most effective membrane feeding technique for mass rearing of Ae. aegypti, due to its high feeding rate and cost effectiveness. Cattle blood could be recommended for mass rearing Ae. aegypti.
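
    The hypothesis test named here is a standard one-way ANOVA across feeding techniques; a generic version with made-up feeding-rate replicates is sketched below.

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    # hypothetical feeding rates (%) per replicate for each membrane method
    glass_plate = np.array([62.1, 58.4, 60.3, 57.9, 61.5])
    metal_plate = np.array([78.2, 80.5, 76.9, 79.3, 81.0])
    hemotek     = np.array([65.4, 63.8, 67.1, 64.9, 66.2])

    f_stat, p_value = f_oneway(glass_plate, metal_plate, hemotek)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")   # p < 0.05: rates differ
    ```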

  6. A New Technique to Identify Arbitrarily Shaped Noise Sources

    Directory of Open Access Journals (Sweden)

    Roberto A. Tenenbaum

    2006-01-01

    Full Text Available Acoustic intensity is one of the available tools for evaluating sound radiation from vibrating bodies. Active intensity may, in some situations, not give faithful insight into how much energy is in fact carried into the far field. A new parameter was therefore proposed, the supersonic acoustic intensity, which takes into account only the intensity generated by components having wavenumbers smaller than the acoustic wavenumber. However, that method is only effective for simple sources, such as plane plates, cylinders and spheres. This work presents a new technique, based on the Boundary Element Method and the Singular Value Decomposition, to compute the supersonic acoustic intensity for arbitrarily shaped sources. The technique is based on the Kirchhoff-Helmholtz equation in a discretized approach, leading to a radiation operator that relates the normal velocity on the source's surface mesh to the pressure at grid points located in the field. The singular value decomposition is then applied to the radiation operator, and a cutoff criterion is used to remove non-propagating components. Some numerical examples are presented.
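
    The central numerical step described, an SVD of the discretized radiation operator followed by a cutoff that discards weakly radiating components, can be sketched independently of any BEM mesh. The operator below is random stand-in data and the cutoff threshold is arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # stand-in radiation operator: field pressures = R @ surface normal velocities
    R = rng.standard_normal((80, 120)) + 1j * rng.standard_normal((80, 120))
    v = rng.standard_normal(120) + 1j * rng.standard_normal(120)

    U, s, Vh = np.linalg.svd(R, full_matrices=False)
    keep = s >= 0.1 * s[0]                 # cutoff criterion (arbitrary threshold)

    # project the surface velocity onto the retained (efficiently radiating) subspace
    v_radiating = Vh[keep].conj().T @ (Vh[keep] @ v)
    p_supersonic = R @ v_radiating         # field pressure carried by that part
    print(f"retained {keep.sum()} of {s.size} singular components")
    ```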

  7. Radioactive source calibration technique for the CMS hadron calorimeter

    Energy Technology Data Exchange (ETDEWEB)

    Hazen, E.; Lawlor, C.; Rohlf, J.W. E-mail: rohlf@bu.edu; Wu, S.X.; Baumbaugh, A.; Elias, J.E.; Freeman, J.; Green, D.; Lazic, D.; Los, S.; Ronzhin, A.; Sergueev, S.; Shaw, T.; Vidal, R.; Whitmore, J.; Zimmerman, T.; Adams, M.; Burchesky, K.; Qian, W.; Baden, A.; Bard, R.; Breden, H.; Grassi, T.; Skuja, A.; Fisher, W.; Mans, J.; Tully, C.; Barnes, V.; Laasanen, A.; Barbaro, P. de; Budd, H

    2003-10-01

    Relative calibration of the scintillator tiles used in the hadronic calorimeter for the Compact Muon Solenoid detector at the CERN Large Hadron Collider is established and maintained using a radioactive source technique. A movable source can be positioned remotely to illuminate each scintillator tile individually, and the resulting photo-detector current is measured to provide the relative calibration. The unique measurement technique described here makes use of the normal high-speed data acquisition system required for signal digitization at the 40 MHz collider frequency. The data paths for collider measurements and source measurements are then identical, and systematic uncertainties associated with having different signal paths are avoided. In this high-speed mode, the source signal is observed as a Poisson photo-electron distribution with a mean that is smaller than the width of the electronics noise (pedestal) distribution. We report demonstration of the technique using prototype electronics for the complete readout chain and show the typical response observed with a 144 channel test beam system. The electronics noise has a root-mean-square of 1.6 least counts, and a 1 mCi source produces a shift of the mean value of 0.1 least counts. Because of the speed of the data acquisition system, this shift can be measured to a statistical precision better than a fraction of a percent on a millisecond time scale. The result is reproducible to better than 2% over a time scale of 1 month.

  8. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Mohd, Shukri [Nondestructive Testing Group, Industrial Technology Division, Malaysian Nuclear Agency, 43000, Bangi, Selangor (Malaysia); Holford, Karen M.; Pullin, Rhys [Cardiff School of Engineering, Cardiff University, Queen's Buildings, The Parade, CARDIFF CF24 3AA (United Kingdom)

    2014-02-12

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and delta-T location. The results of the study show that the WTML method produces more accurate location results compared with TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the delta-T location method but requires no initial acoustic calibration of the structure.

  9. Continuous wavelet transform analysis and modal location analysis acoustic emission source location for nuclear piping crack growth monitoring

    International Nuclear Information System (INIS)

    Mohd, Shukri; Holford, Karen M.; Pullin, Rhys

    2014-01-01

    Source location is an important feature of acoustic emission (AE) damage monitoring in nuclear piping. The ability to accurately locate sources can assist in source characterisation and early warning of failure. This paper describes the development of a novel AE source location technique termed 'Wavelet Transform analysis and Modal Location (WTML)', based on Lamb wave theory and time-frequency analysis, that can be used for global monitoring of plate-like steel structures. Source location was performed on a steel pipe 1500 mm long and 220 mm in outer diameter with a nominal thickness of 5 mm under a planar location test setup using H-N sources. The accuracy of the new technique was compared with other AE source location methods such as the time of arrival (TOA) technique and delta-T location. The results of the study show that the WTML method produces more accurate location results compared with TOA and triple point filtering location methods. The accuracy of the WTML approach is comparable with the delta-T location method but requires no initial acoustic calibration of the structure

  10. Current trends in the application of IBA techniques to air pollution source fingerprinting and source apportionment

    International Nuclear Information System (INIS)

    Cohen, David; Stelcer, Ed.; Atanacio, Armand; Crawford, Jagoda

    2013-01-01

    Full text: IBA techniques have been used for many years to characterise fine particle air pollution. This is not new; the techniques are well established. Typically 2-3 MeV protons are used to bombard thin filter papers, and up to four simultaneous techniques such as PIXE, PIGE, RBS and ERDA are applied to obtain (μg/g) concentrations for elements from hydrogen to lead. Generally, low volume samplers are used to sample 20-30 m³ of air over a 24 hour period; this, together with IBA's sensitivity, means that concentrations down to 1 ng/m³ of air sampled can be readily achieved with only a few minutes of proton irradiation. With these short irradiation times and low detection limits for a broad range of elements in the periodic table, large numbers of samples can be obtained and analysed very quickly and easily. At ANSTO we have used IBA methods to acquire a database of over 50,000 filters from 85 different sites throughout Australia and Asia; each filter has been analysed for more than 21 different chemical species. Large databases extending over many years mean that modern statistical techniques like positive matrix factorisation (PMF) can be used to define well characterised source fingerprints and source contributions for a range of different fine particle air pollutants. In this paper we discuss these PMF techniques and show how they identify both natural sources, like sea spray and windblown soils, and anthropogenic sources, like automobiles, biomass burning, coal-fired power stations and industrial emissions. These data are particularly useful for governments, EPAs and managers of pollution for better understanding pollution sources and their relative contributions, and hence better managing air pollution. Current trends are to take these IBA and PMF techniques a step further and combine them with wind speed and back trajectory data to better pinpoint and identify emission sources. We show how this is now being applied on both a local

  11. Current trends in the application of IBA techniques to air pollution source fingerprinting and source apportionment

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, David; Stelcer, Ed.; Atanacio, Armand; Crawford, Jagoda [Australian Nuclear Science and Technology Organisation, Kirrawee DC (Australia)

    2013-07-01

    Full text: IBA techniques have been used for many years to characterise fine particle air pollution. This is not new; the techniques are well established. Typically 2-3 MeV protons are used to bombard thin filter papers, and up to four simultaneous techniques such as PIXE, PIGE, RBS and ERDA are applied to obtain (μg/g) concentrations for elements from hydrogen to lead. Generally, low volume samplers are used to sample 20-30 m³ of air over a 24 hour period; this, together with IBA's sensitivity, means that concentrations down to 1 ng/m³ of air sampled can be readily achieved with only a few minutes of proton irradiation. With these short irradiation times and low detection limits for a broad range of elements in the periodic table, large numbers of samples can be obtained and analysed very quickly and easily. At ANSTO we have used IBA methods to acquire a database of over 50,000 filters from 85 different sites throughout Australia and Asia; each filter has been analysed for more than 21 different chemical species. Large databases extending over many years mean that modern statistical techniques like positive matrix factorisation (PMF) can be used to define well characterised source fingerprints and source contributions for a range of different fine particle air pollutants. In this paper we discuss these PMF techniques and show how they identify both natural sources, like sea spray and windblown soils, and anthropogenic sources, like automobiles, biomass burning, coal-fired power stations and industrial emissions. These data are particularly useful for governments, EPAs and managers of pollution for better understanding pollution sources and their relative contributions, and hence better managing air pollution. Current trends are to take these IBA and PMF techniques a step further and combine them with wind speed and back trajectory data to better pinpoint and identify emission sources. We show how this is now being applied on both

  12. An application of time-frequency signal analysis technique to estimate the location of an impact source on a plate type structure

    International Nuclear Information System (INIS)

    Park, Jin Ho; Lee, Jeong Han; Choi, Young Chul; Kim, Chan Joong; Seong, Poong Hyun

    2005-01-01

    The suitability of time-frequency signal analysis techniques for estimating the location of an impact source on a plate-type structure has been reviewed. The STFT (Short Time Fourier Transform), WVD (Wigner-Ville distribution) and CWT (Continuous Wavelet Transform) methods are introduced, and the advantages and disadvantages of these methods are described using a simulated signal component. The essence of the proposed techniques is to separate the traveling waves in both the time and frequency domains using the dispersion characteristics of the structural waves. These time-frequency methods are expected to be more useful than conventional time-domain analyses for the impact localization problem on a plate-type structure. It has also been concluded that the smoothed WVD can give a more reliable means than the other methodologies for location estimation in a noisy environment
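
    All three representations serve the same purpose here: resolving when each frequency component of a dispersive wave arrives at a sensor. A minimal STFT version is sketched below on a simulated signal whose frequency decreases with time, mimicking dispersion; the sampling rate, sweep and chosen 100 kHz band are arbitrary. Arrival-time differences of one such component at two sensors would then feed the localization step.

    ```python
    import numpy as np
    from scipy.signal import chirp, stft

    fs = 1_000_000                         # 1 MHz sampling rate (assumed)
    t = np.arange(0, 2e-3, 1 / fs)
    # simulated dispersive arrival: 300 kHz components arrive first,
    # 50 kHz components last (illustrative, not a real Lamb-wave model)
    sig = chirp(t, f0=300e3, t1=2e-3, f1=50e3)

    f, tau, Z = stft(sig, fs=fs, nperseg=256, noverlap=224)
    band = np.argmin(np.abs(f - 100e3))    # follow the 100 kHz ridge
    arrival = tau[np.argmax(np.abs(Z[band]))]
    print(f"estimated arrival of the 100 kHz component: {arrival * 1e3:.3f} ms")
    ```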

  13. Review on solving the inverse problem in EEG source analysis

    Directory of Open Access Journals (Sweden)

    Fabri Simon G

    2008-11-01

    Full Text Available Abstract In this primer, we give a review of the inverse problem for EEG source localization. This is intended for researchers new to the field, to give insight into the state-of-the-art techniques used to find approximate solutions of the brain sources giving rise to a scalp potential recording. Furthermore, a review of the performance results of the different techniques is provided to compare these different inverse solutions. The authors also include the results of a Monte-Carlo analysis which they performed to compare four non-parametric algorithms and hence contribute to what is presently recorded in the literature. An extensive list of references to the work of other researchers is also provided. This paper starts off with a mathematical description of the inverse problem and proceeds to discuss the two main categories of methods which were developed to solve the EEG inverse problem, namely the non-parametric and parametric methods. The main difference between the two is whether a fixed number of dipoles is assumed a priori or not. Various techniques falling within these categories are described, including minimum norm estimates and their generalizations, LORETA, sLORETA, VARETA, S-MAP, ST-MAP, Backus-Gilbert, LAURA, Shrinking LORETA FOCUSS (SLF), SSLOFO and ALF for non-parametric methods, and beamforming techniques, BESA, subspace techniques such as MUSIC and methods derived from it, FINES, simulated annealing and computational intelligence algorithms for parametric methods. From a review of the performance of these techniques as documented in the literature, one could conclude that in most cases the LORETA solution gives satisfactory results. In situations involving clusters of dipoles, higher resolution algorithms such as MUSIC or FINES are however preferred. Imposing reliable biophysical and psychological constraints, as done by LAURA, has given superior results. The Monte-Carlo analysis performed, comparing WMN, LORETA, sLORETA and SLF
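
    The minimum norm estimate that anchors the non-parametric family has a closed form: given a lead field L and scalp data y, the source estimate is x = L'(LL' + λI)^(-1) y. A toy version with a random stand-in lead field follows; the dimensions and regularization level are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    n_electrodes, n_sources = 32, 500
    L = rng.standard_normal((n_electrodes, n_sources))   # stand-in lead field

    x_true = np.zeros(n_sources)
    x_true[120] = 1.0                                    # one active dipole amplitude
    y = L @ x_true + 0.05 * rng.standard_normal(n_electrodes)

    lam = 0.1 * np.trace(L @ L.T) / n_electrodes         # regularization level
    x_mne = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_electrodes), y)
    print("peak source index (typically 120 here):", int(np.argmax(np.abs(x_mne))))
    ```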

  14. Nondestructive analysis of the natural uranium mass through the measurement of delayed neutrons using the technique of pulsed neutron source

    International Nuclear Information System (INIS)

    Coelho, Paulo Rogerio Pinto

    1979-01-01

    This work presents results of non-destructive mass analysis of natural uranium by the pulsed source technique. Fissioning is produced by irradiating the test sample with pulses of 14 MeV neutrons, and the uranium mass is calculated on a relative scale from the measured emission of delayed neutrons. Individual measurements were normalised against the integral counts of a scintillation detector measuring the 14 MeV neutron intensity. Delayed neutrons were measured using a specially constructed slab detector operated in anti-synchronism with the fast pulsed source. The 14 MeV neutrons were produced via the T(d,n)4He reaction using a 400 kV Van de Graaff accelerator operated at 200 kV in the pulsed source mode. Three types of sample were analysed, namely: discs of metallic uranium, pellets of sintered uranium oxide, and plates of uranium-aluminium alloy sandwiched between aluminium. These plates simulated those of Material Testing Reactor fuel elements. Results of measurements were reproducible to within an overall error in the range 1.6 to 3.9%, the specific error depending on the shape, size and mass of the sample. (author)
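
    The relative-scale analysis described reduces to a linear calibration: delayed-neutron counts, normalized by the 14 MeV monitor counts, are fitted against standards of known mass and the line is inverted for unknowns. The sketch below uses invented count data.

    ```python
    import numpy as np

    # calibration standards: known natural-uranium masses (g) and delayed-neutron
    # counts normalized by the 14 MeV monitor counts (invented data)
    mass_std = np.array([1.0, 2.0, 4.0, 8.0])
    norm_counts_std = np.array([0.052, 0.101, 0.208, 0.409])

    slope, intercept = np.polyfit(mass_std, norm_counts_std, 1)

    # unknown sample: invert the calibration line
    norm_counts_unknown = 0.155
    mass_unknown = (norm_counts_unknown - intercept) / slope
    print(f"estimated uranium mass: {mass_unknown:.2f} g")
    ```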

  15. Elemental analysis of the ancient bronze coins by x-ray fluorescence technique using simultaneously radioisotope source and x-ray tube

    International Nuclear Information System (INIS)

    Nguyen The Quynh; Truong Thi An; Tran Duc Thiep; Nguyen Dinh Chien; Dao Tran Cao; Nguyen Quang Liem

    2004-01-01

    The results of elemental analysis of Vietnamese ancient bronze coins from the time of the Nguyen dynasty (19th century) are presented. The samples were provided by the Vietnam National Historical Museum, and the elemental analysis was performed on the home-made model EDS-XT-99-01 X-ray fluorescence spectrometer at the Institute of Materials Science, NCST of Vietnam. The samples were excited simultaneously by a radioisotope source and an X-ray tube. The analytical results show a similarity in the elemental composition of the coins issued by different kings of the Nguyen dynasty, but differences in the concentrations of the elements used. Another interesting point is that all the coins contain zinc (Zn), which clearly shows the influence of occidental metallurgical technology on the money-making technique in Vietnam during the 19th century. (author)

  16. Digital Control Techniques Based on Voltage Source Inverters in Renewable Energy Applications: A Review

    Directory of Open Access Journals (Sweden)

    Sohaib Tahir

    2018-02-01

    Full Text Available In the modern era, distributed generation is considered an alternative source for power generation. In particular, there is a pressing need to supply three-phase loads with smooth sinusoidal voltages of fixed frequency and amplitude. A common solution is the integration of power electronic converters for connecting distributed generation systems to stand-alone loads. Suitable control techniques in the power electronic converters, providing robust stability, fast response, optimal tracking ability and error eradication, are therefore indispensable. A comprehensive review based on the design, analysis and validation of the most suitable digital control techniques, and of the options available to researchers for improving power quality, is presented in this paper, with their pros and cons. Comparisons based on cost, schemes, performance, modulation techniques and coordinate systems are also presented. Finally, the paper describes the performance evaluation of the control schemes on a voltage source inverter (VSI) and proposes the different aspects to be considered when selecting a power electronics inverter topology, reference frames, filters, as well as a control strategy.

  17. Efficacy of Blood Sources and Artificial Blood Feeding Methods in Rearing of Aedes aegypti (Diptera: Culicidae) for Sterile Insect Technique and Incompatible Insect Technique Approaches in Sri Lanka.

    Science.gov (United States)

    Gunathilaka, Nayana; Ranathunge, Tharaka; Udayanga, Lahiru; Abeyewickreme, Wimaladharma

    2017-01-01

    Selection of the artificial membrane feeding technique and blood meal source has been recognized as a key consideration in mass rearing of vectors. Artificial membrane feeding techniques, namely, glass plate, metal plate, and Hemotek membrane feeding methods, and three blood sources (human, cattle, and chicken) were evaluated based on feeding rates, fecundity, and hatching rates of Aedes aegypti. Significance in the variations among blood feeding was investigated by one-way ANOVA, cluster analysis of variance (ANOSIM), and principal coordinates (PCO) analysis. Feeding rates of Ae. aegypti significantly differed among the membrane feeding techniques as suggested by one-way ANOVA (p < 0.05), with the highest rate observed for the metal plate feeding technique. Blood feeding rate of Ae. aegypti was higher with human blood, followed by cattle and chicken blood, respectively. However, no significant difference was observed between mosquitoes fed with cattle and human blood in terms of fecundity, oviposition rate, and fertility, as suggested by one-way ANOVA (p > 0.05). The metal plate method could be recommended as the most effective membrane feeding technique for mass rearing of Ae. aegypti, due to its high feeding rate and cost effectiveness. Cattle blood could be recommended for mass rearing Ae. aegypti.

  18. The application of IBA techniques to air pollution source fingerprinting and source apportionment

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D., E-mail: dcz@ansto.gov.au; Stelcer, E.; Atanacio, A.; Crawford, J.

    2014-01-01

    IBA techniques have been used to measure elemental concentrations of more than 20 different elements found in fine particle (PM2.5) air pollution. These data together with their errors and minimum detectable limits were used in Positive Matrix Factorisation (PMF) analyses to quantitatively determine source fingerprints and their contributions to the total measured fine mass. Wind speed and direction back trajectory data from the global HYSPLIT codes were then linked to these PMF fingerprints to quantitatively identify the location of the sources.

  19. Failure analysis of radioisotopic heat source capsules tested under multi-axial conditions

    International Nuclear Information System (INIS)

    Zielinski, R.E.; Stacy, E.; Burgan, C.E.

    In order to qualify small radioisotopic heat sources for a 25-yr design life, multi-axial mechanical tests were performed on the structural components of the heat source. The results of these tests indicated that failure predominantly occurred in the middle of the weld ramp-down zone. Examination of the failure zone by standard metallographic techniques failed to reveal the true cause of failure. A modified technique utilizing chemical etching, scanning electron microscopy, and energy-dispersive x-ray analysis was employed and dramatically revealed the true cause of failure: impurity concentration in the ramp-down zone. As a result of the initial investigation, weld parameters for the heat sources were altered. Example welds made with a pulsed-arc technique did not have this impurity buildup in the ramp-down zone

  20. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty
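
    For the uncertainty budget, the usual step is to combine independent relative uncertainties in quadrature; a toy sketch (the individual values below are illustrative, not the report's, though the dominant role of material composition follows the text):

```python
import math

relative_uncertainty = {
    "material composition": 0.30,   # dominant term, per the study
    "neutron flux model":   0.12,
    "cross sections":       0.10,
    "power history":        0.08,
}
total = math.sqrt(sum(u ** 2 for u in relative_uncertainty.values()))
print(f"combined relative uncertainty ~ +/-{total:.0%}")   # ~ +/-35%
```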

  1. Spatial assessment and source identification of heavy metals pollution in surface water using several chemometric techniques.

    Science.gov (United States)

    Ismail, Azimah; Toriman, Mohd Ekhwan; Juahir, Hafizan; Zain, Sharifuddin Md; Habir, Nur Liyana Abdul; Retnam, Ananthy; Kamaruddin, Mohd Khairul Amri; Umar, Roslan; Azid, Azman

    2016-05-15

    This study presents the determination of the spatial variation and source identification of heavy metal pollution in surface water along the Straits of Malacca using several chemometric techniques. Clustering and discrimination of heavy metal compounds in surface water into two groups (northern and southern regions) are observed according to concentration levels via the application of chemometric techniques. Principal component analysis (PCA) demonstrates that Cu and Cr dominate the source apportionment in the northern region with a total variance of 57.62% and are identified with mining and shipping activities, the major contamination contributors in the Straits. Land-based pollution originating from vehicular emission, with a total variance of 59.43%, is attributed to the high level of Pb concentration in the southern region. The results revealed that one state representing each cluster (northern and southern regions) is significant as the main location for investigating heavy metal concentration in the Straits of Malacca, which would save monitoring cost and time. Copyright © 2015 Elsevier Ltd. All rights reserved.
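
    A minimal sketch of the PCA step such chemometric studies rely on, with standardized concentrations and loadings inspected to see which metals dominate each component (the data and metal list here are hypothetical):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

metals = ["Cd", "Cr", "Cu", "Pb", "Zn"]
X = np.random.default_rng(1).lognormal(size=(120, len(metals)))  # stations x metals

Z = StandardScaler().fit_transform(X)        # each metal to zero mean, unit variance
pca = PCA(n_components=2).fit(Z)

print(pca.explained_variance_ratio_)         # variance explained by PC1, PC2
print(dict(zip(metals, pca.components_[0]))) # loadings: which metals dominate PC1
```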

  2. Modal parameter identification based on combining transmissibility functions and blind source separation techniques

    Science.gov (United States)

    Araújo, Iván Gómez; Sánchez, Jesús Antonio García; Andersen, Palle

    2018-05-01

    Transmissibility-based operational modal analysis is a recent and alternative approach used to identify the modal parameters of structures under operational conditions. This approach is advantageous compared with traditional operational modal analysis because it does not make any assumptions about the excitation spectrum (i.e., white noise with a flat spectrum). However, common methodologies do not include a procedure to extract closely spaced modes with low signal-to-noise ratios. This issue is relevant when considering that engineering structures generally have closely spaced modes and that their measured responses present high levels of noise. Therefore, to overcome these problems, a new combined method for modal parameter identification is proposed in this work. The proposed method combines blind source separation (BSS) techniques and transmissibility-based methods. Here, BSS techniques were used to recover source signals, and transmissibility-based methods were applied to estimate modal information from the recovered source signals. To achieve this combination, a new method to define a transmissibility function was proposed. The suggested transmissibility function is based on the relationship between the power spectral density (PSD) of mixed signals and the PSD of signals from a single source. The numerical responses of a truss structure with high levels of added noise and very closely spaced modes were processed using the proposed combined method to evaluate its ability to identify modal parameters in these conditions. Colored and white noise excitations were used for the numerical example. The proposed combined method was also used to evaluate the modal parameters of an experimental test on a structure containing closely spaced modes. The results showed that the proposed combined method is capable of identifying very closely spaced modes in the presence of noise and, thus, may be potentially applied to improve the identification of damping ratios.
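
    A hedged sketch of a PSD-based transmissibility estimate between two measured responses, in the spirit of the PSD-ratio definition described above; the paper's exact estimator may differ, and the signals here are synthetic:

```python
import numpy as np
from scipy.signal import csd, welch

fs = 1024.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(2)
x = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)              # response at DOF i
y = 0.8 * np.sin(2 * np.pi * 12 * t + 0.3) + 0.5 * rng.standard_normal(t.size)  # response at DOF j

f, Sxy = csd(x, y, fs=fs, nperseg=4096)     # cross-power spectral density
_, Syy = welch(y, fs=fs, nperseg=4096)      # auto-power spectral density
T = Sxy / Syy                               # H1-type transmissibility estimate
print(f"dominant peak near {f[np.argmax(np.abs(T))]:.1f} Hz")
```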

  3. COMPARISON OF RECURSIVE ESTIMATION TECHNIQUES FOR POSITION TRACKING RADIOACTIVE SOURCES

    International Nuclear Information System (INIS)

    Muske, K.; Howse, J.

    2000-01-01

    This paper compares the performance of recursive state estimation techniques for tracking the physical location of a radioactive source within a room based on radiation measurements obtained from a series of detectors at fixed locations. Specifically, the extended Kalman filter, algebraic observer, and nonlinear least squares techniques are investigated. The results of this study indicate that recursive least squares estimation significantly outperforms the other techniques due to the severe model nonlinearity
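
    As a sketch of one of the techniques named above, the snippet below runs an extended Kalman filter on a static source whose position is inferred from inverse-square detector intensities; the source strength, detector layout, and noise levels are all assumptions, not the paper's setup:

```python
import numpy as np

A = 1.0e4                                            # assumed source strength
detectors = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], dtype=float)

def h(p):                                            # predicted intensity at each detector
    r2 = ((detectors - p) ** 2).sum(axis=1)
    return A / r2

def H_jac(p):                                        # Jacobian of h with respect to (x, y)
    d = p - detectors
    r2 = (d ** 2).sum(axis=1)
    return -2.0 * A * d / r2[:, None] ** 2

x_est = np.array([4.0, 6.0])                         # initial position guess
P = np.eye(2) * 4.0                                  # initial covariance
Q = np.eye(2) * 0.01                                 # random-walk process noise
R = np.eye(4) * 16.0                                 # (conservative) measurement noise

true_p = np.array([3.0, 7.0])
rng = np.random.default_rng(3)
for _ in range(100):
    z = h(true_p) + rng.normal(0.0, 2.0, size=4)     # simulated noisy detector readings
    P = P + Q                                        # predict (static motion model)
    Hk = H_jac(x_est)                                # update
    K = P @ Hk.T @ np.linalg.inv(Hk @ P @ Hk.T + R)
    x_est = x_est + K @ (z - h(x_est))
    P = (np.eye(2) - K @ Hk) @ P

print("estimated position:", x_est.round(2))         # estimate of the true position (3, 7)
```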

  4. Rascal: A domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    P. Klint (Paul); T. van der Storm (Tijs); J.J. Vinju (Jurgen); A. Walenstein; S. Schuppe

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This

  5. RASCAL: a domain specific language for source code analysis and manipulation

    NARCIS (Netherlands)

    Klint, P.; Storm, van der T.; Vinju, J.J.

    2009-01-01

    Many automated software engineering tools require tight integration of techniques for source code analysis and manipulation. State-of-the-art tools exist for both, but the domains have remained notoriously separate because different computational paradigms fit each domain best. This impedance

  6. Determination of boron in water solution by an indirect neutron activation technique from a 241Am/Be source

    International Nuclear Information System (INIS)

    Sales, H.B.

    1981-08-01

    Boron content in water solutions has been analysed by an indirect activation technique using a twin 241Am/Be neutron source with a source strength of 9x10^6 n/s. The boron concentration was inferred from the measurement of the activity induced in a vanadium flux monitor. The vanadium rod was located inside the boron solution in a standard geometrical set-up with respect to the neutron source. Boron concentrations in the range of 100 to 1000 ppm were determined with an overall accuracy of about 2% within a total analysis time of about 20 minutes. Even though the analysis is not selective for boron, its rapid, simple and precise nature makes it well suited for the analysis of boron in the primary coolant circuit of PWR-type nuclear power plants. (Author) [pt

  7. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes, radio-immuno-analysis and auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and preparation of labelled molecules: gamma emitters (125I, 57Co); beta emitters; preparation of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritium); purification of the labelled compound (dialysis, gel-filtering or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  8. Sources of uncertainty in individual monitoring for photographic,TL and OSL dosimetry techniques

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, Max S.; Silva, Everton R.; Mauricio, Claudia L.P., E-mail: max.das.ferreira@gmail.com, E-mail: everton@ird.gov.br, E-mail: claudia@ird.gov.br [Instituto de Radioprotecao e Dosimetria (IRD/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    The identification of uncertainty sources and their quantification is essential to the quality of any dosimetric result. Even if uncertainties are not stated for the dose measurements reported monthly to the monitored radiation facilities, they need to be known. This study aims to analyze the influence of the different sources of uncertainty associated with photographic, TL and OSL dosimetric techniques, considering the evaluation of occupational doses from whole-body exposure to photons. To identify the sources of uncertainty, a bibliographic review was conducted of documents that deal with operational aspects of each technique and the uncertainties associated with them. In addition, technical visits to individual monitoring services were conducted to assist in this identification. The sources of uncertainty were categorized and their contributions expressed qualitatively. Calibration and traceability are the most important sources of uncertainty, regardless of the technique used. For photographic dosimetry, the remaining important uncertainty sources are energy and angular dependence, linearity of response, and variations in film processing. For TL and OSL, the key process for good performance is the reproducibility of the thermal and optical cycles, respectively. For all three techniques, every procedure of the measurement process must be standardized, controlled and reproducible. Further studies can be performed to quantify the contribution of each source of uncertainty. (author)

  9. Validation of the direct analysis in real time source for use in forensic drug screening.

    Science.gov (United States)

    Steiner, Robert R; Larson, Robyn L

    2009-05-01

    The Direct Analysis in Real Time (DART) ion source is a relatively new mass spectrometry technique that is seeing widespread use in chemical analyses world-wide. DART studies include such diverse topics as analysis of flavors and fragrances, melamine in contaminated dog food, differentiation of writing inks, characterization of solid counterfeit drugs, and as a detector for planar chromatography. Validation of this new technique for the rapid screening of forensic evidence for drugs of abuse, utilizing the DART source coupled to an accurate mass time-of-flight mass spectrometer, was conducted. The study consisted of the determination of the lower limit of detection for the method, determination of selectivity and a comparison of this technique to established analytical protocols. Examples of DART spectra are included. The results of this study have allowed the Virginia Department of Forensic Science to incorporate this new technique into their analysis scheme for the screening of solid dosage forms of drugs of abuse.

  10. STATE OF THE ART TECHNIQUES USED FOR NOISE SOURCE IDENTIFICATION ON COMPLEX BODIES

    Directory of Open Access Journals (Sweden)

    Corneliu STOICA

    2010-03-01

    Full Text Available Over the last few decades, many approaches have been undertaken to assess detailed noise source identification on complex bodies, e.g. aircraft, cars, machinery. Noise source identification implies accurately obtaining the position and frequency of the dominant noise sources. There are cases where traditional testing methods cannot be applied at all, or their use involves some limitations. Optical systems used for near-field analysis require a line of sight that may not be available. The state-of-the-art technology for this purpose is the use of a large number of microphones whose signals are acquired simultaneously, i.e. a microphone phased array. Due to the excessive cost of the instruments and the data acquisition system required, the implementation of this technology was initially restricted to governmental agencies (NASA, DLR) and big companies such as Boeing and Airbus. During the past years, this technique was developed in wind tunnels and some universities to perform noise source identification on scale airframes, main landing gear models, and aerodynamic profiles (used on airplanes, helicopter rotors and wind turbines).
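
    The core idea behind such arrays is delay-and-sum beamforming: shift each microphone signal by the propagation delay from a candidate focus point and look for the focus that maximizes the summed power. A self-contained sketch with a synthetic source (geometry, frequencies, and sample rate are assumptions):

```python
import numpy as np

c, fs = 343.0, 48_000                                # speed of sound (m/s), sample rate
mics = np.stack([np.linspace(-0.5, 0.5, 16), np.zeros(16)], axis=1)  # 16-mic line array

def steered_power(signals, focus):
    """Delay-and-sum: align mic signals on a focus point, return beam power."""
    delays = np.linalg.norm(mics - focus, axis=1) / c
    shifts = np.round((delays - delays.min()) * fs).astype(int)
    n = signals.shape[1] - shifts.max()
    summed = sum(signals[m, s:s + n] for m, s in enumerate(shifts))
    return np.mean(summed ** 2)

# Synthetic 2 kHz source at (0.2, 1.0) m
src = np.array([0.2, 1.0])
t = np.arange(0, 0.1, 1 / fs)
dists = np.linalg.norm(mics - src, axis=1)
signals = np.array([np.sin(2 * np.pi * 2000 * (t - d / c)) for d in dists])

xs = np.linspace(-0.5, 0.5, 41)                      # scan focus points at 1 m range
powers = [steered_power(signals, np.array([x, 1.0])) for x in xs]
print("estimated source x:", xs[int(np.argmax(powers))])   # ~0.2 m
```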

  11. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of literature, the present paper attempts to provide a concise and schematic portrayal of generally followed data analysis techniques in the field of services quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher order multivariate techniques viz. confirmatory factor analysis, structural equation modeling etc. to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as mean, t-Test, ANOVA and correlation. The marked shift in orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of reviewers of journals. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  12. Automation of neutral beam source conditioning with artificial intelligence techniques

    International Nuclear Information System (INIS)

    Johnson, R.R.; Canales, T.W.; Lager, D.L.

    1985-01-01

    This paper describes a system that automates neutral beam source conditioning. The system achieves this with artificial intelligence techniques. The architecture of the system is presented followed by a description of its performance

  13. Automation of neutral beam source conditioning with artificial intelligence techniques

    International Nuclear Information System (INIS)

    Johnson, R.R.; Canales, T.; Lager, D.

    1986-01-01

    This paper describes a system that automates neutral beam source conditioning. The system achieves this with artificial intelligence techniques. The architecture of the system is presented followed by a description of its performance

  14. Agronomic evaluation of guano sources by means of isotope techniques

    International Nuclear Information System (INIS)

    Zapata, F.; Arrillaga, J.L.

    2002-01-01

    Many soils of the tropics and subtropics under continuous cultivation are very infertile; thus poor yields are obtained and little crop residue remains to protect the soils from degrading erosion. External nutrient inputs in the form of chemical fertilizers, organic materials and other nutrient sources are essential for developing sustainable agricultural production systems. As chemical fertilizers are costly for developing countries with insufficient foreign currency for their purchase, and their supplies are limited and irregular for smallholders, alternative nutrient sources must be sought and evaluated for use in the dominant agricultural production systems. Locally available organic materials of different origin are potential sources of nutrients. One such source with high agronomic potential is guano. The present study was carried out to evaluate the agronomic effectiveness of two guano materials of different origin (Zaire and Peru) as sources of nitrogen and phosphorus, compared to chemical fertilizers (ammonium sulfate and triple superphosphate), using isotopic (15N and 32P) techniques. Using the classical method of comparing dry matter weight and P uptake, no significant differences among the tested guano sources were found. The use of the isotopic techniques allowed a quantitative assessment of the N and P supply to crops. Both guano materials were found to be good sources of N but, in contrast, poor sources of phosphorus. In addition, from the agronomic evaluation, it was found that the guano of Zaire and the ammonium sulfate were N sources of equivalent efficiency, and the guano of Peru was even slightly better than the ammonium sulfate. As expected, P in the single superphosphate was as available as P in the triple superphosphate. However, the substitution ratios for the guano sources were relatively high: 1 kg P as single superphosphate was equivalent to 9.5 kg P as guano from Zaire or 12.5 kg P as guano from Peru. Further field trials in
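
    The quantity underlying such 15N evaluations is the fraction of plant N derived from the labelled source; a sketch of the standard isotope dilution relation (notation is generic, not necessarily this study's):

```latex
\%N_{dff} \;=\; \frac{\text{atom \% }^{15}\mathrm{N}\text{ excess (plant)}}
                     {\text{atom \% }^{15}\mathrm{N}\text{ excess (labelled source)}}
            \times 100
```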

  15. Polarisation analysis of elastic neutron scattering using a filter spectrometer on a pulsed source

    International Nuclear Information System (INIS)

    Mayers, J.; Williams, W.G.

    1981-05-01

    The experimental and theoretical aspects of the polarisation analysis technique in elastic neutron scattering are described. An outline design is presented for a filter polarisation analysis spectrometer on the Rutherford Laboratory Spallation Neutron Source and estimates made of its expected count rates and resolution. (author)

  16. Thickness measurement for the different metals by using Cs-137 gamma source with gamma transmission technique

    International Nuclear Information System (INIS)

    Bueyuek, B.; Tugrul, B.

    2009-01-01

    The purpose of this study is an experimental analysis of thickness measurement for various metals with the gamma transmission technique, using Cs-137 as the radioisotope source. Lead, steel, brass, and aluminum, metals frequently used in industry, were chosen for the experiments. Cs-137 was preferred as the radioisotope source since it has a long half-life, is mono-energetic, and penetrates the metals studied. Experiments were performed in a constant experimental geometry. Calibration curves for the four metal samples were plotted from the obtained results. To test the calibration curves, counts were collected for each sample and the relative count values were used with the corresponding calibration curve to determine its thickness. The thicknesses of the samples were also measured with a micrometer and the results compared with those obtained by the gamma transmission technique. The analyses revealed that the thickness values obtained with the gamma transmission technique closely agree with those obtained by the conventional technique, with the difference between the two at an acceptable level. The reliability of thickness measurement with the gamma transmission technique and of the resulting calibration curves has therefore been demonstrated.
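
    The relation underlying such calibration curves is Beer-Lambert attenuation, I = I0·exp(-μt); a minimal sketch inverting it for thickness (the attenuation coefficient below, roughly that of steel at 662 keV, is an assumption):

```python
import math

mu_steel = 0.58      # assumed linear attenuation coefficient (1/cm) for steel at 662 keV
I0 = 12_000.0        # counts with no absorber (same geometry and counting time)
I = 7_300.0          # counts with the sample in the beam

t = math.log(I0 / I) / mu_steel
print(f"estimated thickness: {t:.2f} cm")   # ~0.86 cm
```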

  17. Scintillating confusion: Evaluation of a technique for measuring compact structure in weak radio sources

    International Nuclear Information System (INIS)

    Spangler, S.R.; Cordes, J.M.; Meyers, K.A.

    1979-01-01

    An attractive scheme for investigating compact structure in weak radio sources is to study the scintillation properties of confusion in a large single-dish radio telescope. We have investigated the utility of this technique by observing the scintillations of 860-MHz confusion in the NRAO 300' (91 m) telescope. Analysis of these data indicated a reduction in the mean scintillation index with decreasing flux density, which implied that weaker sources possessed less compact structure. More direct observations indicated that the weak sources of interest were not significantly deficient in compact structure, so the first result is probably due to properties of the interplanetary scintillation (IPS) process in the strong scintillation regime. Our results may be due to overresolution (by the IPS process in the strong scintillation regime) of the 'hot spots' responsible for scintillation in most strong sources at frequencies below 1000 MHz, or may indicate abnormally strong turbulence in the solar wind during August, 1977. Future applications of this method would be best conducted at lower frequencies with larger reflectors or short-spacing interferometers

  18. Advanced analysis technique for the evaluation of linear alternators and linear motors

    Science.gov (United States)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  19. The fission-track analysis: An alternative technique for provenance studies of prehistoric obsidian artefacts

    CERN Document Server

    Bellot-Gurlet, L; Dorighel, O; Oddone, M; Poupeau, G; Yegingil, Z

    1999-01-01

    Comparison of fission-track parameters - age and track densities - is an alternative tool for correlating obsidian artefacts with their potential natural sources. This method was applied by different fission-track groups in various regions and the results were compared with those obtained using the more popular provenance identification techniques based on chemical composition studies. Hundreds of analyses prove that fission-track dating is a complementary technique which turns out to be very useful, especially when the chemical composition does not fully discriminate different sources. Archaeologically significant results have been obtained applying fission-track analysis in various regions of the Earth.

  20. The fission-track analysis: An alternative technique for provenance studies of prehistoric obsidian artefacts

    International Nuclear Information System (INIS)

    Bellot-Gurlet, L.; Bigazzi, G.; Dorighel, O.; Oddone, M.; Poupeau, G.; Yegingil, Z.

    1999-01-01

    Comparison of fission-track parameters - age and track densities - is an alternative tool for correlating obsidian artefacts with their potential natural sources. This method was applied by different fission-track groups in various regions and the results were compared with those obtained using the more popular provenance identification techniques based on chemical composition studies. Hundreds of analyses prove that fission-track dating is a complementary technique which turns out to be very useful, especially when the chemical composition does not fully discriminate different sources. Archaeologically significant results have been obtained applying fission-track analysis in various regions of the Earth.

  1. Analysis of obsidians and silicon carbide films by the RBS technique

    International Nuclear Information System (INIS)

    Franco S, F.

    1998-01-01

    Motivated by archaeological interest, this work presents the characterization of obsidian samples from different mineral sites in Mexico and of silicon carbide films, undertaken by an ion beam analysis (IBA) technique: RBS (Rutherford Backscattering). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from the Mexico National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in Central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important elemental concentrations reported. In the first part of this work, the non-destructive IBA technique RBS is used to analyze obsidian samples. The last part is an analysis of thin films of silicon carbide, as part of a research program of the Universidad Nacional Autonoma de Mexico and ININ. The application of this technique was carried out at the IF-UNAM, and the analysis was performed at the laboratories of the ININ Nuclear Centre facilities. The samples considered in this work were mounted on a sample holder designed to expose each sample to the alpha-particle beam. The RBS analysis was carried out with an ET Tandem accelerator at the IF-UNAM. The spectrometry employed a Si(Li) detector set at 15 degrees with respect to the target normal. The mean projectile energy was 2.00 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza (Puebla), Guadalupe Victoria (Puebla) and Oyameles (Puebla). The mean values are accompanied by errors expressed as one standard deviation of the mean for each element

  2. Point source search techniques in ultra high energy gamma ray astronomy

    International Nuclear Information System (INIS)

    Alexandreas, D.E.; Biller, S.; Dion, G.M.; Lu, X.Q.; Yodh, G.B.; Berley, D.; Goodman, J.A.; Haines, T.J.; Hoffman, C.M.; Horch, E.; Sinnis, C.; Zhang, W.

    1993-01-01

    Searches for point astrophysical sources of ultra high energy (UHE) gamma rays are plagued by large numbers of background events from isotropic cosmic rays. Some of the methods that have been used to estimate the expected number of background events coming from the direction of a possible source are found to contain biases. Search techniques that avoid this problem are described. There is also a discussion of how to optimize the sensitivity of a search to emission from a point source. (orig.)
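
    One standard, bias-aware way to quantify a point-source excess over an isotropic background is the Li & Ma (1983) on/off significance; a sketch of the general formula (not necessarily the estimator of this paper):

```python
import math

def li_ma_significance(n_on, n_off, alpha):
    """alpha = on-source exposure / off-source exposure."""
    term_on = n_on * math.log((1 + alpha) / alpha * n_on / (n_on + n_off))
    term_off = n_off * math.log((1 + alpha) * n_off / (n_on + n_off))
    return math.sqrt(2.0 * (term_on + term_off))

print(li_ma_significance(n_on=130, n_off=1000, alpha=0.1))   # ~2.7 sigma
```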

  3. Analysis of two dimensional charged particle scintillation using video image processing techniques

    International Nuclear Information System (INIS)

    Sinha, A.; Bhave, B.D.; Singh, B.; Panchal, C.G.; Joshi, V.M.; Shyam, A.; Srinivasan, M.

    1993-01-01

    A novel method for video recording of individual charged-particle scintillation images and their offline analysis using digital image processing techniques for obtaining position, time and energy information is presented. Results of an exploratory experiment conducted using 241Am and 239Pu alpha sources are presented. (author). 3 figs., 4 tabs.

  4. Frequency spectrum analysis of 252Cf neutron source based on LabVIEW

    International Nuclear Information System (INIS)

    Mi Deling; Li Pengcheng

    2011-01-01

    The frequency spectrum analysis of a 252Cf neutron source is an extremely important method in nuclear stochastic signal processing. Focusing on the special '0' and '1' structure of the neutron pulse series, this paper proposes a fast-correlation algorithm to improve the computational rate of the spectrum analysis system. Multi-core processor technology and the multi-threaded programming techniques of LabVIEW are employed to construct a frequency spectrum analysis system for the 252Cf neutron source based on LabVIEW. It obtains not only the auto-correlation and cross-correlation results, but also the auto-power spectrum, cross-power spectrum and spectral density ratio. The results show that analysis tools based on LabVIEW improve the operating efficiency of the fast auto-correlation and cross-correlation code by about 25% to 35%, and verify the feasibility of using LabVIEW for spectrum analysis. (authors)

  5. Novel technique for coal pyrolysis and hydrogenation production analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1990-01-01

    The overall objective of this study is to establish vacuum ultraviolet photoionization-MS and VUV pulsed EI-MS as useful tools for a simpler and more accurate direct mass spectrometric measurement of a broad range of hydrocarbon compounds in complex mixtures for ultimate application to the study of the kinetics of coal hydrogenation and pyrolysis processes. The VUV-MS technique allows ionization of a broad range of species with minimal fragmentation. Many compounds of interest can be detected with the 118 nm wavelength, but additional compound selectivity is achievable by tuning the wavelength of the photo-ionization source in the VUV. Resonant four wave mixing techniques in Hg vapor will allow near continuous tuning from about 126 to 106 nm. This technique would facilitate the scientific investigation of coal upgrading processes such as pyrolysis and hydrogenation by allowing accurate direct analysis of both stable and intermediate reaction products.

  6. Source-space ICA for MEG source imaging.

    Science.gov (United States)

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography (EEG)/magnetoencephalography (MEG) source imaging is application of an inverse technique (such as dipole modelling or sLORETA) on the components extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer high spatial resolution. However, in order to have both the high spatial resolution of the beamformer and the ability to handle multiple concurrent sources, sensor-space ICA + beamformer is not an ideal combination. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA in both simulation and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG from two healthy subjects with visual stimuli was also used to compare the performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.

  7. Nanopositioning techniques development for synchrotron radiation instrumentation applications at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Shu Deming

    2010-01-01

    At modern synchrotron radiation sources and beamlines, high-precision positioning techniques present a significant opportunity to support state-of-the-art synchrotron radiation research. Meanwhile, the required instrument positioning performance and capabilities, such as resolution, dynamic range, repeatability, speed, and multiple axes synchronization are exceeding the limit of commercial availability. This paper presents the current nanopositioning techniques developed for the Argonne Center for Nanoscale Materials (CNM)/Advanced Photon Source (APS) hard x-ray nanoprobe and high-resolution x-ray monochromators and analyzers for the APS X-ray Operations and Research (XOR) beamlines. Future nanopositioning techniques to be developed for the APS renewal project will also be discussed.

  8. Transport-level description of the 252Cf-source method using the Langevin technique

    International Nuclear Information System (INIS)

    Stolle, A.M.; Akcasu, A.Z.

    1991-01-01

    The fluctuations in the neutron number density and detector outputs in a nuclear reactor can be analyzed conveniently by using the Langevin equation approach. This approach can be implemented at any level of approximation to describe the time evolution of the neutron population, from the most complete transport-level description to the very basic point reactor analysis of neutron number density fluctuations. In this summary, the complete space- and velocity-dependent transport-level formulation of the Langevin equation approach is applied to the analysis of the 252 Cf-source-driven noise analysis (CSDNA) method, an experimental technique developed by J.T. Mihalczo at Oak Ridge National Laboratory, which makes use of noise analysis to determine the reactivity of subcritical media. From this analysis, a theoretical expression for the subcritical multiplication factor is obtained that can then be used to interpret the experimental data. Results at the transport level are in complete agreement with an independent derivation performed by Sutton and Doub, who used the probability density method to interpret the CSDNA experiment, but differed from other expressions that have appeared in the literature

  9. Comparative Study of Modulation Techniques for Two-Level Voltage Source Inverters

    Directory of Open Access Journals (Sweden)

    Barry W. Williams

    2016-06-01

    Full Text Available A detailed comparative study of modulation techniques for single and three phase dc-ac inverters is presented.  Sinusoidal Pulse Width Modulation, Triplen Sinusoidal Pulse Width Modulation, Space Vector Modulation, Selective Harmonic Elimination and Wavelet Modulation are assessed and compared in terms of maximum fundamental output, harmonic performance, switching losses and operational mode.  The presented modulation techniques are applied to single and three phase voltage source inverters and are simulated using SIMULINK.  The simulation results clarify the inverter performance achieved using the different modulations techniques.
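
    A minimal sketch of the first of the compared schemes, sinusoidal PWM: the gating signal comes from comparing a sine reference with a triangular carrier, and in the linear region the fundamental of the two-level output equals the modulation index (all frequencies and the index are assumptions):

```python
import numpy as np

fs = 1_000_000                    # simulation sample rate, Hz
f_ref, f_car, m_a = 50, 2_000, 0.8

t = np.arange(0, 0.04, 1 / fs)                            # two fundamental cycles
reference = m_a * np.sin(2 * np.pi * f_ref * t)
carrier = 4 * np.abs((t * f_car) % 1 - 0.5) - 1           # triangle wave in [-1, 1]
gate = np.where(reference >= carrier, 1.0, -1.0)          # ideal two-level leg voltage

spectrum = np.abs(np.fft.rfft(gate)) / (gate.size / 2)
freqs = np.fft.rfftfreq(gate.size, 1 / fs)
fund = spectrum[np.argmin(np.abs(freqs - f_ref))]
print(f"fundamental amplitude ~ {fund:.2f} (expected ~ m_a = {m_a})")
```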

  10. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  11. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, high cost, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  12. Analysis of hairy root culture of Rauvolfia serpentina using direct analysis in real time mass spectrometric technique.

    Science.gov (United States)

    Madhusudanan, K P; Banerjee, Suchitra; Khanuja, Suman P S; Chattopadhyay, Sunil K

    2008-06-01

    The applicability of a new mass spectrometric technique, DART (direct analysis in real time) has been studied in the analysis of the hairy root culture of Rauvolfia serpentina. The intact hairy roots were analyzed by holding them in the gap between the DART source and the mass spectrometer for measurements. Two nitrogen-containing compounds, vomilenine and reserpine, were characterized from the analysis of the hairy roots almost instantaneously. The confirmation of the structures of the identified compounds was made through their accurate molecular formula determinations. This is the first report of the application of DART technique for the characterization of compounds that are expressed in the hairy root cultures of Rauvolfia serpentina. Moreover, this also constitutes the first report of expression of reserpine in the hairy root culture of Rauvolfia serpentina. Copyright (c) 2008 John Wiley & Sons, Ltd.

  13. Obsidian sourcing by PIXE analysis at AURA2

    International Nuclear Information System (INIS)

    Neve, S.R.; Barker, P.H.; Holroyd, S.; Sheppard, P.J.

    1994-01-01

    The technique of Proton Induced X-ray Emission is a suitable method for the elemental analysis of obsidian samples and artefacts. By comparing the elemental composition of obsidian artefacts with those of known sources of obsidian and identifying similarities, the likely origin of the sample can be discovered and information about resource procurement gained. A PIXE facility has now been established at the Auckland University Research Accelerator Laboratory, AURA2. It offers a rapid, multi-element, non-destructive method of characterisation of obsidian samples ranging from small chips to large pieces. In an extensive survey of Mayor Island obsidian, a discrimination has been made between the different locations of obsidian deposits on the island. In addition, using the database developed at AURA2, artefacts from the site of Opita, Hauraki Plains, have been sourced. (Author). 18 refs., 8 figs., 7 tabs., 1 appendix

  14. Characterization of sealed radioactive sources. Uncertainty analysis to improve detection methods

    International Nuclear Information System (INIS)

    Cummings, D.G.; Sommers, J.D.; Adamic, M.L.; Jimenez, M.; Giglio, J.J.; Carney, K.P.

    2009-01-01

    A radioactive 137Cs source has been analyzed for the radioactive parent 137Cs and its stable decay daughter 137Ba. The ratio of daughter to parent atoms is used to estimate the date when the Cs was purified prior to source encapsulation (an 'age' since purification). The isotopes were analyzed by inductively coupled plasma mass spectrometry (ICP-MS) after chemical separation. In addition, Ba was analyzed by isotope dilution ICP-MS (ID-ICP-MS). A detailed error analysis of the mass spectrometric work has been undertaken to identify areas of improvement and to quantify the effect the errors have on the determined 'age'. This paper reports an uncertainty analysis identifying areas of improvement and alternative techniques that may reduce the uncertainties. In particular, work on isotope dilution using ICP-MS for the 'age' determination of sealed sources is presented. The results are compared to the original work done using external standards to calibrate the ICP-MS instrument. (author)
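
    The age relation implied here is simple: for an initially pure 137Cs source, the stable-daughter/parent atom ratio grows as R = exp(λt) - 1, so t = ln(1 + R)/λ. A sketch with a hypothetical measured ratio:

```python
import math

half_life_y = 30.1                      # 137Cs half-life in years (approximate)
lam = math.log(2.0) / half_life_y

R = 0.62                                # hypothetical measured 137Ba/137Cs atom ratio
age_years = math.log(1.0 + R) / lam
print(f"time since purification ~ {age_years:.1f} y")   # ~21 y
```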

  15. Investigations of Orchestra Auralizations Using the Multi-Channel Multi-Source Auralization Technique

    DEFF Research Database (Denmark)

    Vigeant, Michelle; Wang, Lily M.; Rindel, Jens Holger

    2008-01-01

    Room acoustics computer modeling is a tool for generating impulse responses and auralizations from modeled spaces. The auralizations are commonly made from a single-channel anechoic recording of solo instruments. For this investigation, auralizations of an entire orchestra were created using a multi-channel multi-source auralization technique, involving individual five-channel anechoic recordings of each instrumental part of two symphonies. In the first study, these auralizations were subjectively compared to orchestra auralizations made using (a) a single omni-directional source, (b) a surface source, and (c) a single-channel multi-source method. Results show that the multi-source auralizations were rated to be more realistic than the surface source ones and to have larger source width than the single omni-directional source auralizations. No significant differences were found between...

  16. Application of PSA techniques to synchrotron radiation source facilities

    International Nuclear Information System (INIS)

    Sanyasi Rao, V.V.S.; Vinod, G.; Vaze, K.K.; Sarkar, P.K.

    2011-01-01

    Synchrotron radiation sources are increasingly being used in research and medical applications. Various instances of overexposure in these facilities have been reported in the literature. These instances have led to investigation of the associated risks, with a view to minimising them and thereby increasing the level of safety. In the nuclear industry, Probabilistic Safety Assessment (PSA) methods are widely used to assess the risk from nuclear power plants. PSA presents a systematic methodology to evaluate the likelihood of various accident scenarios and their possible consequences using fault/event tree techniques. It is proposed to extend a similar approach to analyse the risk associated with synchrotron radiation sources. The first step in such an analysis is establishing the failure criteria, considering the regulatory stipulations on acceptable dose limits due to ionizing radiation from normal as well as beam-loss scenarios. Some possible scenarios considered in this study are (1) excessive bremsstrahlung in the ring due to loss of vacuum, (2) target failure due to an excessively focused beam, (3) a mis-directed/mis-steered beam, and (4) beam loss and skyshine. Hazard analysis needs to cover the beam transfer line, storage ring and experimental beam line areas. Various safety provisions are in place to minimize the hazards from these facilities, such as access control interlock systems, radiation shielding for the storage ring and beam lines, and safety shutters (for beam lines). The experimental beam line area is the most vulnerable location and needs to be critically analysed. There are multiple beam lines with different safety provisions, and the consequences of postulated beam strikes will also differ, which increases the complexity of the analysis. Similar studies conducted for such experimental facilities have identified that the radiation safety interlock system, used to control access to areas inside the ring and the hutches of beamline facilities, has an
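
    A toy illustration of the fault/event-tree arithmetic PSA rests on, with made-up basic-event probabilities and independence assumed: an OR gate combines as 1 - Π(1 - p_i) and an AND gate as Π p_i.

```python
def p_or(*ps):                      # at least one basic event occurs
    out = 1.0
    for p in ps:
        out *= 1.0 - p
    return 1.0 - out

def p_and(*ps):                     # all basic events occur
    out = 1.0
    for p in ps:
        out *= p
    return out

p_missteer = p_or(1e-3, 5e-4)            # e.g. magnet fault OR operator error, per demand
p_interlock_fails = p_and(1e-2, 1e-2)    # both redundant interlock channels fail
p_top = p_and(p_missteer, p_interlock_fails)
print(f"P(top event) ~ {p_top:.1e} per demand")   # ~1.5e-07
```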

  17. Devices, materials, and processes for nano-electronics: characterization with advanced X-ray techniques using lab-based and synchrotron radiation sources

    International Nuclear Information System (INIS)

    Zschech, E.; Wyon, C.; Murray, C.E.; Schneider, G.

    2011-01-01

    Future nano-electronics manufacturing at extraordinary length scales, new device structures, and advanced materials will provide challenges to process development and engineering but also to process control and physical failure analysis. Advanced X-ray techniques, using lab systems and synchrotron radiation sources, will play a key role for the characterization of thin films, nano-structures, surfaces, and interfaces. The development of advanced X-ray techniques and tools will reduce risk and time for the introduction of new technologies. Eventually, time-to-market for new products will be reduced by the timely implementation of the best techniques for process development and process control. The development and use of advanced methods at synchrotron radiation sources will be increasingly important, particularly for research and development in the field of advanced processes and new materials but also for the development of new X-ray components and procedures. The application of advanced X-ray techniques, in-line, in out-of-fab analytical labs and at synchrotron radiation sources, for research, development, and manufacturing in the nano-electronics industry is reviewed. The focus of this paper is on the study of nano-scale device and on-chip interconnect materials, and materials for 3D IC integration as well. (authors)

  18. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  19. Photoacoustic Techniques for Trace Gas Sensing Based on Semiconductor Laser Sources

    Directory of Open Access Journals (Sweden)

    Vincenzo Spagnolo

    2009-12-01

    Full Text Available The paper provides an overview of the use of photoacoustic sensors based on semiconductor laser sources for the detection of trace gases. We review the results obtained using standard, differential and quartz-enhanced photoacoustic techniques.

  20. CLASSIFICATION AND RANKING OF FERMI LAT GAMMA-RAY SOURCES FROM THE 3FGL CATALOG USING MACHINE LEARNING TECHNIQUES

    Energy Technology Data Exchange (ETDEWEB)

    Saz Parkinson, P. M. [Department of Physics, The University of Hong Kong, Pokfulam Road, Hong Kong (China); Xu, H.; Yu, P. L. H. [Department of Statistics and Actuarial Science, The University of Hong Kong, Pokfulam Road, Hong Kong (China); Salvetti, D.; Marelli, M. [INAF—Istituto di Astrofisica Spaziale e Fisica Cosmica Milano, via E. Bassini 15, I-20133, Milano (Italy); Falcone, A. D. [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States)

    2016-03-20

    We apply a number of statistical and machine learning techniques to classify and rank gamma-ray sources from the Third Fermi Large Area Telescope Source Catalog (3FGL), according to their likelihood of falling into the two major classes of gamma-ray emitters: pulsars (PSR) or active galactic nuclei (AGNs). Using 1904 3FGL sources that have been identified/associated with AGNs (1738) and PSR (166), we train (using 70% of our sample) and test (using 30%) our algorithms and find that the best overall accuracy (>96%) is obtained with the Random Forest (RF) technique, while using a logistic regression (LR) algorithm results in only marginally lower accuracy. We apply the same techniques on a subsample of 142 known gamma-ray pulsars to classify them into two major subcategories: young (YNG) and millisecond pulsars (MSP). Once more, the RF algorithm has the best overall accuracy (∼90%), while a boosted LR analysis comes a close second. We apply our two best models (RF and LR) to the entire 3FGL catalog, providing predictions on the likely nature of unassociated sources, including the likely type of pulsar (YNG or MSP). We also use our predictions to shed light on the possible nature of some gamma-ray sources with known associations (e.g., binaries, supernova remnants/pulsar wind nebulae). Finally, we provide a list of plausible X-ray counterparts for some pulsar candidates, obtained using Swift, Chandra, and XMM. The results of our study will be of interest both for in-depth follow-up searches (e.g., pulsar) at various wavelengths and for broader population studies.
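
    A minimal sketch of the paper's best-performing setup: a Random Forest classifier with a 70/30 train/test split. The feature matrix below is a random placeholder, not the actual 3FGL catalog variables, so the printed accuracy is meaningless except as a smoke test:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.random((1904, 9))                     # 1904 associated sources x 9 features
y = rng.choice(["AGN", "PSR"], size=1904, p=[1738 / 1904, 166 / 1904])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.30,
                                          random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```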

  1. CLASSIFICATION AND RANKING OF FERMI LAT GAMMA-RAY SOURCES FROM THE 3FGL CATALOG USING MACHINE LEARNING TECHNIQUES

    International Nuclear Information System (INIS)

    Saz Parkinson, P. M.; Xu, H.; Yu, P. L. H.; Salvetti, D.; Marelli, M.; Falcone, A. D.

    2016-01-01

    We apply a number of statistical and machine learning techniques to classify and rank gamma-ray sources from the Third Fermi Large Area Telescope Source Catalog (3FGL), according to their likelihood of falling into the two major classes of gamma-ray emitters: pulsars (PSR) or active galactic nuclei (AGNs). Using 1904 3FGL sources that have been identified/associated with AGNs (1738) and PSR (166), we train (using 70% of our sample) and test (using 30%) our algorithms and find that the best overall accuracy (>96%) is obtained with the Random Forest (RF) technique, while using a logistic regression (LR) algorithm results in only marginally lower accuracy. We apply the same techniques on a subsample of 142 known gamma-ray pulsars to classify them into two major subcategories: young (YNG) and millisecond pulsars (MSP). Once more, the RF algorithm has the best overall accuracy (∼90%), while a boosted LR analysis comes a close second. We apply our two best models (RF and LR) to the entire 3FGL catalog, providing predictions on the likely nature of unassociated sources, including the likely type of pulsar (YNG or MSP). We also use our predictions to shed light on the possible nature of some gamma-ray sources with known associations (e.g., binaries, supernova remnants/pulsar wind nebulae). Finally, we provide a list of plausible X-ray counterparts for some pulsar candidates, obtained using Swift, Chandra, and XMM. The results of our study will be of interest both for in-depth follow-up searches (e.g., pulsar) at various wavelengths and for broader population studies

  2. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  3. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  4. A simple iterative independent component analysis algorithm for vibration source signal identification of complex structures

    Directory of Open Access Journals (Sweden)

    Dong-Sup Lee

    2015-01-01

    Full Text Available Independent Component Analysis (ICA), one of the blind source separation methods, can be applied to extract unknown source signals from received signals alone. This is accomplished by finding statistical independence of signal mixtures, and it has been successfully applied to myriad fields such as medical science, image processing, and numerous others. Nevertheless, inherent problems have been reported when using this technique: instability and invalid ordering of the separated signals, particularly when using a conventional ICA technique in vibratory source signal identification of complex structures. In this study, a simple iterative algorithm based on the conventional ICA has been proposed to mitigate these problems. The proposed method extracts more stable source signals with valid ordering through an iterative process that reorders the extracted mixing matrix to reconstruct the finally converged source signals, referring to the magnitudes of the correlation coefficients between the intermediately separated signals and signals measured on or near the sources. In order to review the problems of the conventional ICA technique and to validate the proposed method, numerical analyses have been carried out for a virtual response model and a 30 m class submarine model. Moreover, in order to investigate the applicability of the proposed method to a real problem involving a complex structure, an experiment has been carried out on a scaled submarine mockup. The results show that the proposed method can resolve the inherent problems of the conventional ICA technique.
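
    A hedged sketch of the central idea: separate with FastICA, then reorder and re-sign the components by their correlation with signals measured near the sources (all signals below are synthetic, and the paper's exact iteration scheme is not reproduced):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
t = np.linspace(0, 8, 4000)
sources = np.c_[np.sin(2 * np.pi * 3 * t), np.sign(np.sin(2 * np.pi * 5 * t))]
A = np.array([[1.0, 0.6], [0.5, 1.0], [0.8, 0.3]])
X = sources @ A.T + 0.05 * rng.standard_normal((t.size, 3))   # 3 measured mixtures

S_est = FastICA(n_components=2, random_state=0).fit_transform(X)

refs = sources + 0.2 * rng.standard_normal(sources.shape)     # near-source reference signals
C = np.corrcoef(S_est.T, refs.T)[:2, 2:]                      # correlations: components vs refs
order = np.argmax(np.abs(C), axis=0)                          # best-matching component per ref
signs = np.sign(C[order, np.arange(2)])
S_matched = S_est[:, order] * signs                           # reordered, sign-corrected sources
print("component order:", order, "signs:", signs)
```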

  5. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e., identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
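
    The merging criterion at the heart of the algorithm is easy to illustrate. The toy below is a 1-D sketch under stated assumptions (Gaussian errors, a greedy left-to-right pass), not the published BaTMAn code, which is available at the URL above and works on full 2-D tessellations.

```python
# Toy 1-D version of the BaTMAn idea: keep merging neighbouring bins while
# their inverse-variance-weighted means agree within the combined errors.
import numpy as np

def merge_1d(signal, error, threshold=1.0):
    regions = [[i] for i in range(len(signal))]
    merged = True
    while merged:
        merged = False
        out = [regions[0]]
        for reg in regions[1:]:
            a, b = out[-1], reg
            ma = np.average(signal[a], weights=1 / error[a] ** 2)
            mb = np.average(signal[b], weights=1 / error[b] ** 2)
            ea = 1 / np.sqrt(np.sum(1 / error[a] ** 2))
            eb = 1 / np.sqrt(np.sum(1 / error[b] ** 2))
            if abs(ma - mb) < threshold * np.hypot(ea, eb):  # consistent within errors
                out[-1] = a + b
                merged = True
            else:
                out.append(b)
        regions = out
    return regions

rng = np.random.default_rng(0)
sig = np.concatenate([np.full(20, 1.0), np.full(20, 3.0)]) + rng.normal(0, 0.3, 40)
reg = merge_1d(sig, np.full(40, 0.3))   # merges down to a few regions around the two true levels
```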

  6. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities, driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce specific time and frequency signatures at the output of the specialized detectors, which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized
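
    As a generic illustration of the linear demodulation idea (not the patented ORNL circuitry, which the record describes only qualitatively), the amplitude of a motor current can be demodulated in software via the analytic signal; a synthetic 60 Hz line current with a slow load modulation stands in for real data.

```python
# Amplitude demodulation of a (synthetic) motor current: the envelope of the
# analytic signal reveals the slow load-modulation signature on the carrier.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 5000.0                                   # sample rate, Hz
t = np.arange(0.0, 2.0, 1 / fs)
current = (1.0 + 0.05 * np.sin(2 * np.pi * 6 * t)) * np.sin(2 * np.pi * 60 * t)

envelope = np.abs(hilbert(current))           # instantaneous amplitude
envelope -= envelope.mean()                   # remove the steady carrier level

b, a = butter(4, 30 / (fs / 2), btype="low")  # keep only the slow modulation
signature = filtfilt(b, a, envelope)          # its spectrum shows a 6 Hz load line
```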

  7. Factors Associated With Healthcare-Acquired Catheter-Associated Urinary Tract Infections: Analysis Using Multiple Data Sources and Data Mining Techniques.

    Science.gov (United States)

    Park, Jung In; Bliss, Donna Z; Chi, Chih-Lin; Delaney, Connie W; Westra, Bonnie L

    The purpose of this study was to identify factors associated with healthcare-acquired catheter-associated urinary tract infections (HA-CAUTIs) using multiple data sources and data mining techniques. Three data sets were integrated for analysis: electronic health record data from a university hospital in the Midwestern United States were combined with staffing and environmental data from the hospital's National Database of Nursing Quality Indicators and a list of patients with HA-CAUTIs. Three data mining techniques were used for identification of factors associated with HA-CAUTI: decision trees, logistic regression, and support vector machines. Fewer total nursing hours per patient-day, a lower percentage of direct-care RNs with specialty nursing certification, a higher percentage of direct-care RNs with an associate's degree in nursing, and a higher percentage of direct-care RNs with a BSN, MSN, or doctoral degree were associated with HA-CAUTI occurrence. The results also support the association with HA-CAUTI of the following factors identified by previous studies: female gender; older age (>50 years); longer length of stay; severe underlying disease; glucose lab results (>200 mg/dL); longer use of the catheter; and RN staffing. Additional findings from this study demonstrated that the presence of more nurses with specialty nursing certifications can reduce HA-CAUTI occurrence. While there may be valid reasons for leaving in a urinary catheter, the findings show that having a catheter in for more than 48 hours contributes to HA-CAUTI occurrence. Finally, the findings suggest that more nursing hours per patient-day are related to better patient outcomes.
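
    The three data-mining techniques named in the study are all standard classifiers; the sketch below shows how such a comparison might be set up with scikit-learn on a placeholder feature matrix (the synthetic features and labels are illustrative assumptions, not the study's data).

```python
# Compare decision tree, logistic regression and SVM on synthetic stand-in
# data for the HA-CAUTI problem (binary outcome, mixed risk factors).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                 # 500 stays, 8 illustrative factors
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=500) > 1.0).astype(int)

models = {
    "decision tree": DecisionTreeClassifier(max_depth=4),
    "logistic regression": LogisticRegression(max_iter=1000),
    "SVM": SVC(kernel="rbf"),
}
for name, model in models.items():
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    print(f"{name}: mean cross-validated AUC = {auc:.2f}")
```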

  8. Bispectral pairwise interacting source analysis for identifying systems of cross-frequency interacting brain sources from electroencephalographic or magnetoencephalographic signals

    Science.gov (United States)

    Chella, Federico; Pizzella, Vittorio; Zappasodi, Filippo; Nolte, Guido; Marzetti, Laura

    2016-05-01

    Brain cognitive functions arise through the coordinated activity of several brain regions, which actually form complex dynamical systems operating at multiple frequencies. These systems often consist of interacting subsystems, whose characterization is of importance for a complete understanding of the brain interaction processes. To address this issue, we present a technique, namely the bispectral pairwise interacting source analysis (biPISA), for analyzing systems of cross-frequency interacting brain sources when multichannel electroencephalographic (EEG) or magnetoencephalographic (MEG) data are available. Specifically, the biPISA makes it possible to identify one or many subsystems of cross-frequency interacting sources by decomposing the antisymmetric components of the cross-bispectra between EEG or MEG signals, based on the assumption that interactions are pairwise. Thanks to the properties of the antisymmetric components of the cross-bispectra, biPISA is also robust to spurious interactions arising from mixing artifacts, i.e., volume conduction or field spread, which always affect EEG or MEG functional connectivity estimates. This method is an extension of the pairwise interacting source analysis (PISA), which was originally introduced for investigating interactions at the same frequency, to the study of cross-frequency interactions. The effectiveness of this approach is demonstrated in simulations for up to three interacting source pairs and for real MEG recordings of spontaneous brain activity. Simulations show that the performance of biPISA in estimating the phase difference between the interacting sources is affected by the increasing level of noise rather than by the number of interacting subsystems. The analysis of real MEG data reveals an interaction between two pairs of sources of central mu and beta rhythms, localized in the proximity of the left and right central sulci.

  9. Distributed source term analysis, a new approach to nuclear material inventory verification

    CERN Document Server

    Beddingfield, D H

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to gamma-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than gamma-rays.

  10. Distributed source term analysis, a new approach to nuclear material inventory verification

    International Nuclear Information System (INIS)

    Beddingfield, D.H.; Menlove, H.O.

    2002-01-01

    The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method is a means of determining the mass of nuclear material within a large, diffuse volume using passive neutron counting. The DSTA method is a more efficient approach than traditional methods of holdup measurement and inventory verification. The time spent in performing DSTA measurement and analysis is a fraction of that required by traditional techniques. The error ascribed to a DSTA survey result is generally less than that from traditional methods. Also, the negative bias ascribed to γ-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than γ-rays.

  11. Incorporating priors for EEG source imaging and connectivity analysis

    Directory of Open Access Journals (Sweden)

    Xu eLei

    2015-08-01

    Full Text Available Electroencephalography source imaging (ESI) is a useful technique to localize the generators from a given scalp electric measurement and to investigate the temporal dynamics of large-scale neural circuits. By introducing reasonable priors from other modalities, ESI reveals the most probable sources and communication structures at every moment in time. Here, we review the available priors from such techniques as magnetic resonance imaging (MRI), functional MRI (fMRI), and positron emission tomography (PET). Each modality's specific contribution is analyzed from the perspective of source reconstruction. For spatial priors, approaches such as EEG-correlated fMRI, temporally coherent networks and resting-state fMRI are systematically introduced into the ESI. Moreover, fiber tracking (diffusion tensor imaging, DTI) and neuro-stimulation techniques (transcranial magnetic stimulation, TMS) are also introduced as potential priors, which can help to draw inferences about the neuroelectric connectivity in the source space. We conclude that combining EEG source imaging with other complementary modalities is a promising approach towards the study of brain networks in cognitive and clinical neurosciences.

  12. Near-Field Source Localization by Using Focusing Technique

    Science.gov (United States)

    He, Hongyang; Wang, Yide; Saillard, Joseph

    2008-12-01

    We discuss two fast algorithms to localize multiple sources in the near field. The symmetry-based method proposed by Zhi and Chia (2007) is first improved by implementing a search-free procedure for the reduction of computation cost. We then present a focusing-based method which does not require a symmetric array configuration. By using the focusing technique, the near-field signal model is transformed into a model possessing the same structure as in the far-field situation, which allows bearing estimation with well-studied far-field methods. With the estimated bearing, the range estimate of each source is consequently obtained by using the 1D MUSIC method without parameter pairing. The performance of the improved symmetry-based method and the proposed focusing-based method is compared by Monte Carlo simulations and with the Cramér-Rao bound as well. Unlike other near-field algorithms, these two approaches require neither high computation cost nor high-order statistics.

  13. Near-Field Source Localization by Using Focusing Technique

    Directory of Open Access Journals (Sweden)

    Joseph Saillard

    2008-12-01

    Full Text Available We discuss two fast algorithms to localize multiple sources in the near field. The symmetry-based method proposed by Zhi and Chia (2007) is first improved by implementing a search-free procedure for the reduction of computation cost. We then present a focusing-based method which does not require a symmetric array configuration. By using the focusing technique, the near-field signal model is transformed into a model possessing the same structure as in the far-field situation, which allows bearing estimation with well-studied far-field methods. With the estimated bearing, the range estimate of each source is consequently obtained by using the 1D MUSIC method without parameter pairing. The performance of the improved symmetry-based method and the proposed focusing-based method is compared by Monte Carlo simulations and with the Cramér-Rao bound as well. Unlike other near-field algorithms, these two approaches require neither high computation cost nor high-order statistics.
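
    Once the focusing transform has reduced the problem to far-field form, the bearing step is the classical 1D MUSIC search. The sketch below shows that step only, for a uniform linear array with half-wavelength spacing; it is a textbook implementation under those assumptions, not the authors' focusing code.

```python
# 1-D MUSIC pseudo-spectrum for focused (far-field-equivalent) array data.
import numpy as np

def music_bearings(X, n_sources, n_grid=721):
    """X: (n_antennas, n_snapshots) complex array data; returns grid and spectrum."""
    M = X.shape[0]
    R = X @ X.conj().T / X.shape[1]             # sample covariance matrix
    eigval, eigvec = np.linalg.eigh(R)          # eigenvalues in ascending order
    En = eigvec[:, : M - n_sources]             # noise subspace
    theta = np.linspace(-90.0, 90.0, n_grid)
    k = np.pi * np.sin(np.deg2rad(theta))       # phase step for d = lambda/2
    A = np.exp(1j * np.outer(np.arange(M), k))  # steering matrix over the grid
    # peaks of 1 / ||En^H a(theta)||^2 give the bearing estimates
    P = 1.0 / np.sum(np.abs(En.conj().T @ A) ** 2, axis=0)
    return theta, P
```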

  14. The low-frequency sound power measuring technique for an underwater source in a non-anechoic tank

    Science.gov (United States)

    Zhang, Yi-Ming; Tang, Rui; Li, Qi; Shang, Da-Jing

    2018-03-01

    In order to determine the radiated sound power of an underwater source below the Schroeder cut-off frequency in a non-anechoic tank, a low-frequency extension measuring technique is proposed. This technique is based on a unique relationship between the transmission characteristics of the enclosed field and those of the free field, which can be obtained as a correction term from previous measurements of a known simple source. The radiated sound power of an unknown underwater source in the free field can thereby be obtained accurately from measurements in a non-anechoic tank. To verify the validity of the proposed technique, a mathematical model of the enclosed field is established using normal-mode theory, and the relationship between the transmission characteristics of the enclosed and free fields is obtained. The radiated sound power of an underwater transducer source is tested in a glass tank using the proposed low-frequency extension measuring technique. Compared with the free field, the deviation of the radiated sound power level is found to be less than 3 dB in the narrowband spectrum, and less than 1 dB in the 1/3-octave spectrum. The proposed testing technique can be used not only to extend the low-frequency applications of non-anechoic tanks, but also for the measurement of radiated sound power from complicated sources in non-anechoic tanks.
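
    The correction-term idea reduces to simple level arithmetic once the tank has been calibrated with the known simple source; the band levels below are illustrative numbers, not measurements from the paper.

```python
# Per-band correction from a reference source of known free-field power,
# then applied to tank measurements of an unknown source (levels in dB).
import numpy as np

def tank_correction(Lw_ref_free, Lp_ref_tank):
    """Correction = known free-field power level minus level measured in tank."""
    return Lw_ref_free - Lp_ref_tank

Lw_ref_free = np.array([95.0, 96.2, 97.1, 97.8, 98.4])   # reference, free field
Lp_ref_tank = np.array([88.3, 91.0, 90.2, 92.5, 93.1])   # reference, in tank
corr = tank_correction(Lw_ref_free, Lp_ref_tank)

Lp_unknown_tank = np.array([85.1, 87.4, 86.9, 88.2, 89.0])
Lw_unknown_free = Lp_unknown_tank + corr    # estimated free-field power levels
```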

  15. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, firstly, to apply two functional analysis techniques to the design of supervisory systems for complex processes and, secondly, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to an example process, a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of the application of functional analysis to the design of a 'human-centered' supervisory system. Then the basic principles of the two techniques applied to the WSPC system are presented. Finally, the different results obtained from the two techniques are discussed

  16. Identifying sources of atmospheric fine particles in Havana City using Positive Matrix Factorization technique

    International Nuclear Information System (INIS)

    Pinnera, I.; Perez, G.; Ramos, M.; Guibert, R.; Aldape, F.; Flores M, J.; Martinez, M.; Molina, E.; Fernandez, A.

    2011-01-01

    In a previous study, a set of samples of fine and coarse airborne particulate matter collected in an urban area of Havana City were analyzed by the Particle-Induced X-ray Emission (PIXE) technique. The concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were consistently determined in both particle sizes. The analytical database provided by PIXE was statistically analyzed in order to determine the local pollution sources. The Positive Matrix Factorization (PMF) technique was applied to the fine particle data in order to identify possible pollution sources. These sources were further verified by enrichment factor (EF) calculations. A general discussion of these results is presented in this work. (Author)
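
    PMF itself minimizes uncertainty-weighted residuals under non-negativity constraints; as a simplified stand-in for that factorization idea, plain non-negative matrix factorization can be run on the element-by-sample concentration matrix (the data below are random placeholders).

```python
# Factorize a (samples x elements) concentration matrix into non-negative
# source contributions G and source profiles F, as PMF does conceptually.
import numpy as np
from sklearn.decomposition import NMF

elements = ["S", "Cl", "K", "Ca", "Ti", "V", "Cr", "Mn",
            "Fe", "Ni", "Cu", "Zn", "Br", "Pb"]
X = np.abs(np.random.default_rng(1).normal(size=(120, 14)))  # placeholder data

model = NMF(n_components=5, init="nndsvda", max_iter=500)    # 5 candidate sources
G = model.fit_transform(X)        # contributions of each source to each sample
F = model.components_             # element signature of each source
for i, profile in enumerate(F):
    top = [elements[j] for j in np.argsort(profile)[::-1][:3]]
    print(f"factor {i}: dominated by {top}")
```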

  17. Source identification of underground fuel spills in a petroleum refinery using fingerprinting techniques and chemo-metric analysis. A Case Study

    International Nuclear Information System (INIS)

    Kanellopoulou, G.; Gidarakos, E.; Pasadakis, N.

    2005-01-01

    Crude oil and its refined products are the most frequent contaminants found in the environment as a result of spills. The aim of this work was the identification of the spill source(s) in the subsurface of a petroleum refinery. Free-phase samples were analyzed by gas chromatography and the analytical results were interpreted using the Principal Component Analysis (PCA) method. The chemical analysis of groundwater samples from the refinery subsurface was also employed to obtain a comprehensive picture of the spill distribution and origin. (authors)

  18. Non destructive multi elemental analysis using prompt gamma neutron activation analysis techniques: Preliminary results for concrete sample

    Energy Technology Data Exchange (ETDEWEB)

    Dahing, Lahasen Normanshah [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor, Malaysia and Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia); Yahya, Redzuan [School of Applied Physics, Universiti Kebangsaan Malaysia, 43600 Bangi, Selangor (Malaysia); Yahya, Roslan; Hassan, Hearie [Malaysian Nuclear Agency (Nuklear Malaysia), Bangi 43000, Kajang (Malaysia)

    2014-09-03

    In this study, the principle of prompt gamma neutron activation analysis has been used as a technique to determine the elements in a sample. The system consists of a collimated isotopic neutron source (Cf-252) with an HPGe detector and a multichannel analyser (MCA). Concrete blocks of 10×10×10 cm³ and 15×15×15 cm³ were analysed as samples. When neutrons enter and interact with elements in the concrete, neutron capture reactions occur and produce characteristic prompt gamma rays of the elements. The preliminary results of this study demonstrate that the major elements in the concrete, such as Si, Mg, Ca, Al, Fe and H, as well as other elements such as Cl, can be determined by analysing their respective gamma-ray lines. The results obtained were compared with NAA and XRF techniques for reference and validation. The potential and capability of neutron-induced prompt gamma rays as a tool for qualitative multi-elemental analysis, identifying the elements present in a concrete sample, are discussed.

  19. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
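
    The alias method the authors implement is a classical construction; a compact Python version (Vose's variant, illustrative rather than the MCNP subroutine) looks as follows.

```python
# Alias method: O(n) table construction, then O(1) sampling per draw --
# e.g., voxels of a volumetric photon source weighted by emission strength.
import numpy as np

def build_alias(weights):
    n = len(weights)
    prob = np.asarray(weights, dtype=float) * n / np.sum(weights)
    alias = np.zeros(n, dtype=int)
    small = [i for i in range(n) if prob[i] < 1.0]
    large = [i for i in range(n) if prob[i] >= 1.0]
    while small and large:
        s, l = small.pop(), large.pop()
        alias[s] = l                      # s's deficit is covered by column l
        prob[l] -= 1.0 - prob[s]
        (small if prob[l] < 1.0 else large).append(l)
    for i in small + large:               # numerical leftovers saturate at 1
        prob[i] = 1.0
    return prob, alias

def sample(prob, alias, rng):
    i = rng.integers(len(prob))           # pick a column uniformly: O(1)
    return i if rng.random() < prob[i] else alias[i]

rng = np.random.default_rng(0)
prob, alias = build_alias([0.5, 0.3, 0.15, 0.05])    # illustrative voxel weights
draws = np.bincount([sample(prob, alias, rng) for _ in range(10_000)])
```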

  20. Artificial lateral-line system for imaging dipole sources using Beamforming techniques

    NARCIS (Netherlands)

    Dagamseh, A.M.K.; Wiegerink, Remco J.; Lammerink, Theodorus S.J.; Krijnen, Gijsbertus J.M.

    In nature, fish have the ability to localize prey, school, navigate, etc. using the lateral-line organ [1]. Here we present the use of biomimetic artificial hair-based flow sensors arranged as a lateral-line system, in combination with beamforming techniques, for dipole source localization in air.

  1. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  2. A technique to consider mismatches between fMRI and EEG/MEG sources for fMRI-constrained EEG/MEG source imaging: a preliminary simulation study

    International Nuclear Information System (INIS)

    Im, Chang-Hwan; Lee, Soo Yeol

    2006-01-01

    fMRI-constrained EEG/MEG source imaging can be a powerful tool in studying human brain functions with enhanced spatial and temporal resolutions. Recent studies on the combination of fMRI and EEG/MEG have suggested that fMRI prior information could be readily implemented by simply imposing different weighting factors to cortical sources overlapping with the fMRI activations. It has been also reported, however, that such a hard constraint may cause severe distortions or elimination of meaningful EEG/MEG sources when there are distinct mismatches between the fMRI activations and the EEG/MEG sources. If one wants to obtain the actual EEG/MEG source locations and uses the fMRI prior information as just an auxiliary tool to enhance focality of the distributed EEG/MEG sources, it is reasonable to weaken the strength of fMRI constraint when severe mismatches between fMRI and EEG/MEG sources are observed. The present study suggests an efficient technique to automatically adjust the strength of fMRI constraint according to the mismatch level. The use of the proposed technique rarely affects the results of conventional fMRI-constrained EEG/MEG source imaging if no major mismatch between the two modalities is detected; while the new results become similar to those of typical EEG/MEG source imaging without fMRI constraint if the mismatch level is significant. A preliminary simulation study using realistic EEG signals demonstrated that the proposed technique can be a promising tool to selectively apply fMRI prior information to EEG/MEG source imaging

  3. Silver recovery aqueous techniques from diverse sources: Hydrometallurgy in recycling.

    Science.gov (United States)

    Syed, S

    2016-04-01

    The demand for silver is ever increasing with the advance of the industrialized world, whereas worldwide reserves of high-grade silver ores are retreating. However, there exist large stashes of low- and lean-grade silver ores that are yet to be exploited. The main purpose of this work was to draw attention to the most advanced technologies in silver recovery and recycling from various sources. The state of the art in recovery of silver from different sources by hydrometallurgical and bio-metallurgical processing, and varieties of leaching, cementing and reducing agents, peeling, electro-coagulants, adsorbents, electro-dialysis, solvent extraction, ion exchange resins and bio-sorbents, are highlighted in this article. It is shown that the major economic driver for recycling of depleted sources is the recovery of silver. In order to develop a nature-friendly technique for the recovery of silver from diverse sources, a critical comparison of existing technologies, in terms of both economic viability and environmental impact, is made, and silver ion toxicity is highlighted. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. [Demand for and the Development of Detection Techniques for Source of Schistosome Infection in China].

    Science.gov (United States)

    Wang, Shi-ping; He, Xin; Zhou, Yun-fei

    2015-12-01

    Schistosomiasis is a type of zoonotic parasitosis that severely impairs human health. Rapid detection of infection sources is key to the control of schistosomiasis. With the effective control of schistosomiasis in China, the detection techniques for infection sources have also been developed. The rate and intensity of infection among humans and livestock have been significantly decreased in China, as the control program has entered the transmission control stage in most of the endemic areas. Under this situation, the traditional etiological diagnostic techniques and common immunological methods cannot provide rapid detection of infection sources of schistosomiasis. Instead, we are calling for detection methods with higher sensitivity, specificity and stability that are less time-consuming, more convenient and less costly. In recent years, many improved or novel detection methods have been applied for the epidemiological surveillance of schistosomiasis, such as the automatic scanning microscopic image acquisition system, PCR-ELISA, immunosensors, loop-mediated isothermal amplification, etc. The development of new monitoring techniques can facilitate rapid detection of schistosome infection sources in endemic areas.

  5. Source reconstruction using phase space beam summation technique

    International Nuclear Information System (INIS)

    Graubart, Gideon.

    1990-10-01

    In this work, the phase-space beam summation (PSBS) technique is applied to back propagation and inverse source problems. The PSBS expresses the field as a superposition of shifted and tilted beams. This phase-space spectrum of beams is matched to the source distribution via an amplitude function which expresses the local spectrum of the source function in terms of a local Fourier transform. In this work, the emphasis is on the phase-space processing of the data, on the information content of this data, and on the back propagation scheme. More work is still required to combine this back propagation approach into a full, multi-experiment inverse scattering scheme. It is shown that the phase-space distribution of the data, computed via the local spectrum transform, is localized along lines that define the local arrival direction of the wave data. We explore how the choice of the beam width affects the compactification of this distribution, and derive criteria for choosing a window that optimizes this distribution. It should be emphasized that a compact distribution implies fewer beams in the back propagation scheme and therefore higher numerical efficiency and better physical insight. Furthermore, it is shown how the local information property of the phase-space representation can be used to improve the performance of this simple back propagation problem, in particular with regard to axial resolution; the distance to the source can be determined by back propagating only the large-angle phase-space beams that focus on the source. The information concerning the transverse distribution of the source, on the other hand, is contained in the axial phase-space region and can therefore be determined by the corresponding back propagating beams. Because of the global nature of the plane-wave propagators, the conventional plane-wave back propagation scheme does not have the same 'focusing' property, and therefore suffers from a lack of information localization and axial resolution.

  6. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    Full Text Available The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  7. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
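
    The core quantity these toolboxes estimate is the mutual information between stimulus and response; a plug-in estimate is a few lines (the sophisticated bias corrections that motivate the toolboxes are deliberately omitted here).

```python
# Plug-in mutual information between a discrete stimulus and a binned
# response, in bits. Illustrative only: no bias correction is applied.
import numpy as np

def mutual_information(stimuli, responses):
    joint = np.zeros((stimuli.max() + 1, responses.max() + 1))
    for s, r in zip(stimuli, responses):
        joint[s, r] += 1
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)    # stimulus marginal
    pr = joint.sum(axis=0, keepdims=True)    # response marginal
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])))

rng = np.random.default_rng(0)
stim = rng.integers(0, 4, size=5000)
resp = np.clip(stim + rng.integers(-1, 2, size=5000), 0, 4)   # noisy code
print(f"I(stimulus; response) ~ {mutual_information(stim, resp):.2f} bits")
```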

  8. Artificial intelligence search techniques for optimization of the cold source geometry

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1988-01-01

    Most optimization studies of cold neutron sources have concentrated on the numerical prediction or experimental measurement of the cold moderator optimum thickness which produces the largest cold neutron leakage for a given thermal neutron source. Optimizing the geometrical shape of the cold source, however, is a more difficult problem because the optimized quantity, the cold neutron leakage, is an implicit function of the shape which is the unknown in such a study. We draw an analogy between this problem and a state space search, then we use a simple Artificial Intelligence (AI) search technique to determine the optimum cold source shape based on a two-group, r-z diffusion model. We implemented this AI design concept in the computer program AID which consists of two modules, a physical model module and a search module, which can be independently modified, improved, or made more sophisticated. 7 refs., 1 fig

  9. Artificial intelligence search techniques for the optimization of cold source geometry

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1988-01-01

    Most optimization studies of cold neutron sources have concentrated on the numerical prediction or experimental measurement of the cold moderator optimum thickness that produces the largest cold neutron leakage for a given thermal neutron source. Optimizing the geometric shape of the cold source, however, is a more difficult problem because the optimized quantity, the cold neutron leakage, is an implicit function of the shape, which is the unknown in such a study. An analogy is drawn between this problem and a state space search, then a simple artificial intelligence (AI) search technique is used to determine the optimum cold source shape based on a two-group, r-z diffusion model. This AI design concept was implemented in the computer program AID, which consists of two modules, a physical model module, and a search module, which can be independently modified, improved, or made more sophisticated

  10. Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of Space Geodetic Time Series

    Science.gov (United States)

    Gualandi, Adriano; Serpelloni, Enrico; Elina Belardinelli, Maria; Bonafede, Maurizio; Pezzo, Giuseppe; Tolomei, Cristiano

    2015-04-01

    A critical point in the analysis of ground displacement time series, such as those measured by modern space geodetic techniques (primarily continuous GPS/GNSS and InSAR), is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which allows one to reduce the dimensionality of the data space while maintaining most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies, since PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem. The recovery and separation of the different sources that generate the observed ground deformation is a fundamental task in order to provide a physical meaning to the possible different sources. PCA fails in the BSS problem since it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing on the components that they be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the displacement time series imposing a fewer number of constraints on the model, and to reveal anomalies in the data such as transient deformation signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources

  11. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    Directory of Open Access Journals (Sweden)

    Peeyush Sahay

    2009-10-01

    Full Text Available Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, and point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far by using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have high sensitivity and high selectivity, as equivalently offered by the MS-based techniques, but also have the advantageous features of near real-time response, low instrument costs, and POC function. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely, tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions, and the detection limits achieved by the laser techniques range from parts per million to parts per billion levels. Sensors using the laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide, nitric oxide, etc., are commercially available. This review presents an update on the latest developments in laser-based breath analysis.

  12. Advanced energy sources and conversion techniques. Proceedings of a seminar. Volume 1. [35 papers

    Energy Technology Data Exchange (ETDEWEB)

    None

    1958-11-01

    The Seminar was organized as a series of tutorial presentations and round table discussions on a technical level to implement the following: (a) to identify and explore present and projected needs for energy sources and conversion techniques for military applications; (b) to exchange information on current and planned efforts in these fields; (c) to examine the effect of anticipated scientific and technological advances on these efforts; and (d) to present suggested programs aimed at satisfying the military needs for energy sources and conversion techniques. Volume I contains all of the unclassified papers presented at the Seminar. (W.D.M.)

  13. Advanced techniques for high resolution spectroscopic observations of cosmic gamma-ray sources

    International Nuclear Information System (INIS)

    Matteson, J.L.; Pelling, M.R.; Peterson, L.E.

    1985-08-01

    We describe an advanced gamma-ray spectrometer that is currently in development. It will obtain a sensitivity of 10⁻⁴ ph cm⁻² s⁻¹ in a 6 hour balloon observation and uses innovative techniques for background reduction and source imaging

  14. Identifying the sources of produced water in the oil field by isotopic techniques

    International Nuclear Information System (INIS)

    Nguyen Minh Quy; Hoang Long; Le Thi Thu Huong; Luong Van Huan; Vo Thi Tuong Hanh

    2014-01-01

    The objective of this study is to identify the sources of the formation water in the Southwest Su-Tu-Den (STD SW) basement reservoir. To achieve the objective, isotopic techniques along with geochemical analysis for chloride, bromide and strontium dissolved in the water were applied. The isotopic techniques used in this study were the determination of the water stable isotope signatures (δ²H and δ¹⁸O) and of the ⁸⁷Sr/⁸⁶Sr ratio of strontium in rock cutting samples and that dissolved in the formation water. The obtained results showed that the stable isotope compositions of water in the Lower Miocene were -3‰ and -23‰ for δ¹⁸O and δ²H, respectively, indicating the primeval nature of seawater in the reservoir. Meanwhile, the isotopic composition of water in the basement was clustered in a range of altered freshwater, with δ¹⁸O and δ²H being -(3-4)‰ and -(54-60)‰, respectively. The strontium isotope ratio for water in the Lower Miocene reservoir was lower than that for water in the basement, confirming the different natures of the water in the two reservoirs. The obtained results confirm the applicability of the techniques, and it is recommended that studies on identification of the flow path of the formation water in the STD SW basement reservoir be continued. (author)

  15. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification.

  16. Final report of the IAEA advisory group meeting on accelerator-based nuclear analytical techniques for characterization and source identification of aerosol particles

    International Nuclear Information System (INIS)

    1995-01-01

    The field of aerosol characterization and source identification covers a wide range of scientific and technical activities in many institutions, in both developed and developing countries. This field includes research and applications on urban air pollution, source apportionment of suspended particulate matter, radioactive aerosol particles, organic compounds carried on particulate matter, elemental characterization of particles, and other areas. The subject of this AGM focused on the use of accelerator-based nuclear analytical techniques for determination of elemental composition of particles (by either bulk or single particle analysis) and the use of accumulated knowledge for source identification

  17. Chromatographic fingerprint similarity analysis for pollutant source identification

    International Nuclear Information System (INIS)

    Xie, Juan-Ping; Ni, Hong-Gang

    2015-01-01

    In the present study, a similarity analysis method was proposed to evaluate the source-sink relationships among environmental media for polybrominated diphenyl ethers (PBDEs), which were taken as the representative contaminants. Chromatographic fingerprint analysis has been widely used in the fields of natural products chemistry and forensic chemistry, but its application to environmental science has been limited. We established a library of various sources of media containing contaminants (e.g., plastics), recognizing that the establishment of a more comprehensive library allows for a better understanding of the sources of contamination. We then compared an environmental complex mixture (e.g., sediment, soil) with the profiles in the library. These comparisons could be used as the first step in source tracking. The cosine similarities between plastic and soil or sediment ranged from 0.53 to 0.68, suggesting that plastic in electronic waste is an important source of PBDEs in the environment, but it is not the only source. A similarity analysis between soil and sediment indicated that they have a source-sink relationship. Generally, the similarity analysis method can encompass more relevant information of complex mixtures in the environment than a profile-based approach that only focuses on target pollutants. There is an inherent advantage to creating a data matrix containing all peaks and their relative levels after matching the peaks based on retention times and peak areas. This data matrix can be used for source identification via a similarity analysis without quantitative or qualitative analysis of all chemicals in a sample. - Highlights: • Chromatographic fingerprint analysis can be used as the first step in source tracking. • Similarity analysis method can encompass more relevant information of pollution. • The fingerprints strongly depend on the chromatographic conditions. • A more effective and robust method for identifying similarities is required
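
    The similarity step itself is straightforward once peaks have been matched by retention time; the sketch below compares a sample fingerprint against hypothetical library profiles with the cosine measure used in the study (the peak-area vectors are made up for illustration).

```python
# Cosine similarity between a sample's chromatographic fingerprint and
# library source profiles (vectors of matched peak areas).
import numpy as np

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

library = {                                   # illustrative peak-area vectors
    "e-waste plastic": np.array([5.0, 1.2, 3.3, 0.4, 2.1]),
    "commercial PBDE mixture": np.array([4.1, 0.2, 3.9, 1.8, 0.3]),
}
sediment = np.array([4.5, 0.9, 3.1, 0.7, 1.6])
for source, profile in library.items():
    print(f"{source}: cos = {cosine_similarity(sediment, profile):.2f}")
```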

  18. Two-Stage MAS Technique for Analysis of DRA Elements and Arrays on Finite Ground Planes

    DEFF Research Database (Denmark)

    Larsen, Niels Vesterdal; Breinbjerg, Olav

    2007-01-01

    A two-stage Method of Auxiliary Sources (MAS) technique is proposed for analysis of dielectric resonator antenna (DRA) elements and arrays on finite ground planes (FGPs). The problem is solved by first analysing the DRA on an infinite ground plane (IGP) and then using this solution to model the FGP...

  19. A Comparison Study of Sinusoidal PWM and Space Vector PWM Techniques for Voltage Source Inverter

    Directory of Open Access Journals (Sweden)

    Ömer Türksoy

    2017-06-01

    Full Text Available In this paper, the methods used to control voltage source inverters, which have been intensively investigated in recent years, are compared. Although the most efficient result is obtained with the least number of switching elements in the inverter topology, the switching method used is at least as effective as the topology. Moreover, the switching method selected to control the inverter plays an effective role in suppressing harmonic components while producing the ideal output voltage. There are many derivatives of pulse width modulation techniques that are commonly used to control voltage source inverters; two widespread methods are sinusoidal pulse width modulation and space vector pulse width modulation. These modulation techniques, used for generating variable-frequency and variable-amplitude output voltage in voltage source inverters, have been simulated using MATLAB/SIMULINK, and the total harmonic distortions of the output voltages have been compared. The simulation studies show that sinusoidal pulse width modulation produces higher total harmonic distortion in the inverter output voltage, while space vector pulse width modulation produces a more efficient output voltage with less total harmonic distortion.
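
    The kind of comparison the paper reports can be reproduced in outline: generate the switched pole voltage for a sinusoidal reference against a triangular carrier and read the THD off the spectrum. The sketch below covers the SPWM case only, with illustrative frequencies; the space-vector case differs mainly in the reference waveform (a triplen common-mode offset is injected).

```python
# Sinusoidal PWM of one inverter leg and an FFT-based THD estimate.
import numpy as np

fs, f1, fc = 200_000, 50, 2000                # sample, fundamental, carrier (Hz)
t = np.arange(0.0, 0.2, 1 / fs)
ref = 0.9 * np.sin(2 * np.pi * f1 * t)        # modulation index 0.9
carrier = 4 * np.abs((fc * t) % 1.0 - 0.5) - 1  # triangle wave in [-1, 1]
v = np.where(ref > carrier, 1.0, -1.0)        # switched pole voltage

spec = np.abs(np.fft.rfft(v)) / len(v)
freqs = np.fft.rfftfreq(len(v), 1 / fs)
fund = spec[np.argmin(np.abs(freqs - f1))]    # fundamental component
harm = spec[(freqs > 1.5 * f1) & (freqs < 25_000)]
thd = np.sqrt(np.sum(harm ** 2)) / fund
print(f"SPWM THD ~ {thd:.2f} (relative to the fundamental)")
```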

  20. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil in shredded sugar cane. (U.K.)

  1. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  2. Detection and Characterization of Ground Displacement Sources from Variational Bayesian Independent Component Analysis of GPS Time Series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2014-12-01

    A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow one to discern and characterize the different sources that generate the observed displacements. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which allows one to reduce the dimensionality of the data space while maintaining most of the variance of the dataset explained. It reproduces the original data using a limited number of Principal Components, but it also shows some deficiencies. Indeed, PCA does not perform well in finding the solution to the so-called Blind Source Separation (BSS) problem, i.e. in recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Usually, the uncorrelation condition is not strong enough, and it has been proven that the BSS problem can be tackled by imposing on the components that they be independent. Independent Component Analysis (ICA) is, in fact, another popular technique adopted to approach this problem, and it can be used in all those fields where PCA is also applied. An ICA approach enables us to explain the time series imposing a fewer number of constraints on the model, and to reveal anomalies in the data such as transient signals. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mix of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here we present the application of the vbICA technique to GPS position time series. First, we use vbICA on synthetic data that simulate a seismic cycle

  3. Analysis of Surface Water Pollution in the Kinta River Using Multivariate Technique

    International Nuclear Information System (INIS)

    Hamza Ahmad Isiyaka; Hafizan Juahir

    2015-01-01

    This study aims to investigate the spatial variation in the characteristics of water quality monitoring sites, identify the most significant parameters and the major possible sources of pollution, and apportion the source categories in the Kinta River. 31 parameters collected from eight monitoring sites over eight years (2006-2013) were employed. The eight monitoring stations were spatially grouped into three independent clusters in a dendrogram. A drastic reduction in the number of monitored parameters, from 31 to eight and nine significant parameters (P<0.05), was achieved using forward stepwise and backward stepwise discriminant analysis (DA). Principal component analysis (PCA) accounted for more than 76 % of the total variance and attributes the sources of pollution to anthropogenic and natural processes. Source apportionment using combined multiple linear regression and principal component scores indicates that 41 % of the total pollution load is from rock weathering and untreated waste water, 26 % from waste discharge, 24 % from surface runoff and 7 % from faecal waste. This study proposes a reduction in the number of monitoring stations and parameters for cost-effective and time-efficient monitoring, and shows that multivariate techniques can provide a simple representation of complex and dynamic water quality characteristics. (author)

  4. Carbon dioxide capture and separation techniques for advanced power generation point sources

    Energy Technology Data Exchange (ETDEWEB)

    Pennline, H.W.; Luebke, D.R.; Morsi, B.I.; Heintz, Y.J.; Jones, K.L.; Ilconich, J.B.

    2006-09-01

    The capture/separation step for carbon dioxide (CO2) from large point sources is a critical one with respect to the technical feasibility and cost of the overall carbon sequestration scenario. For large point sources, such as those found in power generation, the carbon dioxide capture techniques being investigated by the in-house research area of the National Energy Technology Laboratory possess the potential for improved efficiency and costs as compared to more conventional technologies. The investigated techniques can have wide applications, but the research has focused on capture/separation of carbon dioxide from flue gas (post-combustion from fossil fuel-fired combustors) and from fuel gas (pre-combustion, such as integrated gasification combined cycle - IGCC). With respect to fuel gas applications, novel concepts are being developed in wet scrubbing with physical absorption; chemical absorption with solid sorbents; and separation by membranes. In one concept, a wet scrubbing technique is being investigated that uses a physical solvent process to remove CO2 from the fuel gas of an IGCC system at elevated temperature and pressure. The need to define an ideal solvent has led to the study of the solubility and mass transfer properties of various solvents. Fabrication techniques and mechanistic studies for hybrid membranes separating CO2 from the fuel gas produced by coal gasification are also being performed. Membranes that consist of CO2-philic silanes incorporated into an alumina support, or ionic liquids encapsulated into a polymeric substrate, have been investigated for permeability and selectivity. An overview of these two novel techniques is presented along with the research progress status of each technology.

  5. PIXE Analysis and source identification of airborne particulate matter collected in Downtown Havana City

    International Nuclear Information System (INIS)

    Perez, G.; Pinnera, I; Ramos, M; Guibert, R; Molina, E.; Martinez, M.; Fernandez, A.; Aldape, F.; Flores, M.

    2009-01-01

    A set of samples containing airborne particulate matter (in two particle size fractions, PM10 and PM2.5) collected during five months from November 2006 to April 2007 in an urban area of Havana City were analyzed by the Particle-Induced X-ray Emission (PIXE) technique, and the concentrations of 14 elements (S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Ni, Cu, Zn, Br and Pb) were determined consistently in both particle size fractions, with minimum detection limits in the range of ng/m³. A Gent air sampler was used to collect the PM10 and PM2.5 aerosol fractions simultaneously, and the PIXE elemental analyses were performed using a 2.5 MeV proton beam from the 2 MV Van de Graaff Tandetron Accelerator at the ININ PIXE Laboratory in Mexico. The analytical database provided by PIXE was statistically analyzed in order to determine the promising local pollution sources. The statistical techniques of multivariate factor analysis in combination with principal component analysis were applied to these data and allowed the identification of five main pollution sources of airborne particulate matter (PM10 and PM2.5) collected in this area. The main (local) identified sources were: soil dust, sea spray, industry, fossil fuel combustion from motor vehicles, and burning or incineration of diverse materials. A general discussion of these results is presented in this work. (Author)

  6. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and microprobe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. Applications of these nuclear analysis techniques in the environmental sciences are presented and reviewed

  7. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
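
    The detection-performance quantification mentioned above reduces, in its simplest form, to sweeping a threshold over a detection statistic and tabulating probability of detection against probability of false alarm. A minimal sketch follows; the healthy/faulty statistics are synthetic stand-ins, not the paper's experimental data.

        import numpy as np

        rng = np.random.default_rng(1)
        healthy = rng.normal(0.0, 1.0, 500)      # detection statistic, no fault present
        faulty = rng.normal(1.5, 1.0, 500)       # detection statistic, seeded fault

        for t in np.linspace(-1.0, 4.0, 6):      # sweep the alarm threshold
            pfa = (healthy > t).mean()           # probability of false alarm
            pod = (faulty > t).mean()            # probability of detection
            print(f"threshold {t:5.2f}: PFA = {pfa:.2f}, POD = {pod:.2f}")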

  8. Novel techniques for characterization of hydrocarbon emission sources in the Barnett Shale

    Science.gov (United States)

    Nathan, Brian Joseph

    Changes in ambient atmospheric hydrocarbon concentrations can have both short-term and long-term effects on the atmosphere and on human health. Thus, accurate characterization of emission sources is critically important. The recent boom in shale gas production has led to an increase in hydrocarbon emissions from associated processes, though the exact extent is uncertain. As an original quantification technique, multiple flights of a model airplane equipped with a specially designed, open-path methane sensor were made over a natural gas compressor station in the Barnett Shale in October 2013. A linear optimization was introduced into a standard Gaussian plume model in an effort to determine the most probable emission rate coming from the station. This is shown to be a suitable approach given an ideal source with a single, central plume. Separately, an analysis was performed to characterize the nonmethane hydrocarbons in the Barnett during the same period. Starting with ambient hourly concentration measurements of forty-six hydrocarbon species, Lagrangian air parcel trajectories were implemented in a meteorological model to extend the resolution of these measurements and achieve domain-fillings of the region for the period of interest. A self-organizing map (a type of unsupervised classification) was then utilized to reduce the dimensionality of the total multivariate set of grids into characteristic one-dimensional signatures. By also introducing a self-organizing map classification of the contemporaneous wind measurements, the spatial hydrocarbon characterizations are analyzed for periods with similar wind conditions. The accuracy of the classification is verified through assessment of observed spatial mixing ratio enhancements of key species, through site comparisons with a related long-term study, and through a random forest analysis (an ensemble learning method of supervised classification) to determine the most important species for defining key classes. The hydrocarbon
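
    Because a Gaussian plume concentration field is linear in the emission rate Q, the "most probable emission rate" fit described above can be written as a one-parameter least-squares problem. The sketch below assumes invented dispersion curves, flight geometry and noise; it is not the dissertation's code.

        import numpy as np

        def plume_unit(x, y, z, u=3.0, H=5.0):
            """Gaussian plume concentration for unit emission rate Q = 1
            at downwind x, crosswind y, height z (assumed dispersion curves)."""
            sig_y = 0.08 * x / np.sqrt(1.0 + 0.0001 * x)
            sig_z = 0.06 * x / np.sqrt(1.0 + 0.0015 * x)
            lateral = np.exp(-y**2 / (2.0 * sig_y**2))
            vertical = (np.exp(-(z - H)**2 / (2.0 * sig_z**2))
                        + np.exp(-(z + H)**2 / (2.0 * sig_z**2)))
            return lateral * vertical / (2.0 * np.pi * u * sig_y * sig_z)

        rng = np.random.default_rng(2)
        x = rng.uniform(50.0, 300.0, 40)              # synthetic flight-track points
        y = rng.normal(0.0, 20.0, 40)
        z = rng.uniform(10.0, 40.0, 40)

        true_Q = 0.8                                  # g/s, synthetic truth
        obs = true_Q * plume_unit(x, y, z) * rng.normal(1.0, 0.1, 40)

        basis = plume_unit(x, y, z)                   # model response to Q = 1
        Q_hat = basis @ obs / (basis @ basis)         # closed-form least squares
        print(f"recovered emission rate: {Q_hat:.2f} g/s")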

  9. Water quality assessment and apportionment of pollution sources of Gomti river (India) using multivariate statistical techniques--a case study

    International Nuclear Information System (INIS)

    Singh, Kunwar P.; Malik, Amrita; Sinha, Sarita

    2005-01-01

    Multivariate statistical techniques, such as cluster analysis (CA), factor analysis (FA), principal component analysis (PCA) and discriminant analysis (DA), were applied to the data set on water quality of the Gomti river (India), generated during three years (1999-2001) of monitoring at eight different sites for 34 parameters (9792 observations). This study presents the usefulness of multivariate statistical techniques for the evaluation and interpretation of large complex water quality data sets and the apportionment of pollution sources/factors, with a view to obtaining better information about the water quality and designing a monitoring network for effective management of water resources. Three significant groups of sampling sites, upper catchment (UC), middle catchment (MC) and lower catchment (LC), were obtained through CA on the basis of similarity between them. FA/PCA applied to the data sets pertaining to the three catchment regions of the river resulted in seven, seven and six latent factors, respectively, responsible for the data structure, explaining 74.3, 73.6 and 81.4% of the total variance of the respective data sets. These included the trace metals group (leaching from soil and industrial waste disposal sites), the organic pollution group (municipal and industrial effluents), the nutrients group (agricultural runoff), and alkalinity, hardness, EC and solids (soil leaching and runoff processes). DA showed the best results for data reduction and pattern recognition in both temporal and spatial analysis. It rendered five parameters (temperature, total alkalinity, Cl, Na and K) affording more than 94% correct assignations in temporal analysis, and 10 parameters (river discharge, pH, BOD, Cl, F, PO4, NH4-N, NO3-N, TKN and Zn) affording 97% correct assignations in spatial analysis of the three different regions of the basin. Thus, DA allowed a reduction in the dimensionality of the large data set, delineating a few indicator parameters responsible for large variations in water quality. Further

  10. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their applicability to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and the extension of the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)

  11. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques covering several different aspects of materials research are presented in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  12. Detection, Source Location, and Analysis of Volcano Infrasound

    Science.gov (United States)

    McKee, Kathleen F.

    The study of volcano infrasound focuses on low frequency sound from volcanoes, how volcanic processes produce it, and the path it travels from the source to our receivers. In this dissertation we focus on detecting, locating, and analyzing infrasound from a number of different volcanoes using a variety of analysis techniques. These works will help inform future volcano monitoring using infrasound with respect to infrasonic source location, signal characterization, volatile flux estimation, and back-azimuth to source determination. Source location is an important component of the study of volcano infrasound and of its application to volcano monitoring. Semblance is a forward grid search technique and a common source location method in infrasound studies as well as in seismology. We evaluated the effectiveness of semblance in the presence of significant topographic features for explosions of Sakurajima Volcano, Japan, while taking into account temperature and wind variations. We show that topographic obstacles at Sakurajima cause a semblance source location offset of 360-420 m to the northeast of the actual source location. In addition, we found that, despite the consistent offset in source location, semblance can still be a useful tool for determining periods of volcanic activity. Infrasonic signal characterization follows signal detection and source location in volcano monitoring in that it informs us of the type of volcanic activity detected. In large volcanic eruptions the lowermost portion of the eruption column is momentum-driven and termed the volcanic jet or gas-thrust zone. This turbulent fluid flow perturbs the atmosphere and produces a sound similar to that of jet and rocket engines, known as jet noise. We deployed an array of infrasound sensors near an accessible, less hazardous, fumarolic jet at Aso Volcano, Japan, as an analogue to large, violent volcanic eruption jets. We recorded volcanic jet noise at 57.6° from vertical, a recording angle not normally feasible
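
    A minimal sketch of the semblance grid search named above: for each trial source node, traces are aligned by predicted travel times and their coherence is scored, and the node maximizing the score is the estimated location. Array geometry, wave speed and waveform are all assumed for illustration; real applications would also handle topography, temperature and wind, as the record notes.

        import numpy as np

        fs, c = 100.0, 340.0                            # sample rate (Hz), sound speed (m/s)
        sensors = np.array([[0.0, 0.0], [500.0, 0.0], [0.0, 500.0], [500.0, 500.0]])
        src = np.array([180.0, 320.0])                  # synthetic true source

        t = np.arange(0.0, 10.0, 1.0 / fs)
        rng = np.random.default_rng(3)
        # Gaussian pulse emitted at t = 3 s, delayed by propagation to each sensor.
        traces = np.array([np.exp(-(t - 3.0 - np.linalg.norm(s - src) / c)**2 / 0.01)
                           + 0.1 * rng.standard_normal(t.size) for s in sensors])

        def semblance(node):
            """Align traces by predicted travel times from the node, then score coherence."""
            shifts = [int(round(np.linalg.norm(s - node) / c * fs)) for s in sensors]
            aligned = np.array([np.roll(tr, -k) for tr, k in zip(traces, shifts)])
            return (aligned.sum(axis=0)**2).sum() / (len(sensors) * (aligned**2).sum())

        grid = [np.array([gx, gy], float)
                for gx in range(0, 501, 20) for gy in range(0, 501, 20)]
        print("best grid node:", max(grid, key=semblance))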

  13. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...
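
    The record is truncated, but the optimal filtering it names for transient searches is matched filtering: correlating the data against a waveform template and normalizing to an SNR-like statistic. A toy white-noise sketch follows; the chirp template and injected signal are assumed for illustration, not drawn from a real detection pipeline.

        import numpy as np

        rng = np.random.default_rng(4)
        fs = 1024
        t = np.arange(0.0, 4.0, 1.0 / fs)

        # Toy chirp template localized near t = 2 s (stand-in for a waveform model).
        template = np.sin(2.0 * np.pi * (30.0 + 40.0 * t) * t) * np.exp(-(t - 2.0)**2)

        data = rng.standard_normal(t.size)            # unit-variance white noise
        data += 0.3 * np.roll(template, fs // 2)      # weak signal injected 0.5 s late

        # Correlate and normalize so the statistic has unit variance under noise alone.
        corr = np.correlate(data, template, mode="same")
        snr = corr / np.sqrt((template**2).sum())
        peak = np.abs(snr).argmax()
        print(f"peak |SNR| = {np.abs(snr[peak]):.1f} at t = {t[peak]:.2f} s")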

  14. LED intense headband light source for fingerprint analysis

    Science.gov (United States)

    Villa-Aleman, Eliel

    2005-03-08

    A portable, lightweight and high-intensity light source for detecting and analyzing fingerprints during field investigation. On-site field analysis requires long hours of mobile analysis. In one embodiment, the present invention comprises a plurality of light emitting diodes; a power source; and a personal attachment means; wherein the light emitting diodes are powered by the power source, and wherein the power source and the light emitting diodes are attached to the personal attachment means to produce a personal light source for on-site analysis of latent fingerprints. The present invention is available for other applications as well.

  15. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments), are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also discussed, as are other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and contaminants including pesticides and antibiotics. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  16. Measuring Trace Gas Emission from Multi-Distributed Sources Using Vertical Radial Plume Mapping (VRPM and Backward Lagrangian Stochastic (bLS Techniques

    Directory of Open Access Journals (Sweden)

    Thomas K. Flesch

    2011-09-01

    Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The vertical radial plume mapping (VRPM) and the backward Lagrangian stochastic (bLS) techniques with an open-path optical spectroscopic sensor were evaluated for relative accuracy for multiple emission-source and sensor configurations. The relative accuracy was calculated by dividing the measured emission rate by the actual emission rate; thus, a relative accuracy of 1.0 represents a perfect measure. For a single area emission source, the VRPM technique yielded a somewhat high relative accuracy of 1.38 ± 0.28. The bLS technique resulted in a relative accuracy close to unity, 0.98 ± 0.24. Relative accuracies for dual source emissions for the VRPM and bLS techniques were somewhat similar to single source emissions, 1.23 ± 0.17 and 0.94 ± 0.24, respectively. When the bLS technique was used with vertical point concentrations, the relative accuracy was unacceptably low,

  17. Technique for measuring cooling patterns in ion source grids by infrared scanning

    International Nuclear Information System (INIS)

    Grisham, L.R.; Eubank, H.P.; Kugel, H.W.

    1980-02-01

    Many plasma sources designed for neutral beam injection heating of plasmas now employ copper beam acceleration grids which are water-cooled by small capillary tubes fed from one or more headers. To prevent thermally-induced warpage of these grids it is essential that one be able to detect inhomogeneities in the cooling. Due to the very strong thermal coupling between adjacent cooling lines and the concomitant rapid equilibration times, it is not practical to make such measurements in a direct manner with a contact thermometer. We have developed a technique whereby we send a burst of hot water through an initially cool grid, followed by a burst of cool water, and record the transient thermal behavior using an infrared television camera. This technique, which would be useful for any system with cooling paths that are strongly coupled thermally, has been applied to a number of sources built for the PLT and PDX tokamaks, and has proven highly effective in locating cooling deficiencies and blocked capillary tubes

  18. Detection and monitoring of pollutant sources with Lidar/Dial techniques

    International Nuclear Information System (INIS)

    Gaudio, P; Gelfusa, M; Malizia, A; Parracino, S; Richetta, M; De Leo, L; Perrimezzi, C; Bellecci, C

    2015-01-01

    It is well known that air pollution due to anthropogenic sources can have adverse effects on humans and the ecosystem. Therefore, in recent years, surveying large regions of the atmosphere in an automatic way has become a strategic objective of various public health organizations for the early detection of pollutant sources in urban and industrial areas. The Lidar and Dial techniques have become well established laser-based methods for the remote sensing of the atmosphere. They are often implemented to probe almost any level of the atmosphere and to acquire information to validate theoretical models on different topics of atmospheric physics. They can also be used for environmental surveying by monitoring particles, aerosols and molecules. The aim of the present work is to demonstrate the potential of these methods to detect pollutants emitted from local sources (such as particulate and/or chemical compounds) and to evaluate their concentration. This is exemplified with the help of experimental data acquired in an industrial area in the south of Italy during an experimental campaign using a simulated pollutant source. For this purpose, two mobile systems, Lidar and Dial, have been developed by the authors. The operating principles of the systems and the results of the experimental campaign are presented. (paper)

  19. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  20. An alternative technique for simulating volumetric cylindrical sources in the Morse code utilization

    International Nuclear Information System (INIS)

    Vieira, W.J.; Mendonca, A.G.

    1985-01-01

    In the solution of deep-penetration problems using the Monte Carlo method, calculation techniques and strategies are used in order to increase the particle population in the regions of interest. A common procedure is the coupling of bidimensional calculations, with (r,z) discrete ordinates transformed into source data, and tridimensional Monte Carlo calculations. An alternative technique for this procedure is presented. This alternative proved effective when applied to a sample problem. (F.E.)
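
    As a small aside on volumetric cylindrical sources of the kind this record concerns, the sketch below samples source positions uniformly within a cylinder; the square root on the radial coordinate is what keeps the density uniform in volume. This is a generic illustration, not the coupling technique of the paper itself.

        import numpy as np

        rng = np.random.default_rng(5)

        def sample_cylinder(n, R, H):
            """Sample n points uniformly within a cylinder of radius R and height H."""
            r = R * np.sqrt(rng.random(n))       # sqrt keeps the areal density uniform
            phi = 2.0 * np.pi * rng.random(n)
            z = H * rng.random(n)
            return np.column_stack([r * np.cos(phi), r * np.sin(phi), z])

        print(sample_cylinder(3, R=10.0, H=50.0))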

  1. Nuclear microprobe analysis and source apportionment of individual atmospheric aerosol particles

    International Nuclear Information System (INIS)

    Artaxo, P.; Rabello, M.L.C.; Watt, F.; Grime, G.; Swietlicki, E.

    1993-01-01

    In atmospheric aerosol research, one key issue is to determine the sources of the airborne particles. Bulk PIXE analysis coupled with receptor modeling provides a useful, but limited, view of the aerosol sources influencing one particular site or sample. The scanning nuclear microprobe (SNM) technique is a microanalytical technique that gives unique information on individual aerosol particles. In the SNM analyses a 1.0 μm, 2.4 MeV proton beam from the Oxford SNM was used. Trace elements with Z>11 were measured by the particle induced X-ray emission (PIXE) method with detection limits in the 1-10 ppm range. Carbon, nitrogen and oxygen were measured simultaneously using Rutherford backscattering spectrometry (RBS). Atmospheric aerosol particles were collected at the Brazilian Antarctic Station and at biomass burning sites in the Amazon basin tropical rain forest in Brazil. In the Antarctic samples, sea-salt aerosol particles clearly predominated, with NaCl and CaSO4 as major compounds together with several trace elements such as Al, Si, P, K, Mn, Fe, Ni, Cu, Zn, Br, Sr, and Pb. Factor analysis of the elemental data showed the presence of four components: 1) soil dust particles; 2) NaCl particles; 3) CaSO4 with Sr; and 4) Br and Mg. Strontium, observed at 20-100 ppm levels, was always present in the CaSO4 particles. The hierarchical cluster procedure gave results similar to the ones obtained through factor analysis. For the tropical rain forest biomass burning aerosol emissions, biogenic particles with a high organic content dominate the particle population, while K, P, Ca, Mg, Zn, and Si are the dominant elements. Zinc at 10-200 ppm is present in biogenic particles rich in P and K. The quantitative capability and excellent detection limits make SNM analysis of individual aerosol particles a very powerful analytical tool. (orig.)

  2. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of road texture characteristics and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  3. Adaptive distributed source coding.

    Science.gov (United States)

    Varodayan, David; Lin, Yao-Chung; Girod, Bernd

    2012-05-01

    We consider distributed source coding in the presence of hidden variables that parameterize the statistical dependence among sources. We derive the Slepian-Wolf bound and devise coding algorithms for a block-candidate model of this problem. The encoder sends, in addition to syndrome bits, a portion of the source to the decoder uncoded as doping bits. The decoder uses the sum-product algorithm to simultaneously recover the source symbols and the hidden statistical dependence variables. We also develop novel techniques based on density evolution (DE) to analyze the coding algorithms. We experimentally confirm that our DE analysis closely approximates practical performance. This result allows us to efficiently optimize parameters of the algorithms. In particular, we show that the system performs close to the Slepian-Wolf bound when an appropriate doping rate is selected. We then apply our coding and analysis techniques to a reduced-reference video quality monitoring system and show a bit rate saving of about 75% compared with fixed-length coding.

  4. Techniques of production and analysis of polarized synchrotron radiation

    International Nuclear Information System (INIS)

    Mills, D.M.

    1992-01-01

    The use of the unique polarization properties of synchrotron radiation in the hard x-ray spectral region (E > 3 keV) is becoming increasingly important to many synchrotron radiation researchers. The radiation emitted from bending magnets and conventional (planar) insertion devices (IDs) is highly linearly polarized in the plane of the particle's orbit. Elliptically polarized x-rays can also be obtained by going off axis on a bending magnet source, albeit with considerable loss of flux. The polarization properties of synchrotron radiation can be further tailored to the researcher's specific needs through the use of specialized insertion devices such as helical and crossed undulators and asymmetrical wigglers. Even with the possibility of producing a specific polarization, there is still the need to develop x-ray optical components which can manipulate the polarization for both analysis and further modification of the polarization state. A survey of techniques for producing and analyzing both linear and circular polarized x-rays will be presented with emphasis on those techniques which rely on single crystal optical components

  5. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre

  6. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
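
    Of the techniques listed above, the CUSUM is the easiest to illustrate: cumulative sums of standardized inventory differences flag a small persistent loss that individual balances would not. In the sketch below, the allowance k, decision threshold h and the synthetic loss are illustrative choices, not values from the document.

        import numpy as np

        rng = np.random.default_rng(6)
        ids = rng.normal(0.0, 1.0, 30)        # standardized inventory differences
        ids[15:] += 0.8                       # small sustained loss begins at period 15

        k, h = 0.5, 4.0                       # allowance and threshold (sigma units)
        s_hi = 0.0
        for i, x in enumerate(ids):
            s_hi = max(0.0, s_hi + x - k)     # one-sided upper CUSUM
            if s_hi > h:
                print(f"alarm at balance period {i}, CUSUM = {s_hi:.2f}")
                break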

  7. Estimating photometric redshifts for X-ray sources in the X-ATLAS field using machine-learning techniques

    Science.gov (United States)

    Mountrichas, G.; Corral, A.; Masoura, V. A.; Georgantopoulos, I.; Ruiz, A.; Georgakakis, A.; Carrera, F. J.; Fotopoulou, S.

    2017-12-01

    We present photometric redshifts for 1031 X-ray sources in the X-ATLAS field using the machine-learning technique TPZ. X-ATLAS covers 7.1 deg2 observed with XMM-Newton within the Science Demonstration Phase of the H-ATLAS field, making it one of the largest contiguous areas of the sky with both XMM-Newton and Herschel coverage. All of the sources have available SDSS photometry, while 810 additionally have mid-IR and/or near-IR photometry. A spectroscopic sample of 5157 sources primarily in the XMM/XXL field, but also from several X-ray surveys and the SDSS DR13 redshift catalogue, was used to train the algorithm. Our analysis reveals that the algorithm performs best when the sources are split, based on their optical morphology, into point-like and extended sources. Optical photometry alone is not enough to estimate accurate photometric redshifts, but the results greatly improve when at least mid-IR photometry is added in the training process. In particular, our measurements show that the estimated photometric redshifts for the X-ray sources of the training sample have a normalized absolute median deviation, nmad ≈ 0.06, and a percentage of outliers, η = 10-14%, depending upon whether the sources are extended or point like. Our final catalogue contains photometric redshifts for 933 out of the 1031 X-ray sources with a median redshift of 0.9. The table of the photometric redshifts is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/608/A39

  8. Risk analysis of geothermal power plants using Failure Modes and Effects Analysis (FMEA) technique

    International Nuclear Information System (INIS)

    Feili, Hamid Reza; Akar, Navid; Lotfizadeh, Hossein; Bairampour, Mohammad; Nasiri, Sina

    2013-01-01

    Highlights: • Using Failure Modes and Effects Analysis (FMEA) to find potential failures in geothermal power plants. • We considered 5 major parts of geothermal power plants for risk analysis. • Risk Priority Number (RPN) is calculated for all failure modes. • Corrective actions are recommended to eliminate or decrease the risk of failure modes. - Abstract: Renewable energy plays a key role in the transition toward a low carbon economy and the provision of a secure supply of energy. Geothermal energy is a versatile form of renewable energy that meets popular demand. Since geothermal power plants (GPPs) face various failures, a team engineering technique for eliminating or reducing potential failures is of considerable value. Because no published record of an FMEA applied to GPPs with common failure modes has been found to date, this paper considers the utilization of Failure Modes and Effects Analysis (FMEA) as a convenient technique for determining, classifying and analyzing common failures in typical GPPs. As a result, an appropriate risk scoring of the occurrence, detection and severity of failure modes, and computation of the Risk Priority Number (RPN) for detecting high-potential failures, is achieved. XFMEA software is utilized to expedite the process and improve the accuracy of the analysis. Moreover, 5 major parts of a GPP are studied to propose a suitable approach for developing GPPs and increasing reliability by recommending corrective actions for each failure mode
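
    The RPN bookkeeping at the heart of an FMEA is simple to sketch: each failure mode receives 1-10 scores for severity (S), occurrence (O) and detection (D), and RPN = S x O x D ranks the modes for corrective action. The failure modes and scores below are invented for illustration and are not taken from the study.

        # Each failure mode: (description, severity, occurrence, detection), scored 1-10.
        failure_modes = [
            ("scaling in production well", 7, 6, 5),
            ("turbine blade erosion", 8, 4, 3),
            ("H2S leak at separator", 9, 3, 4),
            ("condenser fouling", 5, 7, 2),
        ]

        # Rank by RPN, highest risk first.
        ranked = sorted(failure_modes, key=lambda m: m[1] * m[2] * m[3], reverse=True)
        for name, s, o, d in ranked:
            print(f"RPN {s * o * d:4d}  (S={s}, O={o}, D={d})  {name}")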

  9. Impedance Source Power Electronic Converters

    DEFF Research Database (Denmark)

    Liu, Yushan; Abu-Rub, Haitham; Ge, Baoming

    Impedance Source Power Electronic Converters brings together state-of-the-art knowledge and cutting-edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key features: comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies; full explanation of the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding

  10. Recent advances in the instrumental techniques for the analysis of modern materials (II)

    International Nuclear Information System (INIS)

    Ahmed, M.

    1990-01-01

    Inductively Coupled Plasma Mass Spectrometry (ICP-MS) is a logical development of the equally established sister technique ICP-AES, discussed in part 1 of this series of articles on modern analytical techniques. The rapid adoption of the argon plasma as an ion source for time-of-flight and quadrupole mass analysers has led to the development of a truly integrated instrumental technique for the analysis of solutions and slurries. The powerful combination with a laser ablation device has made the direct analysis of geological, geochemical and other complex conducting and non-conducting samples possible at sub-ppm levels in days rather than months. Parallel developments in computer hardware and software have made instrumental optimization easy, enabling the generation of meaningful analytical data as a matter of routine. The limitations imposed by spectroscopic and non-spectroscopic interferences have restricted the variety of matrices and materials covered by ICP-MS and LA-ICP-MS. The technique has demonstrated formidable analytical power in wide areas of industrial, environmental, social and biological analysis, and for the breakthrough advanced materials used in space, mass communication and transportation, as well as in general areas of advanced analytical chemistry. It is expected that, in combination with other instrumental methods such as HPLC, ion chromatography, etc., ICP-MS will continue to dominate well into the 21st century. (author)

  11. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases the security issue. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis will help us to identify the existing loopholes and will give strategic direction for making the Android operating system more secure.

  12. Comparison of Estimation Techniques for Vibro-Acoustic Transfer Path Analysis

    Directory of Open Access Journals (Sweden)

    Paulo Eduardo França Padilha

    2006-01-01

    Vibro-acoustic Transfer Path Analysis (TPA) is a tool to evaluate the contribution of different energy propagation paths between a source and a receiver, linked to each other by a number of connections. TPA is typically used to quantify and rank the relative importance of these paths in a given frequency band, determining the most significant one to the receiver. Basically, two quantities have to be determined for TPA: the operational forces at each transfer path and the Frequency Response Functions (FRF) of these paths. The FRF are obtained either experimentally or analytically, and the influence of the mechanical impedance of the source may or may not be taken into account. The operational forces can be directly obtained from measurements using force transducers or indirectly estimated from auxiliary response measurements. Two methods to obtain the operational forces indirectly - the Complex Stiffness Method (CSM) and the Matrix Inversion Method (MIM) - associated with two possible configurations to determine the FRF - including and excluding the source impedance - are presented and discussed in this paper. The effect of weak and strong coupling among the paths is also discussed in connection with the techniques presented. The main conclusion is that, with the source removed, CSM gives more accurate results; with the source present, MIM is preferable. In the latter case, CSM should be used only if there is a high impedance mismatch between the source and the receiver. Neither method is affected by a higher or lower degree of coupling among the transfer paths.
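
    A brief sketch of the Matrix Inversion Method step discussed above: at each frequency line, operational path forces are recovered by pseudo-inverting a measured FRF matrix that maps path forces to indicator responses, with more indicators than paths for conditioning. All matrices below are synthetic stand-ins for measured FRFs, not data from the paper.

        import numpy as np

        rng = np.random.default_rng(7)
        n_paths, n_indicators = 3, 6          # over-determination improves conditioning

        # Synthetic complex FRF matrix (indicator response per unit path force).
        H = (rng.standard_normal((n_indicators, n_paths))
             + 1j * rng.standard_normal((n_indicators, n_paths)))
        f_true = np.array([1.0 + 0.5j, -0.3j, 0.2])                # operational forces
        a = H @ f_true + 0.01 * rng.standard_normal(n_indicators)  # measured responses

        f_est = np.linalg.pinv(H) @ a         # least-squares force estimate at this line
        print("estimated forces:", np.round(f_est, 3))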

  13. Comparative study of CdTe sources used for deposition of CdTe thin films by close spaced sublimation technique

    Directory of Open Access Journals (Sweden)

    Wagner Anacleto Pinheiro

    2006-03-01

    Unlike other thin film deposition techniques, close spaced sublimation (CSS) requires a short source-substrate distance. The kind of source used in this technique strongly affects the control of the deposition parameters, especially the deposition rate. When depositing CdTe thin films by CSS, the most common CdTe sources are: single-crystal or polycrystalline wafers; powders, pellets or pieces; a thick CdTe film deposited onto a glass or molybdenum substrate (CdTe source-plate); and a sintered CdTe powder. In this work, CdTe thin films were deposited by the CSS technique from different CdTe sources: particles, powder, compacted powder, a paste made of CdTe and propylene glycol, and source-plates (CdTe/Mo and CdTe/glass). The largest deposition rate was achieved when the paste made of CdTe and propylene glycol was used as the source. CdTe source-plates led to lower rates, probably due to poor heat transmission caused by the introduction of the plate substrate. The results also showed that compacting the powder increases the deposition rate due to the better thermal contact between powder particles.

  14. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis and database systems, which makes it widely applicable and very demanding of skills. There is a lot of commercial GIS software which is well advertised and whose functionality is fairly well known, while open source software tends to be forgotten. In this diploma work, an analysis is made of the open source GIS software available on the Internet, in the scope of different projects interr...

  15. Operational techniques employed for the liquid sodium source term control loops

    International Nuclear Information System (INIS)

    Chulos, L.E.

    1976-01-01

    Four Source Term Control Loops (STCLs) have been designed, constructed, and placed into operation at the Hanford Engineering Development Laboratory (HEDL) as part of the Radioactivity Control Technology program. The data obtained are used to determine the corrosion and deposition of LMFBR materials, including corrosion product radionuclides, in a non-isothermal flowing sodium system. The paper discusses operation of the STCL Facilities and, in particular, the methods used for controlling the oxygen content of the liquid sodium. These methods include cold trapping techniques, hot trapping, seeding the cold traps with sodium oxide, and precipitating the oxygen in the cold trap in a controlled manner. Operational problems encountered with the STCL Facilities and the techniques for correcting these problems are also discussed

  16. PyLDM - An open source package for lifetime density analysis of time-resolved spectroscopic data.

    Directory of Open Access Journals (Sweden)

    Gabriel F Dorlhiac

    2017-05-01

    Ultrafast spectroscopy offers temporal resolution for probing processes in the femto- and picosecond regimes. This has allowed for investigation of energy and charge transfer in numerous photoactive compounds and complexes. However, analysis of the resultant data can be complicated, particularly in more complex biological systems, such as photosystems. Historically, the dual approach of global analysis and target modelling has been used to elucidate kinetic descriptions of the system and the identity of transient species, respectively. With regard to the former, the technique of lifetime density analysis (LDA) offers an appealing alternative. While global analysis approximates the data as the sum of a small number of exponential decays, typically on the order of 2-4, LDA uses a semi-continuous distribution of 100 lifetimes. This allows for the elucidation of lifetime distributions, which may be expected from investigation of complex systems with many chromophores, as opposed to averages. Furthermore, the inherent assumption of linear combinations of decays in global analysis means the technique is unable to describe dynamic motion, a process which is resolvable with LDA. The technique was introduced to the field of photosynthesis over a decade ago by the Holzwarth group. The analysis has been demonstrated to be an important tool for evaluating complex dynamics such as photosynthetic energy transfer, and complements traditional global and target analysis techniques. Although the theory has been well described, no open source code has so far been available to perform lifetime density analysis. We therefore introduce a Python (2.7) based package, PyLDM, to address this need. We furthermore provide a direct comparison of the capabilities of LDA with those of the more familiar global analysis, as well as providing a number of statistical techniques for dealing with the regularization of noisy data.
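
    The core computation of lifetime density analysis can be sketched as a Tikhonov-regularized least-squares fit of a bank of ~100 log-spaced exponential decays to a kinetic trace; PyLDM adds regularization-parameter selection and full time-wavelength handling on top of this. The trace, lifetime grid and regularization strength below are invented for illustration.

        import numpy as np

        t = np.linspace(0.05, 100.0, 400)                 # delay times (arbitrary units)
        taus = np.logspace(-1, 3, 100)                    # 100 log-spaced trial lifetimes
        D = np.exp(-t[:, None] / taus[None, :])           # decay design matrix

        rng = np.random.default_rng(8)
        trace = (0.7 * np.exp(-t / 2.0) + 0.3 * np.exp(-t / 40.0)
                 + 0.01 * rng.standard_normal(t.size))    # synthetic two-lifetime kinetic

        alpha = 1.0                                       # Tikhonov regularization strength
        x = np.linalg.solve(D.T @ D + alpha * np.eye(taus.size), D.T @ trace)

        for j in np.argsort(x)[::-1][:4]:                 # largest lifetime-density peaks
            print(f"lifetime ~ {taus[j]:8.2f}, amplitude {x[j]:+.3f}")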

  17. Carbon Dioxide Capture and Separation Techniques for Gasification-based Power Generation Point Sources

    Energy Technology Data Exchange (ETDEWEB)

    Pennline, H.W.; Luebke, D.R.; Jones, K.L.; Morsi, B.I. (Univ. of Pittsburgh, PA); Heintz, Y.J. (Univ. of Pittsburgh, PA); Ilconich, J.B. (Parsons)

    2007-06-01

    The capture/separation step for carbon dioxide (CO2) from large-point sources is a critical one with respect to the technical feasibility and cost of the overall carbon sequestration scenario. For large-point sources, such as those found in power generation, the carbon dioxide capture techniques being investigated by the in-house research area of the National Energy Technology Laboratory possess the potential for improved efficiency and reduced costs as compared to more conventional technologies. The investigated techniques can have wide applications, but the research has focused on capture/separation of carbon dioxide from flue gas (post-combustion from fossil fuel-fired combustors) and from fuel gas (precombustion, such as integrated gasification combined cycle or IGCC). With respect to fuel gas applications, novel concepts are being developed in wet scrubbing with physical absorption; chemical absorption with solid sorbents; and separation by membranes. In one concept, a wet scrubbing technique is being investigated that uses a physical solvent process to remove CO2 from fuel gas of an IGCC system at elevated temperature and pressure. The need to define an ideal solvent has led to the study of the solubility and mass transfer properties of various solvents. Pertaining to another separation technology, fabrication techniques and mechanistic studies for membranes separating CO2 from the fuel gas produced by coal gasification are also being performed. Membranes that consist of CO2-philic ionic liquids encapsulated into a polymeric substrate have been investigated for permeability and selectivity. Finally, dry, regenerable processes based on sorbents are additional techniques for CO2 capture from fuel gas. An overview of these novel techniques is presented along with a research progress status of technologies related to membranes and physical solvents.

  18. New modes of particle accelerations techniques and sources. Formal report

    Energy Technology Data Exchange (ETDEWEB)

    Parsa, Z. [ed.]

    1996-12-31

    This Report includes copies of transparencies and notes from the presentations made at the Symposium on New Modes of Particle Accelerations - Techniques and Sources, August 19-23, 1996, at the Institute for Theoretical Physics, University of California, Santa Barbara, California, as made available by the authors. Editing, reduction and changes to the authors' contributions were made only to fulfill the printing and publication requirements. We would like to take this opportunity to thank the speakers for their informative presentations and for providing copies of their transparencies and notes for inclusion in this Report.

  19. New modes of particle accelerations techniques and sources. Formal report

    International Nuclear Information System (INIS)

    Parsa, Z.

    1996-01-01

    This Report includes copies of transparencies and notes from the presentations made at the Symposium on New Modes of Particle Accelerations - Techniques and Sources, August 19-23, 1996, at the Institute for Theoretical Physics, University of California, Santa Barbara, California, as made available by the authors. Editing, reduction and changes to the authors' contributions were made only to fulfill the printing and publication requirements. We would like to take this opportunity to thank the speakers for their informative presentations and for providing copies of their transparencies and notes for inclusion in this Report.

  20. Process sensors characterization based on noise analysis technique and artificial intelligence

    International Nuclear Information System (INIS)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos

    2005-01-01

    The time response of pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors - such as the location of the sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations - the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)

  1. Process sensors characterization based on noise analysis technique and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: rnavarro@ipen.br; sperillo@ipen.br; rcsantos@ipen.br

    2005-07-01

    The time response of pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors - such as the location of the sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations - the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work, a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals under different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural network scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)
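
    The blind separation of sources mentioned in both records can be illustrated with independent component analysis: mixed sensor records are decomposed into statistically independent signals without knowledge of the mixing. The sketch below uses scikit-learn's FastICA on synthetic signals; it stands in for, and is not, the neural-network scheme the authors describe.

        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(9)
        t = np.linspace(0.0, 8.0, 2000)
        s1 = np.sin(2.0 * np.pi * 1.3 * t)               # e.g. a process oscillation
        s2 = np.sign(np.sin(2.0 * np.pi * 0.4 * t))      # e.g. a cyclic disturbance
        S = np.column_stack([s1, s2]) + 0.05 * rng.standard_normal((t.size, 2))

        A = np.array([[1.0, 0.6], [0.4, 1.0]])           # unknown mixing: sensors see both
        X = S @ A.T                                      # two "sensor" records

        ica = FastICA(n_components=2, random_state=0)
        S_est = ica.fit_transform(X)                     # recovered sources (scale/order ambiguous)
        print("correlation of recovered vs true sources:")
        print(np.round(np.corrcoef(S_est.T, S.T)[0:2, 2:4], 2))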

  2. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described, including the reactor as a neutron source, sample activation in the reactor, the methodology of neutron activation analysis, sample transport into the reactor and sample packaging after irradiation, instrumental activation analysis with radiochemical separation, data measurement and evaluation, and sampling and sample preparation. Sources of environmental contamination with trace elements and the sampling and analysis of samples by neutron activation are described, covering the analysis of soils, waters and biological materials. Methods for evaluating neutron activation analysis results and for interpreting them for purposes of environmental control are also presented. (J.B.)

  3. Ion source techniques for high-speed processing of material surface by ion beams

    International Nuclear Information System (INIS)

    Ishikawa, Junzo

    1990-01-01

    The present paper discusses some key or candidate techniques for future ion source development, and such ion sources developed by the author. Several types of microwave ion sources for producing low charge state ions have been developed in Japan. When a microwave plasma cathode developed by the author is adapted to a Kaufman-type ion source, the electron emission currents are found to be 2.5 A for argon gas and 0.5-0.9 A for oxygen gas. An alternative ionization method for metal atoms is strongly required for high-speed processing of material surfaces by metal-ion beams. Detailed discussion is given of the collisional ionization of vaporized atoms and of negative-ion production (secondary negative-ion emission by sputtering). An impregnated electrode type liquid-metal ion source developed by the author, which has a porous tip structure, is described. The negative-ion production efficiency is quite high. The report also presents a neutral and ionized alkaline-metal bombardment type heavy negative-ion source, which consists of a cesium plasma ion source, suppressor, target electrode, negative-ion extraction electrode, and einzel lens. (N.K.)

  4. Anomaly metrics to differentiate threat sources from benign sources in primary vehicle screening.

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, Israel Dov; Mengesha, Wondwosen

    2011-09-01

    Discrimination of benign sources from threat sources at Ports of Entry (POE) is of great importance for the efficient screening of cargo and vehicles using Radiation Portal Monitors (RPMs). Currently, the ability of RPMs to distinguish these radiological sources is seriously hampered by the energy resolution of the deployed RPMs. As naturally occurring radioactive materials (NORM) are ubiquitous in commerce, false alarms are problematic: they require additional resources in secondary inspection, in addition to their impacts on commerce. To increase the sensitivity of such detection systems without increasing false alarm rates, alarm metrics need to incorporate the ability to distinguish benign from threat sources. Principal component analysis (PCA) and clustering techniques were implemented in the present study and investigated for their potential to lower false alarm rates and/or increase sensitivity to weaker threat sources without loss of specificity. Results of the investigation demonstrated improved sensitivity and specificity in discriminating benign sources from threat sources.

  5. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
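
    A sketch of the calibration-curve idea described above, assuming the measured mass attenuation coefficient varies linearly with gold fraction near the Au K-edge: known alloys define the line, and the Beer-Lambert attenuation of an unknown sample is inverted against it. All numbers are illustrative, not the paper's measurements.

        import numpy as np

        # Known alloys: (Au mass fraction, measured mass attenuation coefficient, cm2/g).
        w_known = np.array([0.375, 0.585, 0.750, 0.917, 1.000])
        mu_known = np.array([2.10, 2.65, 3.08, 3.52, 3.74])
        slope, intercept = np.polyfit(w_known, mu_known, 1)    # calibration line

        # Unknown sample: transmission I/I0 through thickness x (cm), density rho (g/cm3).
        I_over_I0, x, rho = 0.38, 0.02, 16.0                   # assumed measurement
        mu_unknown = -np.log(I_over_I0) / (rho * x)            # Beer-Lambert inversion
        w_est = (mu_unknown - intercept) / slope
        print(f"mu = {mu_unknown:.2f} cm2/g -> estimated Au fraction ~ {w_est:.2f}")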

  6. Efficacy of Blood Sources and Artificial Blood Feeding Methods in Rearing of Aedes aegypti (Diptera: Culicidae) for Sterile Insect Technique and Incompatible Insect Technique Approaches in Sri Lanka

    OpenAIRE

    Nayana Gunathilaka; Tharaka Ranathunge; Lahiru Udayanga; Wimaladharma Abeyewickreme

    2017-01-01

    Introduction: Selection of the artificial membrane feeding technique and blood meal source has been recognized as a key consideration in the mass rearing of vectors. Methodology: Artificial membrane feeding techniques, namely the glass plate, metal plate, and Hemotek membrane feeding methods, and three blood sources (human, cattle, and chicken) were evaluated based on feeding rates, fecundity, and hatching rates of Aedes aegypti. The significance of the variations among blood feeding was investigated by one...

  7. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  8. Design of a setup for {sup 252}Cf neutron source for storage and analysis purpose

    Energy Technology Data Exchange (ETDEWEB)

    Hei, Daqian [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Zhuang, Haocheng [Xi’an Middle School of Shanxi Province, Xi’an 710000 (China); Jia, Wenbao, E-mail: jiawenbao@163.com [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Collaborative Innovation Center of Radiation Medicine of Jiangsu Higher Education Institutions, Suzhou 215000 (China); Cheng, Can; Jiang, Zhou; Wang, Hongtao [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Chen, Da [Department of Nuclear Science and Engineering, College of Materials Science and Engineering, Nanjing University of Aeronautics and Astronautics, Nanjing 211106 (China); Collaborative Innovation Center of Radiation Medicine of Jiangsu Higher Education Institutions, Suzhou 215000 (China)

    2016-11-01

    252Cf is a reliable isotopic neutron source widely used in the prompt gamma-ray neutron activation analysis (PGNAA) technique. A cylindrical barrel made of polymethyl methacrylate and filled with boric acid solution was designed for the storage and application of a 5 μg 252Cf neutron source. The size of the setup was optimized with a Monte Carlo code. Experiments were performed, and the results showed that the doses were reduced by the setup to below the allowable limit. The intensity and collimating radius of the neutron beam could also be adjusted through different collimators.

  9. Comparison between correlated sampling and the perturbation technique of MCNP5 for fixed-source problems

    International Nuclear Information System (INIS)

    He Tao; Su Bingjing

    2011-01-01

    Highlights: → The performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. → In terms of precision, the MCNP perturbation technique outperforms correlated sampling for one type of problem but performs comparably with or even under-performs correlated sampling for the other two types of problems. → In terms of accuracy, the MCNP perturbation calculations may predict inaccurate results for some of the test problems. However, the accuracy can be improved if the midpoint correction technique is used. - Abstract: Correlated sampling and the differential operator perturbation technique are two methods that enable MCNP (Monte Carlo N-Particle) to simulate small response change between an original system and a perturbed system. In this work the performance of the MCNP differential operator perturbation technique is compared with that of the MCNP correlated sampling method for three types of fixed-source problems. In terms of precision of predicted response changes, the MCNP perturbation technique outperforms correlated sampling for the problem involving variation of nuclide concentrations in the same direction but performs comparably with or even underperforms correlated sampling for the other two types of problems that involve void or variation of nuclide concentrations in opposite directions. In terms of accuracy, the MCNP differential operator perturbation calculations may predict inaccurate results that deviate from the benchmarks well beyond their uncertainty ranges for some of the test problems. However, the accuracy of the MCNP differential operator perturbation can be improved if the midpoint correction technique is used.
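
    Neither of the two MCNP methods is reproduced here, but the core idea of correlated sampling, reusing one random-number stream for both the original and the perturbed system so that their difference estimate has low variance, can be shown with a self-contained toy problem (a one-dimensional slab transmission; all numbers are illustrative, and this is a sketch rather than MCNP's algorithm):

```python
import numpy as np

rng = np.random.default_rng(42)

def transmission(sigma, xi):
    """Fraction of particles crossing a slab of thickness 1, given
    exponential path lengths sampled from the random numbers xi."""
    path = -np.log(xi) / sigma      # distance to first collision
    return np.mean(path > 1.0)      # particle escapes if no collision in slab

n = 100_000
sigma, dsigma = 2.0, 0.02           # original and perturbed total cross sections

xi = rng.random(n)                  # shared random numbers -> correlated estimates
corr = transmission(sigma + dsigma, xi) - transmission(sigma, xi)

indep = (transmission(sigma + dsigma, rng.random(n))
         - transmission(sigma, rng.random(n)))    # two independent runs

exact = np.exp(-(sigma + dsigma)) - np.exp(-sigma)
print(f"exact {exact:.5f}  correlated {corr:.5f}  independent {indep:.5f}")
```

    With shared random numbers the difference estimate typically lands within a fraction of a percent of the exact value, while the independent-run difference is dominated by statistical noise of the individual tallies.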

  10. Polar source analysis : technical memorandum

    Science.gov (United States)

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendations for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  11. Analysis of inconsistent source sampling in monte carlo weight-window variance reduction methods

    Directory of Open Access Journals (Sweden)

    David P. Griesheimer

    2017-09-01

    Full Text Available The application of Monte Carlo (MC to large-scale fixed-source problems has recently become possible with new hybrid methods that automate generation of parameters for variance reduction techniques. Two common variance reduction techniques, weight windows and source biasing, have been automated and popularized by the consistent adjoint-driven importance sampling (CADIS method. This method uses the adjoint solution from an inexpensive deterministic calculation to define a consistent set of weight windows and source particles for a subsequent MC calculation. One of the motivations for source consistency is to avoid the splitting or rouletting of particles at birth, which requires computational resources. However, it is not always possible or desirable to implement such consistency, which results in inconsistent source biasing. This paper develops an original framework that mathematically expresses the coupling of the weight window and source biasing techniques, allowing the authors to explore the impact of inconsistent source sampling on the variance of MC results. A numerical experiment supports this new framework and suggests that certain classes of problems may be relatively insensitive to inconsistent source sampling schemes with moderate levels of splitting and rouletting.
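
    As a loose illustration of the coupling discussed above (not the paper's framework or CADIS itself), the sketch below samples a biased source, assigns compensating birth weights, and then splits or roulettes the newborn particles into an assumed weight window; the pdfs, window bounds and tally are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# True source pdf p(x) and biased sampling pdf q(x) on [0, 1).
p = lambda x: np.ones_like(x)           # uniform true source
q = lambda x: 2.0 * x                   # biased toward x = 1 (the "important" region)

def sample_biased(n):
    x = np.sqrt(rng.random(n))          # inverse-CDF sampling of q
    w = p(x) / q(x)                     # birth weight preserves the mean
    return x, w

def weight_window(x, w, lo=0.5, hi=2.0):
    """Split particles above the window, roulette those below it."""
    out_x, out_w = [], []
    for xi, wi in zip(x, w):
        if wi > hi:                      # split into m copies
            m = int(np.ceil(wi / hi))
            out_x += [xi] * m
            out_w += [wi / m] * m
        elif wi < lo:                    # Russian roulette
            if rng.random() < wi / lo:
                out_x.append(xi); out_w.append(lo)
        else:
            out_x.append(xi); out_w.append(wi)
    return np.array(out_x), np.array(out_w)

n = 100_000
x, w = sample_biased(n)                 # inconsistent source: birth weights != 1
x, w = weight_window(x, w)              # birth splitting/roulette enforces the window
tally = np.sum(w * (x > 0.9)) / n       # divide by ORIGINAL histories; P = 0.1 under p
print(tally)
```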

  12. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    Proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental applications that have been carried out at JAERI Takasaki. (author)

  13. Impedance source power electronic converters

    CERN Document Server

    Liu, Yushan; Ge, Baoming; Blaabjerg, Frede; Ellabban, Omar; Loh, Poh Chiang

    2016-01-01

    Impedance Source Power Electronic Converters brings together state-of-the-art knowledge and cutting-edge techniques in various stages of research related to the ever more popular impedance source converters/inverters. Significant research efforts are underway to develop commercially viable and technically feasible, efficient and reliable power converters for renewable energy, electric transportation and for various industrial applications. This book provides a detailed understanding of the concepts, designs, controls, and application demonstrations of the impedance source converters/inverters. Key features: Comprehensive analysis of the impedance source converter/inverter topologies, including typical topologies and derived topologies. Fully explains the design and control techniques of impedance source converters/inverters, including hardware design and control parameter design for corresponding control methods. Presents the latest power conversion solutions that aim to advance the role of pow...

  14. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes reflecting their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. With the proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, owing to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for revealing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; however, their disadvantages are completeness and complexity

  15. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series and Rutherford Backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  16. INAA in combination with other analytical techniques in the study of urban aerosol sources

    International Nuclear Information System (INIS)

    Binh, N.T.; Truong, Y.; Ngo, N.T.; Sieu, L.N.; Hien, P.D.

    2000-01-01

    Concentrations of elements in fine and coarse PM10 samples collected in Ho Chi Minh City were determined by INAA for the purpose of characterising air pollution sources using multivariate receptor modeling techniques. Seven sources common to the coarse and fine samples were identified. Resuspended soil dust is dominant in the coarse samples, accounting for 41% of the particulate mass. In the fine samples, vehicle emissions and coal burning are most important, accounting for about 20% each. Although a great number of elements were included in the input data for receptor modeling, the interpretation of emission sources was not always straightforward. Information on other source markers was needed. Therefore, a polarography method was used for quantifying lead, and recently an ion chromatography method became available for quantifying secondary sulphates, nitrates and other water-soluble ions. (author)

  17. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check the stress analysis technique based on 3D models, making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS by working directly on documentation, without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage), obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  18. Applicability of annular-source excited systems in quantitative XRF analysis

    International Nuclear Information System (INIS)

    Mahmoud, A.; Bernasconi, G.; Bamford, S.A.; Dosan, B.; Haselberger, N.; Markowicz, A.

    1996-01-01

    Radioisotope-excited XRF systems using annular sources are widely used in view of their simplicity, wide availability, relatively low price for the complete system, and good overall performance with respect to accuracy and detection limits. However, some problems arise when the use of fundamental parameter techniques for quantitative analysis is attempted. These problems are due to the fact that the systems operate with large solid angles for incoming and emerging radiation, and both the incident and take-off angles are not trivial. In this paper an improved way to calculate effective values for the incident and take-off angles, using Monte Carlo (MC) integration techniques, is shown. In addition, a study of the applicability of the effective angles for analysing different samples or standards was carried out. The MC method also allows calculation of the excitation-detection efficiency for different parts of the sample and estimation of the overall efficiency of a source-excited XRF setup. The former information is useful in the design of optimized XRF set-ups and prediction of the response of inhomogeneous samples. A study of the sensitivity of the results to sample characteristics and a comparison of the results with experimentally determined values for incident and take-off angles are also presented. A flexible and user-friendly computer program was developed in order to perform the lengthy calculations involved efficiently. (author). 14 refs. 5 figs
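
    The paper's MC integration lives in a dedicated program, but the basic computation, averaging the incidence angle over random source-annulus-to-sample ray pairs to obtain an effective angle, can be sketched as follows (the geometry values are invented, and a faithful version would also weight each ray by solid angle and attenuation):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Assumed geometry (mm): annular source of radii r1..r2 sitting a height h
# above the sample plane; the sample is a disc of radius rs on the axis.
r1, r2, h, rs = 5.0, 8.0, 10.0, 4.0

# Random source points on the annulus and random points on the sample disc.
phi_s = 2 * np.pi * rng.random(n)
r_src = np.sqrt(rng.random(n) * (r2**2 - r1**2) + r1**2)   # uniform in area
phi_t = 2 * np.pi * rng.random(n)
r_tgt = rs * np.sqrt(rng.random(n))

dx = r_src * np.cos(phi_s) - r_tgt * np.cos(phi_t)
dy = r_src * np.sin(phi_s) - r_tgt * np.sin(phi_t)
horiz = np.hypot(dx, dy)
incidence = np.degrees(np.arctan2(h, horiz))   # angle between ray and sample plane

# Effective incident angle as the plain mean over all ray pairs.
print(f"effective incident angle ~ {incidence.mean():.1f} deg")
```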

  19. Use of the spectral analysis for estimating the intensity of a weak periodic source

    International Nuclear Information System (INIS)

    Marseguerra, M.

    1989-01-01

    This paper deals with the possibility of exploiting spectral methods for the analysis of counting experiments in which one has to estimate the intensity of a weak periodic source of particles buried in a high background. The general theoretical expressions obtained here for the auto- and cross-spectra are applied to three kinds of simulated experiments. In all cases it turns out that the source intensity can actually be estimated with a standard deviation comparable with that obtained in classical experiments in which the source can be moved out. Thus the spectral methods represent an interesting technique, nowadays easy to implement on low-cost computers, which could also be used in many research fields by suitably redesigning classical experiments. The convenience of using these methods in the field of nuclear safeguards is presently being investigated in our Institute. (orig.)
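
    As a rough, hypothetical illustration of the idea (not the author's formalism), the following sketch buries a weak periodic source in a Poisson background and reads its intensity off the periodogram peak; the rates and modulation frequency are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

T, dt = 4096, 1.0                       # number of time bins, bin width (s)
t = np.arange(T) * dt
f0 = 1.0 / 64.0                         # known source modulation frequency (Hz)
b, s = 100.0, 2.0                       # background and weak source rates (cps)

# Counts per bin: Poisson, with a small periodic component buried in background.
lam = b + s * (1 + np.cos(2 * np.pi * f0 * t)) / 2
counts = rng.poisson(lam * dt)

# Periodogram of the mean-subtracted counts; the source appears at f0.
spec = np.abs(np.fft.rfft(counts - counts.mean())) ** 2 / T
freqs = np.fft.rfftfreq(T, dt)
k = np.argmin(np.abs(freqs - f0))

# A cosine of amplitude A gives a periodogram peak of about T * A**2 / 4,
# so the modulation amplitude (hence the source rate) can be read off.
A_hat = 2 * np.sqrt(spec[k] / T)        # estimated amplitude in counts/bin
print(f"estimated source rate ~ {A_hat / dt:.2f} cps (true {s / 2:.2f})")
```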

  20. Distributed Source Coding Techniques for Lossless Compression of Hyperspectral Images

    Directory of Open Access Journals (Sweden)

    Barni Mauro

    2007-01-01

    Full Text Available This paper deals with the application of distributed source coding (DSC theory to remote sensing image compression. Although DSC exhibits a significant potential in many application fields, up till now the results obtained on real signals fall short of the theoretical bounds, and often impose additional system-level constraints. The objective of this paper is to assess the potential of DSC for lossless image compression carried out onboard a remote platform. We first provide a brief overview of DSC of correlated information sources. We then focus on onboard lossless image compression, and apply DSC techniques in order to reduce the complexity of the onboard encoder, at the expense of the decoder's, by exploiting the correlation of different bands of a hyperspectral dataset. Specifically, we propose two different compression schemes, one based on powerful binary error-correcting codes employed as source codes, and one based on simpler multilevel coset codes. The performance of both schemes is evaluated on a few AVIRIS scenes, and is compared with other state-of-the-art 2D and 3D coders. Both schemes turn out to achieve competitive compression performance, and one of them also has reduced complexity. Based on these results, we highlight the main issues that are still to be solved to further improve the performance of DSC-based remote sensing systems.
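
    The paper's schemes rest on real error-correcting codes, but the coset principle can be shown with the simplest possible binning: transmit only the low-order bits of each sample and let the decoder resolve the coset using its side information. The bit depth and correlation bound below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)

# Correlated 8-bit "pixel" sources: the decoder already holds side
# information y, and |x - y| < 8 by assumption on inter-band correlation.
y = rng.integers(0, 256, size=10_000)
x = np.clip(y + rng.integers(-7, 8, size=y.size), 0, 255)

k = 4                          # transmit only the k least significant bits of x
syndrome = x % (1 << k)        # coset index: 4 bits/sample instead of 8

# Decoder: pick the member of x's coset closest to the side information y.
base = y - ((y - syndrome) % (1 << k))          # coset member just below/at y
cands = base[:, None] + np.array([0, 1 << k])   # the two nearest coset members
x_hat = cands[np.arange(y.size), np.abs(cands - y[:, None]).argmin(axis=1)]

print("lossless:", np.array_equal(x_hat, x))    # True when |x - y| < 2**(k - 1)
```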

  1. Measuring trace gas emission from multi-distributed sources using vertical radial plume mapping (VRPM) and backward Lagrangian stochastic (bLS) techniques

    Science.gov (United States)

    Two micrometeorological techniques for measuring trace gas emission rates from distributed area sources were evaluated using a variety of synthetic area sources. The accuracy of the vertical radial plume mapping (VRPM) and the backward Lagrangian (bLS) techniques with an open-path optical spectrosco...

  2. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  3. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    International Nuclear Information System (INIS)

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-01-01

    We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable-size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented
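
    The full model (a variable number of cortical patches with MRI-informed priors) is far beyond a snippet, but the engine, Markov Chain Monte Carlo sampling of source parameters given the measured field, can be conveyed with a one-parameter toy; the forward model, noise level and prior here are all invented:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy forward model: one source at position theta along a line of sensors;
# the field falls off as 1 / (1 + distance**2).  Real MEG models are richer.
sensors = np.linspace(-1, 1, 32)
forward = lambda theta: 1.0 / (1.0 + (sensors - theta) ** 2)

theta_true, noise = 0.3, 0.05
data = forward(theta_true) + noise * rng.standard_normal(sensors.size)

def log_post(theta):                      # flat prior on [-1, 1]
    if not -1 <= theta <= 1:
        return -np.inf
    return -0.5 * np.sum((data - forward(theta)) ** 2) / noise**2

# Random-walk Metropolis: the chain approximates the posterior over location.
theta, lp, chain = 0.0, log_post(0.0), []
for _ in range(20_000):
    prop = theta + 0.1 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)

chain = np.array(chain[5_000:])           # discard burn-in
print(f"posterior mean {chain.mean():.3f} +/- {chain.std():.3f}")
```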

  4. Determination of sources and analysis of micro-pollutants in drinking water

    International Nuclear Information System (INIS)

    Md Pauzi Abdullah; Soh Shiau Chian

    2005-01-01

    The objectives of the study are to develop and validate selected analytical methods for the analysis of micro organics and metals in water; to identify, monitor and assess the levels of micro organics and metals in drinking water supplies; to evaluate the relevance of the guidelines set in the National Standard of Drinking Water Quality 2001; and to identify the sources of pollution and carry out risk assessments of exposure to drinking water. The presentation discussed the progress of the work, including the determination of VOCs (volatile organic compounds) in drinking water using SPME (solid-phase micro-extraction) techniques, the analysis of heavy metals in drinking water, the determination of Cr(VI) with ICPES (inductively coupled plasma emission spectrometry), and the presence at trace concentrations in waters of halogenated volatile organic compounds (HVOCs), which are heavily used by the agricultural sector

  5. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  6. Micro-electrodeposition techniques for the preparation of small actinide counting sources for ultra-high resolution alpha spectrometry by microcalorimetry

    International Nuclear Information System (INIS)

    Plionis, A.A.; Hastings, E.P.; LaMont, S.P.; Dry, D.E.; Bacrania, M.K.; Rabin, M.W.; Rim, J.H.

    2009-01-01

    Special considerations and techniques are desired for the preparation of small actinide counting sources. Counting sources have been prepared on metal disk substrates (planchets) with an active area of only 0.079 mm². This represents a 93.75% reduction in deposition area from standard electrodeposition methods. The actinide distribution upon the smaller planchet must remain thin and uniform to allow alpha particle emissions to escape the counting source with a minimal amount of self-attenuation. This work describes the development of micro-electrodeposition methods and optimization of the technique with respect to deposition time and current density for various planchet sizes. (author)

  7. Identification of sources of tar balls deposited along the Goa coast, India, using fingerprinting techniques

    International Nuclear Information System (INIS)

    Suneel, V.; Vethamony, P.; Zakaria, M.P.; Naik, B.G.; Prasad, K.V.S.R.

    2013-01-01

    Highlights: ► This is the first fingerprinting study in India on the identification of the source of tar balls. ► Tar balls were formed from tanker-wash spills and they resemble floating tar balls. ► δ13C values of Bombay High crude oil and the present tar balls do not match. ► Compound-specific stable carbon isotope analysis confirmed the source of the tar balls. ► The source is confirmed as South East Asian Crude Oil and not Bombay High crude. -- Abstract: Deposition of tar balls along the coast of Goa, India is a common phenomenon during the southwest monsoon. Representative tar ball samples collected from various beaches of Goa and one Bombay High (BH) crude oil sample were subjected to fingerprint analysis based on diagnostic ratios of n-alkanes, biomarkers of pentacyclic tri-terpanes and compound-specific stable carbon isotope (δ13C) analysis to confirm the source. The results were compared with the published data for Middle East Crude Oil (MECO) and South East Asian Crude Oil (SEACO). The results revealed that the tar balls were from tanker-wash derived spills. The study also confirmed that the source is not BH, but SEACO. The present study suggests that the biomarkers of alkanes and hopanes, coupled with stable carbon isotope analysis, act as a powerful tool for tracing the source of tar balls, particularly when source-specific biomarkers fail to distinguish the source

  8. Development of oil hydrocarbon fingerprinting and identification techniques

    International Nuclear Information System (INIS)

    Wang Zhendi; Fingas, Merv F.

    2003-01-01

    Oil, refined product, and pyrogenic hydrocarbons are the most frequently discovered contaminants in the environment. To effectively determine the fate of spilled oil in the environment and to successfully identify source(s) of spilled oil and petroleum products is, therefore, extremely important in many oil-related environmental studies and liability cases. This article briefly reviews the recent development of chemical analysis methodologies which are most frequently used in oil spill characterization and identification studies and environmental forensic investigations. The fingerprinting and data interpretation techniques discussed include oil spill identification protocol, tiered analytical approach, generic features and chemical composition of oils, effects of weathering on hydrocarbon fingerprinting, recognition of distribution patterns of petroleum hydrocarbons, oil type screening and differentiation, analysis of 'source-specific marker' compounds, determination of diagnostic ratios of specific oil constituents, stable isotopic analysis, application of various statistical and numerical analysis tools, and application of other analytical techniques. The issue of how biogenic and pyrogenic hydrocarbons are distinguished from petrogenic hydrocarbons is also addressed

  9. Fetal source extraction from magnetocardiographic recordings by dependent component analysis

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Draulio B de [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Barros, Allan Kardec [Department of Electrical Engineering, Federal University of Maranhao, Sao Luis, Maranhao (Brazil); Estombelo-Montesco, Carlos [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Zhao, Hui [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Filho, A C Roque da Silva [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Baffa, Oswaldo [Department of Physics and Mathematics, FFCLRP, University of Sao Paulo, Ribeirao Preto, SP (Brazil); Wakai, Ronald [Department of Medical Physics, University of Wisconsin, Madison, WI (United States); Ohnishi, Noboru [Department of Information Engineering, Nagoya University (Japan)

    2005-10-07

    Fetal magnetocardiography (fMCG) has been extensively reported in the literature as a non-invasive, prenatal technique that can be used to monitor various functions of the fetal heart. However, fMCG signals often have low signal-to-noise ratio (SNR) and are contaminated by strong interference from the mother's magnetocardiogram signal. A promising, efficient tool for extracting signals, even under low SNR conditions, is blind source separation (BSS), or independent component analysis (ICA). Herein we propose an algorithm based on a variation of ICA, where the signal of interest is extracted using a time delay obtained from an autocorrelation analysis. We model the system using autoregression, and identify the signal component of interest from the poles of the autocorrelation function. We show that the method is effective in removing the maternal signal, and is computationally efficient. We also compare our results to more established ICA methods, such as FastICA.
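
    The authors' delay-based extraction is not reproduced here, but the baseline they compare against, FastICA, is available in scikit-learn, and a minimal two-channel toy with a strong 'maternal' signal masking a weak 'fetal' one shows blind separation at work (waveforms and mixing matrix are invented):

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(11)
t = np.linspace(0, 10, 4000)

maternal = np.sign(np.sin(2 * np.pi * 1.2 * t))        # strong, slow "mMCG"
fetal = 0.2 * np.sin(2 * np.pi * 2.4 * t + 1.0)        # weak, faster "fMCG"
noise = 0.05 * rng.standard_normal((2, t.size))

A = np.array([[1.0, 0.8], [0.6, 1.0]])                  # unknown sensor mixing
X = A @ np.vstack([maternal, fetal]) + noise            # two-channel recording

ica = FastICA(n_components=2, random_state=0)
S = ica.fit_transform(X.T).T                            # estimated sources

# Identify the fetal component by its dominant frequency (the faster one).
peaks = [np.abs(np.fft.rfft(s)).argmax() for s in S]
fetal_est = S[int(np.argmax(peaks))]
print("recovered fetal component, |correlation| with truth:",
      abs(np.corrcoef(fetal_est, fetal)[0, 1]).round(3))
```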

  10. Intelligent Techniques Using Molecular Data Analysis in Leukaemia: An Opportunity for Personalized Medicine Support System.

    Science.gov (United States)

    Banjar, Haneen; Adelson, David; Brown, Fred; Chaudhri, Naeem

    2017-01-01

    The use of intelligent techniques in medicine has brought a ray of hope in terms of treating leukaemia patients. Personalized treatment uses patient's genetic profile to select a mode of treatment. This process makes use of molecular technology and machine learning, to determine the most suitable approach to treating a leukaemia patient. Until now, no reviews have been published from a computational perspective concerning the development of personalized medicine intelligent techniques for leukaemia patients using molecular data analysis. This review studies the published empirical research on personalized medicine in leukaemia and synthesizes findings across studies related to intelligence techniques in leukaemia, with specific attention to particular categories of these studies to help identify opportunities for further research into personalized medicine support systems in chronic myeloid leukaemia. A systematic search was carried out to identify studies using intelligence techniques in leukaemia and to categorize these studies based on leukaemia type and also the task, data source, and purpose of the studies. Most studies used molecular data analysis for personalized medicine, but future advancement for leukaemia patients requires molecular models that use advanced machine-learning methods to automate decision-making in treatment management to deliver supportive medical information to the patient in clinical practice.

  11. A Comparative Analysis of Information Hiding Techniques for Copyright Protection of Text Documents

    Directory of Open Access Journals (Sweden)

    Milad Taleby Ahvanooey

    2018-01-01

    Full Text Available With the ceaseless usage of the web and other online services, it has turned out that copying, sharing, and transmitting digital media over the Internet are amazingly simple. Since text is one of the main available data sources and the most widely used digital medium on the Internet, a significant part of websites, books, articles, daily papers, and so on is just plain text. Therefore, copyright protection of plain texts is still a remaining issue that must be improved in order to provide proof of ownership and obtain the desired accuracy. During the last decade, digital watermarking and steganography techniques have been used as alternatives to prevent tampering, distortion, and media forgery and also to protect both copyright and authentication. This paper presents a comparative analysis of information hiding techniques, especially those focused on modifying the structure and content of digital texts. Herein, the characteristics of various text watermarking and text steganography techniques are highlighted along with their applications. In addition, various types of attacks are described and their effects are analyzed in order to highlight the advantages and weaknesses of current techniques. Finally, some guidelines and directions are suggested for future works.

  12. Techniques for the thermal/hydraulic analysis of LMFBR check valves

    International Nuclear Information System (INIS)

    Cho, S.M.; Kane, R.S.

    1979-01-01

    A thermal/hydraulic analysis of the check valves in liquid sodium service for LMFBR plants is required to provide temperature data for thermal stress analysis of the valves for specified transient conditions. Because of the complex three-dimensional flow pattern within the valve, the heat transfer analysis techniques for less complicated shapes could not be used. This paper discusses the thermal analysis techniques used to assure that the valve stress analysis is conservative. These techniques include a method for evaluating the recirculating flow patterns and for selecting appropriately conservative heat transfer correlations in various regions of the valve

  13. BSDWormer; an Open Source Implementation of a Poisson Wavelet Multiscale Analysis for Potential Fields

    Science.gov (United States)

    Horowitz, F. G.; Gaede, O.

    2014-12-01

    Wavelet multiscale edge analysis of potential fields (a.k.a. "worms") has been known since Moreau et al. (1997) and was independently derived by Hornby et al. (1999). The technique is useful for producing a scale-explicit overview of the structures beneath a gravity or magnetic survey, including establishing the location and estimating the attitude of surface features, as well as incorporating information about the geometric class (point, line, surface, volume, fractal) of the underlying sources — in a fashion much like traditional structural indices from Euler solutions albeit with better areal coverage. Hornby et al. (2002) show that worms form the locally highest concentration of horizontal edges of a given strike — which in conjunction with the results from Mallat and Zhong (1992) induces a (non-unique!) inversion where the worms are physically interpretable as lateral boundaries in a source distribution that produces a close approximation of the observed potential field. The technique has enjoyed widespread adoption and success in the Australian mineral exploration community — including "ground truth" via successfully drilling structures indicated by the worms. Unfortunately, to our knowledge, all implementations of the code to calculate the worms/multiscale edges (including Horowitz' original research code) are either part of commercial software packages, or have copyright restrictions that impede the use of the technique by the wider community. The technique is completely described mathematically in Hornby et al. (1999) along with some later publications. This enables us to re-implement from scratch the code required to calculate and visualize the worms. We are freely releasing the results under an (open source) BSD two-clause software license. A git repository is available at . We will give an overview of the technique, show code snippets using the codebase, and present visualization results for example datasets (including the Surat basin of Australia
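
    The BSDWormer repository should be consulted for the real implementation; the sketch below only conveys the core loop, upward-continuing the gridded field to a set of heights and following the maxima of the horizontal gradient, on an invented contact-model anomaly:

```python
import numpy as np

# Upward-continue a gridded potential field and track horizontal-gradient
# maxima across heights: the essence of the "worming" procedure.
def upward_continue(field, dx, height):
    ky = np.fft.fftfreq(field.shape[0], dx) * 2 * np.pi
    kx = np.fft.fftfreq(field.shape[1], dx) * 2 * np.pi
    k = np.hypot(*np.meshgrid(ky, kx, indexing="ij"))
    return np.real(np.fft.ifft2(np.fft.fft2(field) * np.exp(-k * height)))

def horizontal_gradient(field, dx):
    gy, gx = np.gradient(field, dx)
    return np.hypot(gx, gy)

# Synthetic anomaly: a buried vertical contact produces a step-like edge.
x = np.arange(256) * 100.0                                # 100 m grid spacing
g = np.tile(np.arctan((x - 12_800) / 500.0), (256, 1))

for h in (200.0, 800.0, 3200.0):                          # continuation heights (m)
    hg = horizontal_gradient(upward_continue(g, 100.0, h), 100.0)
    col = hg[128].argmax()                                # gradient ridge = the worm
    print(f"height {h:6.0f} m: edge located at x = {x[col]:.0f} m")
```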

  14. An innovative technique to synthesize C-doped MgB2 by using chitosan as the carbon source

    International Nuclear Information System (INIS)

    Bovone, G; Kawale, S; Siri, A S; Vignolo, M; Bernini, C

    2014-01-01

    Here, we report a new technique to synthesize carbon-doped MgB2 powder. Chitosan was innovatively used as the carbon source during the synthesis of boron from boron oxide. This allowed the introduction of local defects, which later served as pinning centers in MgB2, into the boron lattice itself, avoiding the traditional and time-consuming ways of ex situ MgB2 doping (e.g. ball milling). Two volume percentages of C-doping have been tried, and their effects on the superconducting properties, evaluated by magnetic and transport measurements, are discussed here. Morphological analysis by scanning electron microscopy revealed nano-metric grain distributions in the boron and MgB2 powders. Mono-filamentary MgB2 wires have been fabricated by an ex situ powder-in-tube technique using the thus-prepared carbon-doped MgB2 and pure MgB2 powders. Transport property measurements on these wires were made and compared with MgB2 wire produced using commercial boron. (fast track communication)

  15. Acoustic emission non-destructive testing of structures using source location techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Beattie, Alan G.

    2013-09-01

    The technology of acoustic emission (AE) testing has been advanced and used at Sandia for the past 40 years. AE has been used on structures including pressure vessels, fire bottles, wind turbines, gas wells, nuclear weapons, and solar collectors. This monograph begins with background topics in acoustics and instrumentation and then focuses on current acoustic emission technology. It covers the overall design and system setups for a test, with a wind turbine blade as the object. Test analysis is discussed with an emphasis on source location. Three test examples are presented, two on experimental wind turbine blades and one on aircraft fire extinguisher bottles. Finally, the code for a FORTRAN source location program is given as an example of a working analysis program. Throughout the document, the stress is on actual testing of real structures, not on laboratory experiments.
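
    The monograph's FORTRAN program is the reference implementation; as a conceptual stand-in, the sketch below locates a simulated planar source from arrival-time differences at four sensors by brute-force grid search (geometry, wave speed and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(2)

sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # m
c = 3000.0                              # assumed plate wave speed (m/s)
src = np.array([0.37, 0.62])            # true (unknown) source position

d = np.linalg.norm(sensors - src, axis=1)
arrivals = d / c + 1e-7 * rng.standard_normal(4)   # noisy arrival times
dt = arrivals - arrivals[0]             # only differences are observable

# Grid search for the position whose predicted arrival-time differences
# best match the measured ones (least squares on the residuals).
best, best_err = None, np.inf
for x in np.linspace(0, 1, 201):
    for y in np.linspace(0, 1, 201):
        r = np.linalg.norm(sensors - np.array([x, y]), axis=1)
        err = np.sum((r / c - r[0] / c - dt) ** 2)
        if err < best_err:
            best, best_err = (x, y), err

print(f"located source at ({best[0]:.2f}, {best[1]:.2f}), true (0.37, 0.62)")
```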

  16. Sample preparation techniques in trace element analysis by X-ray emission spectroscopy

    International Nuclear Information System (INIS)

    Valkovic, V.

    1983-11-01

    The report, written under a research contract with the IAEA, contains a detailed presentation of the most difficult problem encountered in trace element analysis by X-ray emission spectroscopy methods, namely the sample preparation techniques. The following items are covered. Sampling - with specific consideration of aerosols, water, soil, biological materials, petroleum and its products, and the storage and handling of samples. Pretreatment of samples - preconcentration, ashing, solvent extraction, ion exchange and electrodeposition. Sample preparation for PIXE analysis - backings, target uniformity and homogeneity, effects of irradiation, internal standards and specific examples of preparation (aqueous, biological, blood serum and solid samples). Sample preparation for radioactive-source or tube excitation - with specific examples (water, liquid and solid samples, soil, geological, plant and tissue samples). Finally, the problem of standards and reference materials, as well as that of interlaboratory comparisons, is discussed

  17. Techniques for Handling Channeling in High Resolution Fourier Transform Spectra Recorded with Synchrotron Sources

    International Nuclear Information System (INIS)

    Ibrahim, Amr; PredoiCross, Adriana; Teillet, P. M.

    2010-01-01

    Seven different techniques for dealing with the problem of channel spectra in Fourier transform spectroscopy with a synchrotron source were examined and compared. Five of these techniques deal with the artifacts (spikes) in the recorded interferogram, which in turn produce channel spectra in the spectral domain. Such interferogram-editing methods include replacing these spikes with zeros, a straight line, a fitted polynomial curve, a rescaled spike, or a spike reduced with a Gauss function. Two further techniques target the issue in the spectral domain instead, by either generating a synthetic background simulating the channels or measuring the channel parameters (amplitude, spacing and phase) for use in the spectral fitting program. Results showed that the spectral-domain techniques produce higher quality results in terms of signal-to-noise ratio and fitting residuals. The effect of each method on line parameters such as position, intensity and air broadening is also measured and discussed.
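
    As a hypothetical illustration of the interferogram-editing family (not the authors' exact procedure), the sketch below plants a parasitic spike in a synthetic interferogram, replaces it with a fitted polynomial, and compares the channel ripple in the resulting spectra:

```python
import numpy as np

n = 2048
idx = np.arange(n)
ifg = np.exp(-idx / 300.0) * np.cos(0.5 * idx)   # synthetic interferogram
ifg[700:704] += 0.5                              # parasitic-reflection spike

def edit_polynomial(y, lo, hi, pad=40, deg=3):
    """Replace the spike samples with a polynomial fitted to the
    neighbouring points (one of the five editing options compared)."""
    out = y.copy()
    fit_idx = np.r_[lo - pad:lo, hi:hi + pad]
    xs = fit_idx - lo                            # centre for conditioning
    coef = np.polyfit(xs, out[fit_idx], deg)
    out[lo:hi] = np.polyval(coef, np.arange(lo, hi) - lo)
    return out

raw = np.abs(np.fft.rfft(ifg))
cleaned = np.abs(np.fft.rfft(edit_polynomial(ifg, 700, 704)))

# The spike shows up as broadband ripple ("channels") away from the
# spectral line near bin 163; editing the interferogram suppresses it.
band = slice(300, 900)
print(f"baseline ripple before/after: "
      f"{raw[band].std():.2f} / {cleaned[band].std():.2f}")
```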

  18. The characterisation of Melanesian obsidian sources and artefacts using the proton induced gamma-ray emission (PIGME) technique

    International Nuclear Information System (INIS)

    Bird, J.R.; Ambrose, W.R.; Russell, L.H.; Scott, M.D.

    1981-09-01

    Proton induced gamma-ray emission (PIGME) has been used to determine F, Na and Al concentrations in obsidian from known locations in Melanesia and to relate artefacts from this region to such sources. The PIGME technique is a fast, non-destructive, and accurate method for determining these three elements with essentially no special sample preparation. The measuring technique is described and results are listed for sources, chiefly in the Papua New Guinea region. Their classification is discussed in terms of groups which are distinguishable by the PIGME method. Over 700 artefact results are listed; these show the occurrence of an additional group that is not geographically identified

  19. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure. The microextraction techniques are dominant. Metabolomic studies also require application of proper analytical technique for the determination of endogenic metabolites present in biological matrix on trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (combination types on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of new synthesized sorbents as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Investigations on the comparator technique used in epithermal neutron activation analysis

    International Nuclear Information System (INIS)

    Bereznai, T.; Bodizs, D.; Keoemley, G.

    1977-01-01

    The possible extension of the comparator technique of reactor neutron activation analysis into the field of epithermal neutron activation has been investigated. Ruthenium was used as a multi-isotopic comparator. Experiments show that conversion of the so-called reference k-factors, determined by irradiation with reactor neutrons, into k(epi)-factors usable for activation under a cadmium filter can be performed with fair accuracy. Sources and extents of errors and their contribution to the final error of the analysis are discussed. For equal irradiation and counting times the advantage of ENAA for several elements is obvious: the much lower background activity permitted the sample to be measured closer to the detector, under better geometry conditions, consequently permitting several elements to be determined quantitatively. The number of elements determined and the sensitivity of the method depend strongly on the experimental conditions, especially on the composition of the sample, the Φe value, the irradiation time and the efficiency of the Ge(Li) detector. (T.G.)

  1. Kinematic analysis of the 720° fouetté technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté. Its execution rests on a high level of technique during the performer's rotation. Performing this element requires not only good physical condition from the dancer but also mastery of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. The analysis was based on the method of stereoscopic imaging together with theoretical analysis.

  2. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR Fourier-transform Infrared (SRFTIR) spectroscopy, and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized, paving the way for new techniques such as microprobe XRF and XAS, FTIR microscopy, and HAXPS. The talk will cover mainly two techniques illustrating their capability in analytical research, namely XRF and XAS. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (proton or alpha particle), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in the advancement of the technique in the area of microprobe XRF imaging and trace-level compositional characterisation of samples. Synchrotron-radiation-induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. There are two important features that contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained from charged particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third-generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10⁻¹⁷ g to 10⁻¹⁴ g (depending on the particular element and matrix). Keeping in mind its demand, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron

  3. Evolution of source term definition and analysis

    International Nuclear Information System (INIS)

    Lutz, R.J. Jr.

    2004-01-01

    The objective of this presentation was to provide an overview of the evolution of accident fission product release analysis methodology and the results obtained, and an overview of the implementation of source term analysis in regulatory decisions

  4. Collection, Analysis, and Dissemination of Open Source News and Analysis for Safeguards Implementation and Evaluation

    International Nuclear Information System (INIS)

    Khaled, J.; Reed, J.; Ferguson, M.; Hepworth, C.; Serrat, J.; Priori, M.; Hammond, W.

    2015-01-01

    Analysis of all safeguards-relevant information is an essential component of IAEA safeguards and the ongoing State evaluation underlying IAEA verification activities. In addition to State declared safeguards information and information generated from safeguards activities both in the field and at headquarters, the IAEA collects and analyzes information from a wide array of open sources relevant to States' nuclear related activities. A number of these open sources include information that could be loosely categorized as ''news'': international, regional, and local media; company and government press releases; public records of parliamentary proceedings; and NGO/academic commentaries and analyses. It is the task of the State Factors Analysis Section of the Department of Safeguards to collect, analyze and disseminate news of relevance to support ongoing State evaluation. This information supports State evaluation by providing the Department with a global overview of safeguards-relevant nuclear developments. Additionally, this type of information can support in-depth analyses of nuclear fuel cycle related activities, alerting State Evaluation Groups to potential inconsistencies in State declarations, and preparing inspectors for activities in the field. The State Factors Analysis Section uses a variety of tools, including subscription services, news aggregators, a roster of specialized sources, and a custom software application developed by an external partner to manage incoming data streams and assist with making sure that critical information is not overlooked. When analyzing data, it is necessary to determine the credibility of a given source and piece of information. Data must be considered for accuracy, bias, and relevance to the overall assessment. Analysts use a variety of methodological techniques to make these types of judgments, which are included when the information is presented to State Evaluation Groups. Dissemination of news to

  5. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2012-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as the other turbine stages. The standard technique for forced response analysis to assess structural integrity is to decompose a CFD-generated flow field into its harmonic components, and to then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicate that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. These complications raise the question of whether frequency domain analysis is capable of capturing the excitation content sufficiently. Two studies comparing frequency response analysis with transient response analysis have therefore been performed. The first is of a bladed disk with each blade modeled by simple beam elements. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed disk excited by the same CFD used in the J2X engine program. The results showed that the transient analysis results were up to 10% higher for "clean" nodal diameter excitations and six times larger for "messy" excitations, where substantial Fourier content around the main harmonic exists.
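
    The paper compares full bladed-disk models, but the crux, whether frequency-domain superposition of harmonic responses reproduces a transient solution once start-up transients decay, can be checked on a single-DOF modal oscillator; the modal parameters and engine-order content below are invented:

```python
import numpy as np
from scipy.integrate import odeint

# Single-DOF blade mode: harmonic (frequency-domain) response vs. direct
# transient integration for a periodic forcing with several engine orders.
m, c, k = 1.0, 2.0, 1.0e4                      # modal mass, damping, stiffness
omega1 = 60.0                                   # fundamental excitation (rad/s)
amps = {1: 10.0, 2: 4.0, 3: 1.5}                # harmonic force amplitudes

def force(t):
    return sum(a * np.cos(n * omega1 * t) for n, a in amps.items())

# Frequency domain: superpose the steady-state response of each harmonic.
def steady_state(t):
    x = 0.0
    for n, a in amps.items():
        w = n * omega1
        H = 1.0 / (k - m * w**2 + 1j * c * w)   # receptance at this harmonic
        x = x + np.real(a * H * np.exp(1j * w * t))
    return x

# Time domain: integrate through the start-up transient to steady state.
t = np.linspace(0, 20, 20_001)
sol = odeint(lambda y, t: [y[1], (force(t) - c * y[1] - k * y[0]) / m],
             [0.0, 0.0], t)

late = t > 15                                   # after transients have decayed
print("max |transient - harmonic| =",
      np.abs(sol[late, 0] - steady_state(t[late])).max())
```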

  6. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories result as the major source of Pb and as an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  7. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surfaces of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capabilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography were illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm, with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  8. Seismic explosion sources on an ice cap

    DEFF Research Database (Denmark)

    Shulgin, Alexey; Thybo, Hans

    2015-01-01

    Controlled source seismic investigation of crustal structure below ice covers is an emerging technique. We have recently conducted an explosive refraction/wide-angle reflection seismic experiment on the ice cap in east-central Greenland. The data-quality is high for all shot points and a full crustal model can be modelled. A crucial challenge for applying the technique is to control the sources. Here, we present data that describe the efficiency of explosive sources in the ice cover. Analysis of the data shows that the ice cap traps a significant amount of energy, which is observed...

  9. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs
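
    SQUIMP itself is not shown in the abstract, but the object-oriented flavor the paper discusses can be conveyed by a minimal event tree whose sequences and probabilities are enumerated recursively; the events and numbers below are invented:

```python
from dataclasses import dataclass
from typing import Optional

# Minimal object-oriented event tree: each node is a branch event with a
# success probability; sequences fall out of a recursive traversal.
@dataclass
class Event:
    name: str
    p_success: float
    success: Optional["Event"] = None
    failure: Optional["Event"] = None

def sequences(node, path=(), prob=1.0):
    if node is None:
        yield path, prob
        return
    yield from sequences(node.success, path + (f"{node.name}:ok",),
                         prob * node.p_success)
    yield from sequences(node.failure, path + (f"{node.name}:fail",),
                         prob * (1.0 - node.p_success))

tree = Event("scram", 0.999,
             success=Event("cooling", 0.99),
             failure=Event("backup cooling", 0.9))

for path, p in sorted(sequences(tree), key=lambda s: -s[1]):
    print(f"{p:.3e}  " + " -> ".join(path))
```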

  10. Modeling and reliability analysis of three phase z-source AC-AC converter

    Directory of Open Access Journals (Sweden)

    Prasad Hanuman

    2017-12-01

    Full Text Available This paper presents small-signal modeling, using the state-space averaging technique, and reliability analysis of a three-phase z-source ac-ac converter. By controlling the shoot-through duty ratio, it can operate in buck-boost mode and maintain the desired output voltage during voltage sag and surge conditions. It has a faster dynamic response and higher efficiency than the traditional voltage regulator. Small-signal analysis derives the different control transfer functions, which leads to the design of a suitable controller for the closed-loop system during supply voltage variations. The closed-loop system of the converter with a PID controller eliminates transients in the output voltage and provides a steady-state regulated output. The proposed model was designed in RT-LAB and executed on a field-programmable gate array (FPGA)-based real-time digital simulator at a fixed time step of 10 μs and a constant switching frequency of 10 kHz. The simulator was developed using the very high speed integrated circuit hardware description language (VHDL), making it versatile and portable. Hardware-in-the-loop (HIL) simulation results are presented to corroborate the MATLAB simulation results during supply voltage variations of the three-phase z-source ac-ac converter. Reliability analysis has been applied to the converter to find the failure rates of its different components.
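
    The z-source ac-ac model itself has more states than fit a snippet, so the sketch below illustrates the same state-space averaging recipe on an ideal boost converter stand-in: average the switch-state matrices with the duty ratio, solve for the operating point, and form the small-signal duty-to-output system (all component values are invented):

```python
import numpy as np
from scipy import signal

L, C, R, Vin, D = 1e-3, 470e-6, 10.0, 24.0, 0.5

# Switch-on (A1) and switch-off (A2) state matrices for x = [iL, vC].
A1 = np.array([[0.0, 0.0], [0.0, -1.0 / (R * C)]])
A2 = np.array([[0.0, -1.0 / L], [1.0 / C, -1.0 / (R * C)]])
B = np.array([[1.0 / L], [0.0]])

A = D * A1 + (1 - D) * A2                      # averaged dynamics
x_op = -np.linalg.solve(A, B * Vin).ravel()    # DC operating point

# Duty-to-output small-signal input matrix: (A1 - A2) x_op enters via d-hat.
Bd = ((A1 - A2) @ x_op).reshape(-1, 1)
sys = signal.StateSpace(A, Bd, np.array([[0.0, 1.0]]), [[0.0]])

w, mag, phase = signal.bode(sys, np.logspace(1, 5, 200))
print(f"Vout = {x_op[1]:.1f} V; low-frequency gain of Gvd = {mag[0]:.1f} dB")
```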

  11. Source Identification of Heavy Metals in Soils Surrounding the Zanjan Zinc Town by Multivariate Statistical Techniques

    Directory of Open Access Journals (Sweden)

    M.A. Delavar

    2016-02-01

    Full Text Available Introduction: The accumulation of heavy metals (HMs) in the soil is of increasing concern due to food safety issues, potential health risks, and detrimental effects on soil ecosystems. HMs may be considered the most important soil pollutants, because they are not biodegradable and their physical movement through the soil profile is relatively limited. Therefore, the root uptake process may provide a big chance for these pollutants to transfer from the surface soil to natural and cultivated plants, which may eventually steer them into human bodies. The general behavior of HMs in the environment, especially their bioavailability in the soil, is influenced by their origin. Hence, source apportionment of HMs may provide essential information for better management of polluted soils, to restrict the entrance of HMs into the human food chain. This paper explores the applicability of multivariate statistical techniques in the identification of probable sources that can control the concentration and distribution of selected HMs in the soils surrounding the Zanjan Zinc Specialized Industrial Town (briefly, Zinc Town). Materials and Methods: The area under investigation has a size of approximately 4000 ha. It is located around the Zinc Town, Zanjan province. A regular grid sampling pattern with an interval of 500 meters was applied to identify the sample locations, and 184 topsoil samples (0-10 cm) were collected. The soil samples were air-dried, sieved through a 2 mm polyethylene sieve and then digested using HNO3. The total concentrations of zinc (Zn), lead (Pb), cadmium (Cd), nickel (Ni) and copper (Cu) in the soil solutions were determined via Atomic Absorption Spectroscopy (AAS). Data were statistically analyzed using the SPSS software version 17.0 for Windows. Correlation Matrix (CM), Principal Component Analysis (PCA) and Factor Analysis (FA) techniques were performed in order to identify the probable sources of HMs in the studied soils. Results and
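
    The statistical pipeline (standardize, correlate, extract components) is generic enough to sketch with scikit-learn; the data below are synthetic stand-ins with an assumed 'industrial' Zn-Pb-Cd factor and a 'geogenic' Ni-Cu factor, not the paper's measurements:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

rng = np.random.default_rng(8)
n = 184                                        # one row per topsoil sample

# Synthetic stand-in for the measured concentrations (mg/kg).
industrial = rng.lognormal(0, 1, n)            # hypothetical smelter factor
geogenic = rng.lognormal(0, 0.5, n)            # hypothetical parent-material factor
data = np.column_stack([
    50 + 400 * industrial,                     # Zn
    20 + 150 * industrial,                     # Pb
    0.2 + 3 * industrial,                      # Cd
    25 + 15 * geogenic,                        # Ni
    15 + 10 * geogenic,                        # Cu
]) * rng.lognormal(0, 0.1, (n, 5))             # multiplicative measurement noise

Z = StandardScaler().fit_transform(data)       # standardize before PCA
pca = PCA(n_components=2).fit(Z)

print("explained variance:", pca.explained_variance_ratio_.round(2))
print("loadings (cols = Zn Pb Cd Ni Cu):")
print(pca.components_.round(2))                # Zn/Pb/Cd load together on one PC
```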

  12. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human error, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  13. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess an operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for the 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  14. Multi elemental analysis of indigenous food spices in Southern Ethiopia using INAA technique

    International Nuclear Information System (INIS)

    Melkegna, T.H.; Chaubey, A.K.; Beyene, G.A.; Bitewlign, T.A.

    2017-01-01

    The objective of this study is the quantitative and qualitative analysis of essential and trace elements in four indigenous Ethiopian spices and herbs using the instrumental neutron activation analysis technique. Results were obtained for 16 elements: the major elements Mg, Cl, and K; the minor elements Na, Fe, Mn, Zn, and Br; while Al, V, Sm, Sc, La, Ba, Eu, and Rb were found in trace amounts. The spices Affromumom korarima and Lippa Adonesis var. Koseret sebsebe were found to be very good sources of essential trace elements like Fe, Zn and Mn. The highest concentration of Mg was found in Ajowan, whereas the highest K and Fe concentrations were measured in Coriander seeds. The average daily dietary intake of some essential elements from the use of these spices was found to be below the upper limit recommended by the WHO. (author)

  15. Heat transfer monitoring by means of the hot wire technique and finite element analysis software.

    Science.gov (United States)

    Hernández Wong, J; Suarez, V; Guarachi, J; Calderón, A; Rojas-Trigos, J B; Juárez, A G; Marín, E

    2014-01-01

    The study of radial heat transfer in a homogeneous and isotropic substance with a linear heat source along its axial axis is reported. For this purpose, the hot wire characterization technique has been used in order to obtain the temperature distribution as a function of radial distance from the axial axis and of exposure time. Also, the solution of the transient heat transport equation for this problem was obtained under appropriate boundary conditions by means of the finite element technique. A comparison between experimental, conventional theoretical model and numerically simulated results is made to demonstrate the utility of the finite element analysis simulation methodology in the investigation of the thermal response of substances. Copyright © 2013 Elsevier Ltd. All rights reserved.
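
    For reference, the conventional theoretical model for the hot-wire method is the classical infinite line-source solution; a minimal sketch of it, assuming constant properties and an ideal line source, with illustrative parameter values:

    ```python
    # Classical transient line-source temperature rise used in hot-wire analysis:
    #   dT(r, t) = (q / (4*pi*k)) * E1(r^2 / (4*alpha*t))
    # q: power per unit length [W/m], k: thermal conductivity [W/(m K)],
    # alpha: thermal diffusivity [m^2/s]. Values below are illustrative only.
    import numpy as np
    from scipy.special import exp1  # exponential integral E1

    def line_source_dT(r, t, q=1.0, k=0.6, alpha=1.4e-7):
        """Temperature rise at radius r [m] and time t [s] from an ideal line source."""
        return q / (4.0 * np.pi * k) * exp1(r**2 / (4.0 * alpha * t))

    t = np.linspace(1.0, 60.0, 5)            # exposure times [s]
    for radius in (1e-3, 5e-3):              # radial distances [m]
        print(radius, line_source_dT(radius, t))
    ```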

  16. Radiotracer and sealed source techniques for sediment management. Report of the consultants meeting

    International Nuclear Information System (INIS)

    2008-01-01

    Radioisotopes used as tracers and sealed sources have been a useful and often irreplaceable tool for sediment transport studies. Gamma scattering and transmission gauges are used for sediment monitoring. Computational fluid dynamics (CFD) modelling is now an essential tool for the management of natural systems and is increasingly used to study the fate and behaviour of particulates and contaminants. Radiotracer techniques are often employed to validate CFD models and thus enhance confidence in their predictive value. Experimental tracing and numerical modelling are complementary methods of studying complex systems. During the last few decades, many radiotracer studies for the investigation of sediment transport in natural systems have been conducted worldwide, and various techniques for tracing and monitoring sediment have been developed by individual tracer groups. However, the developed techniques and methods for sediment tracing have not yet been compiled as a technical document, which is essential for the preservation of the knowledge and the transfer of the technology to developing countries. Standard procedures or guidelines for the tracer experiments, which are vital for the reliability of the experiments and their acceptance by end-users, have not been established by the international tracer community either. The use of radiotracers in sediment transport studies demands the additional attention of the community to further develop these techniques and to ensure their transfer to developing countries. The Consultants' Meeting on 'Radiotracer and sealed source techniques for sediment management' was convened at the headquarters of the International Atomic Energy Agency (IAEA) in Vienna, Austria, from 21 to 25 April 2008. Experts from Argentina, Brazil, France, India, the Republic of Korea and the United Kingdom were invited to discuss the current status of the tracer and nucleonic gauge technologies as applied to sediment transport investigations and to evaluate

  17. Radiotracer and sealed source techniques for sediment management. Report of the consultants meeting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2008-07-01

    Radioisotopes used as tracers and sealed sources have been a useful and often irreplaceable tool for sediment transport studies. Gamma scattering and transmission gauges are used for sediment monitoring. Computational fluid dynamics (CFD) modelling is now an essential tool for the management of natural systems and is increasingly used to study the fate and behaviour of particulates and contaminants. Radiotracer techniques are often employed to validate CFD models and thus enhance confidence in their predictive value. Experimental tracing and numerical modelling are complementary methods of studying complex systems. During the last few decades, many radiotracer studies for the investigation of sediment transport in natural systems have been conducted worldwide, and various techniques for tracing and monitoring sediment have been developed by individual tracer groups. However, the developed techniques and methods for sediment tracing have not yet been compiled as a technical document, which is essential for the preservation of the knowledge and the transfer of the technology to developing countries. Standard procedures or guidelines for the tracer experiments, which are vital for the reliability of the experiments and their acceptance by end-users, have not been established by the international tracer community either. The use of radiotracers in sediment transport studies demands the additional attention of the community to further develop these techniques and to ensure their transfer to developing countries. The Consultants' Meeting on 'Radiotracer and sealed source techniques for sediment management' was convened at the headquarters of the International Atomic Energy Agency (IAEA) in Vienna, Austria, from 21 to 25 April 2008. Experts from Argentina, Brazil, France, India, the Republic of Korea and the United Kingdom were invited to discuss the current status of the tracer and nucleonic gauge technologies as applied to sediment transport investigations and to evaluate

  18. Environmental forensic principles for source allocation of polycyclic aromatic hydrocarbons

    International Nuclear Information System (INIS)

    O'Sullivan, G.; Martin, E.; Sandau, C.D.

    2008-01-01

    Polycyclic aromatic hydrocarbons (PAHs) are organic compounds consisting only of carbon and hydrogen, with a fused ring structure containing at least two six-sided benzene rings, possibly together with additional fused rings that are not six-sided. The environmental forensic principles for source allocation of PAHs were examined in this presentation. Specifically, the presentation addressed the structure and physiochemical properties of PAHs; sources and sinks; fate and behaviour; analytical techniques; conventional source identification techniques; and toxic equivalent fingerprinting. It presented a case study in which residents had allegedly been exposed to dioxins, PAHs and metals released from a railroad tie treatment plant. PAHs are commonly classified by origin as biogenic, petrogenic or pyrogenic, a classification governed by their formation conditions. A number of techniques were applied, including chemical fingerprinting; molecular diagnostic ratios; cluster analysis; principal component analysis; and TEF fingerprinting. These techniques showed that the suspected impacted sites do not all share similar PAH signatures, indicating the potential for various sources. Several sites shared signatures similar to background locations. tabs., figs
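
    Molecular diagnostic ratios of the kind mentioned in this record are simple to compute; a hedged sketch using the widely cited fluoranthene/pyrene ratio — the threshold values follow common literature conventions, and the concentrations are invented:

    ```python
    # Molecular diagnostic ratio screening for PAH source allocation.
    # Fl/(Fl+Py) thresholds follow commonly cited literature conventions
    # (< 0.4 petrogenic; 0.4-0.5 petroleum combustion; > 0.5 biomass/coal
    # combustion). Sample concentrations (ng/g) are invented for illustration.
    samples = {
        "site_A": {"fluoranthene": 120.0, "pyrene": 95.0},
        "site_B": {"fluoranthene": 30.0, "pyrene": 70.0},
    }

    def fl_over_fl_plus_py(conc):
        fl, py = conc["fluoranthene"], conc["pyrene"]
        return fl / (fl + py)

    for name, conc in samples.items():
        r = fl_over_fl_plus_py(conc)
        if r < 0.4:
            origin = "petrogenic"
        elif r <= 0.5:
            origin = "petroleum combustion"
        else:
            origin = "biomass/coal combustion (pyrogenic)"
        print(f"{name}: Fl/(Fl+Py) = {r:.2f} -> {origin}")
    ```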

  19. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g., the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable, as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fibre-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
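
    The electron-temperature estimate underlying such monitoring can be illustrated with the textbook two-line Boltzmann ratio (a generic method, not the paper's LPO pipeline); the transition constants below are placeholders, not real spectroscopic line data:

    ```python
    # Two-line Boltzmann method for plasma electron temperature (LTE assumed):
    #   I1/I2 = (A1*g1/lambda1) / (A2*g2/lambda2) * exp(-(E1 - E2) / (k*T))
    # solved for T:
    #   T = (E2 - E1) / (k * ln(I1*A2*g2*lambda1 / (I2*A1*g1*lambda2)))
    # The transition constants below are placeholders, not real line data.
    import math

    K_B_EV = 8.617333262e-5  # Boltzmann constant [eV/K]

    def boltzmann_two_line_T(I1, I2, line1, line2):
        """Each line is (wavelength_nm, A_coefficient, g, E_upper_eV)."""
        lam1, A1, g1, E1 = line1
        lam2, A2, g2, E2 = line2
        ratio = (I1 * A2 * g2 * lam1) / (I2 * A1 * g1 * lam2)
        return (E2 - E1) / (K_B_EV * math.log(ratio))

    # Placeholder lines and measured peak intensities:
    T = boltzmann_two_line_T(I1=1500.0, I2=900.0,
                             line1=(696.5, 6.4e6, 3, 13.08),
                             line2=(738.4, 8.5e6, 5, 13.33))
    print(f"Estimated electron temperature: {T:.0f} K")
    ```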

  20. Spectrography analysis of stainless steel by the point to point technique

    International Nuclear Information System (INIS)

    Bona, A.

    1986-01-01

    A method is presented for the determination of the elements Ni, Cr, Mn, Si, Mo, Nb, Cu, Co and V in stainless steel by emission spectrographic analysis using high-voltage spark sources. The 'point-to-point' technique is employed. The experimental parameters were optimized as a compromise between detection sensitivity and measurement precision. The parameters investigated were the high-voltage capacitance, the inductance, the analytical and auxiliary gaps, the pre-burn spark period and the exposure time. The edge shape of the counter electrodes and the type of polishing and diameter variation of the stainless steel electrodes were evaluated in preliminary assays. In addition, the degradation of the chemical power of the developer was investigated. Counter electrodes of graphite, copper, aluminium and iron were employed, with the counter electrode itself used as an internal standard. In the case of graphite counter electrodes, the iron lines were employed as the internal standard. The relative errors were the criteria for evaluation of these experiments. The National Bureau of Standards certified reference stainless steel standards and the Eletrometal Acos Finos S.A. samples (certified by the supplier) were employed for constructing the calibration systems and analytical curves. The best results were obtained using the conventional graphite counter electrodes. The inaccuracy and the imprecision of the proposed method varied from 2% to 15% and from 1% to 9%, respectively. The present technique was compared to other instrumental techniques such as inductively coupled plasma, X-ray fluorescence and neutron activation analysis. The advantages and disadvantages for each case were discussed. (author) [pt

  1. Analysis of geological material and especially ores by means of a 252Cf source

    International Nuclear Information System (INIS)

    Barrandon, J.N.; Borderie, B.; Melky, S.; Halfon, J.; Marce, A.

    1976-01-01

    Tests were made on the possibilities for analysis by 252Cf activation in the earth sciences and mining research. The results obtained show that, while 252Cf activation can only resolve certain very specific geochemical research problems, it does allow the exact and rapid determination of numerous elements whose ores are of great economic importance, such as fluorine, titanium, vanadium, manganese, copper, antimony, barium, and tungsten. The utilization of activation analysis methods in the earth sciences is not a recent phenomenon. It has generally been limited to the analysis of traces in relatively small volumes by means of irradiation in nuclear reactors. Traditional neutron sources were little used and were not very applicable. The development of 252Cf isotopic sources emitting more intense neutron fluxes makes it possible to consider carrying out more sensitive determinations without making use of a nuclear reactor. In addition, this technique can be adapted for in situ analysis in mines and mine borings. Our work, which is centered upon the possibilities of instrumental laboratory analyses of geological materials through 252Cf activation, is oriented in two principal directions: the study of the experimental sensitivities of the various elements in different rocks with the usual compositions, and the study of the possibilities for routine ore analyses

  2. Source Location of Noble Gas Plumes

    International Nuclear Information System (INIS)

    Hoffman, I.; Ungar, K.; Bourgouin, P.; Yee, E.; Wotawa, G.

    2015-01-01

    In radionuclide monitoring, one of the most significant challenges from a verification or surveillance perspective is the source location problem. Modern monitoring/surveillance systems employ meteorological source reconstruction — for example, for the Fukushima accident, CRL emissions analysis and even radon risk mapping. These studies usually take weeks to months to conduct, involving multidisciplinary teams representing meteorology; dispersion modelling; radionuclide sampling and metrology; and, when relevant, proper representation of source characteristics (e.g., reactor engineering expertise). Several different approaches have been tried in an attempt to determine useful techniques to apply to the source location problem and to develop rigorous methods that combine all potentially relevant observations and models to identify a most probable source location and size, with uncertainties. The ultimate goal is to understand the utility and limitations of these techniques so they can transition from R&D to operational tools. (author)
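
    One common ingredient of such meteorological source reconstruction is a linear source-receptor inversion; a minimal, hypothetical sketch — the transport matrix here is random, standing in for real dispersion-model output:

    ```python
    # Toy source-receptor inversion: observations y ~ M @ s, where M[i, j] is
    # the modelled dilution from candidate source j to detector sample i
    # (normally supplied by an atmospheric dispersion model; random numbers
    # stand in here). Non-negative least squares keeps release rates physical.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)
    n_obs, n_sources = 40, 6
    M = rng.uniform(0.0, 1.0, size=(n_obs, n_sources))

    s_true = np.zeros(n_sources)
    s_true[2] = 5.0                                       # one active source
    y = M @ s_true + rng.normal(0.0, 0.05, size=n_obs)    # noisy detections

    s_est, _residual = nnls(M, y)
    print("estimated release rates:", np.round(s_est, 2))
    ```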

  3. Source localization analysis using seismic noise data acquired in exploration geophysics

    Science.gov (United States)

    Roux, P.; Corciulo, M.; Campillo, M.; Dubuq, D.

    2011-12-01

    Passive monitoring using seismic noise data is attracting growing interest at the exploration scale. Recent studies have demonstrated source localization capability using seismic noise cross-correlation at observation scales ranging from hundreds of kilometers down to meters. In the context of exploration geophysics, classical localization methods using travel-time picking fail when no evident first arrivals can be detected. Likewise, methods based on the intensity decrease as a function of distance to the source also fail when the noise intensity decay becomes more complicated than the power law expected from geometrical spreading. We propose here an automatic procedure developed in ocean acoustics that permits iterative localization of the dominant and secondary noise sources. The Matched-Field Processing (MFP) technique is based on the spatial coherence of raw noise signals acquired on a dense array of receivers in order to produce high-resolution source localizations. The standard MFP algorithm locates the dominant noise source by matching the seismic noise Cross-Spectral Density Matrix (CSDM) with the equivalent CSDM calculated from a model and a surrogate source position that scans each position of a 3D grid below the array of seismic sensors. However, at the exploration scale, the background noise is mostly dominated by surface noise sources related to human activities (roads, industrial platforms, ...), whose localization is of no interest for the monitoring of the hydrocarbon reservoir. In other words, the dominant noise sources mask lower-amplitude noise sources associated with the extraction process (in the volume), whose location is therefore difficult to obtain through the standard MFP technique. Multi-Rate Adaptive Beamforming (MRABF) is a further improvement of the MFP technique that permits localization of low-amplitude secondary noise sources using a projector matrix calculated from the eigenvalue decomposition of the CSDM. The MRABF approach aims at cancelling the contributions of
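
    A sketch of the standard (Bartlett) matched-field step described in this record, assuming synthetic free-space replicas and a 1D search line rather than the authors' 3D grid and velocity model:

    ```python
    # Bartlett matched-field processing sketch: B(r) = w(r)^H K w(r), with K
    # the cross-spectral density matrix (CSDM) and w(r) a unit-norm replica
    # vector. Free-space spherical-spreading replicas are assumed here.
    import numpy as np

    c, f = 2000.0, 30.0                      # velocity [m/s], frequency [Hz]
    k_wave = 2 * np.pi * f / c
    sensors = np.linspace(0.0, 500.0, 24)    # receiver positions [m]
    src = 180.0                              # true source position [m]

    def replica(x):
        d = np.abs(sensors - x) + 1e-6
        w = np.exp(-1j * k_wave * d) / d     # free-space Green's function
        return w / np.linalg.norm(w)

    # Build the CSDM from noisy snapshots of the field radiated by the source.
    rng = np.random.default_rng(2)
    snapshots = [replica(src) * np.exp(2j * np.pi * rng.random())
                 + 0.05 * (rng.normal(size=24) + 1j * rng.normal(size=24))
                 for _ in range(50)]
    K = sum(np.outer(s, s.conj()) for s in snapshots) / 50

    grid = np.linspace(0.0, 500.0, 251)
    ambiguity = [np.real(replica(x).conj() @ K @ replica(x)) for x in grid]
    print("MFP peak at x =", grid[int(np.argmax(ambiguity))], "m")
    ```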

  4. Solving the forward problem in EEG source analysis by spherical and fdm head modeling: a comparative analysis - biomed 2009

    NARCIS (Netherlands)

    Vatta, F.; Meneghini, F.; Esposito, F.; Mininel, S.; Di Salle, F.

    2009-01-01

    Neural source localization techniques based on electroencephalography (EEG) use scalp potential data to infer the location of underlying neural activity. This procedure entails modeling the sources of EEG activity and modeling the head volume conduction process to link the modeled sources to the

  5. Xenon ventilation CT using dual-source and dual-energy technique in children with bronchiolitis obliterans: correlation of xenon and CT density values with pulmonary function test results

    International Nuclear Information System (INIS)

    Goo, Hyun Woo; Yang, Dong Hyun; Seo, Joon Beom; Chae, Eun Jin; Lee, Jeongjin; Hong, Soo-Jong; Yu, Jinho; Kim, Byoung-Ju; Krauss, Bernhard

    2010-01-01

    Xenon ventilation CT using the dual-source and dual-energy technique is a recently introduced, promising functional lung imaging method. To expand its clinical applications, evidence of the additional diagnostic value of xenon ventilation CT over conventional chest CT is required. The aim was to evaluate the usefulness of xenon ventilation CT using the dual-source and dual-energy technique in children with bronchiolitis obliterans (BO). Seventeen children (age 7-18 years; 11 boys) with BO underwent xenon ventilation CT using the dual-source and dual-energy technique. Xenon and CT density values were measured in normal and hyperlucent lung regions on CT and were compared between the two regions. Volumes of hyperlucent regions and ventilation defects were calculated with thresholds determined by visual and histogram-based analysis. Indexed volumes of hyperlucent lung regions and ventilation defects were correlated with pulmonary function test results. Effective doses of xenon CT were calculated. Xenon values differed between the two regions (14.6 ± 6.4 HU vs 26.1 ± 6.5 HU), and the indexed volumes correlated with pulmonary function test results, including FEF25-75 (γ = -0.68-0.88, P ≤ 0.002). Volume percentages of xenon ventilation defects (35.0 ± 16.4%) were not significantly different from those of hyperlucent lung regions (38.2 ± 18.6%). However, mismatches between the volume percentages were variable, up to 21.4-33.3%. The mean effective dose of xenon CT was 1.9 ± 0.5 mSv. In addition to high-resolution anatomic information, xenon ventilation CT using the dual-source and dual-energy technique accurately demonstrates impaired regional ventilation and its heterogeneity in children with BO without additional radiation exposure. (orig.)

  6. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…
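
    One of the standard criteria evaluated in this literature is Horn's parallel analysis; a generic sketch (not the article's own simulation design), applicable whether rows are subjects (R-technique) or repeated occasions from one subject (P-technique):

    ```python
    # Horn's parallel analysis: retain factors whose observed correlation-matrix
    # eigenvalues exceed the 95th percentile of eigenvalues from random data of
    # the same shape. Toy data: 300 occasions, 6 variables, 2 latent factors.
    import numpy as np

    def parallel_analysis(X, n_iter=200, quantile=95, seed=0):
        rng = np.random.default_rng(seed)
        n, p = X.shape
        obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
        rand = np.empty((n_iter, p))
        for i in range(n_iter):
            R = rng.normal(size=(n, p))
            rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(R, rowvar=False)))[::-1]
        threshold = np.percentile(rand, quantile, axis=0)
        return int(np.sum(obs > threshold))

    rng = np.random.default_rng(1)
    F = rng.normal(size=(300, 2))                       # 2 latent factor series
    X = F @ rng.normal(size=(2, 6)) + 0.7 * rng.normal(size=(300, 6))
    print("factors retained:", parallel_analysis(X))
    ```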

  7. Security analysis of an untrusted source for quantum key distribution: passive approach

    International Nuclear Information System (INIS)

    Zhao Yi; Qi Bing; Lo, H-K; Qian Li

    2010-01-01

    We present a passive approach to the security analysis of quantum key distribution (QKD) with an untrusted source. A complete proof of its unconditional security is also presented. This scheme has significant advantages in real-life implementations as it does not require fast optical switching or a quantum random number generator. The essential idea is to use a beam splitter to split each input pulse. We show that we can characterize the source using a cross-estimate technique without active routing of each pulse. We have derived analytical expressions for the passive estimation scheme. Moreover, using simulations, we have considered four real-life imperfections: additional loss introduced by the 'plug and play' structure, inefficiency of the intensity monitor, noise of the intensity monitor, and statistical fluctuation introduced by finite data size. Our simulation results show that the passive estimate of an untrusted source remains useful in practice, despite these four imperfections. Also, we have performed preliminary experiments, confirming the utility of our proposal in real-life applications. Our proposal makes it possible to implement the 'plug and play' QKD with the security guaranteed, while keeping the implementation practical.

  8. Rapid development of medical imaging tools with open-source libraries.

    Science.gov (United States)

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer is now considered standard in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization, and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  9. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  10. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  11. The Common Technique for Analyzing the Financial Results Report

    Directory of Open Access Journals (Sweden)

    Pasternak Maria M.

    2017-04-01

    Full Text Available The article is aimed at generalizing the theoretical approaches to the structure and elements of a technique for analyzing the Financial results report (Cumulative income report) and at providing suggestions for its improvement. The current methods have been analyzed, and the relevance of applying a common technique for such analysis has been substantiated. A common technique for analyzing the Financial results report has been proposed, which includes definition of the objectives and tasks of the analysis, its subjects and objects, and the sources of its information. The stages of such an analysis are identified and described. The findings of the article can be used to theoretically substantiate and practically develop a technique for analyzing the Financial results report in the branches of the Ukrainian economy.

  12. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  13. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
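
    Of the domains catalogued in this review, the statistical one is the simplest to illustrate; a sketch of two common time-domain heart-rate variability measures, computed on a synthetic RR-interval series rather than clinical data:

    ```python
    # Two statistical-domain variability measures often applied to RR-interval
    # series in critical care: SDNN (overall variability) and RMSSD (short-term,
    # beat-to-beat variability). The RR series below is synthetic.
    import numpy as np

    rng = np.random.default_rng(3)
    rr = 0.8 + 0.05 * np.sin(np.linspace(0, 20, 500)) + 0.02 * rng.normal(size=500)

    sdnn = np.std(rr, ddof=1)                      # overall spread [s]
    rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))     # successive differences [s]
    print(f"SDNN  = {sdnn * 1000:.1f} ms")
    print(f"RMSSD = {rmssd * 1000:.1f} ms")
    ```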

  14. swLORETA: a novel approach to robust source localization and synchronization tomography

    International Nuclear Information System (INIS)

    Palmero-Soler, Ernesto; Dolan, Kevin; Hadamschek, Volker; Tass, Peter A

    2007-01-01

    Standardized low-resolution brain electromagnetic tomography (sLORETA) is a widely used technique for source localization. However, this technique still has some limitations, especially under realistic noisy conditions and in the case of deep sources. To overcome these problems, we present here swLORETA, an improved version of sLORETA, obtained by incorporating a singular value decomposition-based lead field weighting. We show that the precision of the source localization can further be improved by a tomographic phase synchronization analysis based on swLORETA. The phase synchronization analysis turns out to be superior to a standard linear coherence analysis, since the latter cannot distinguish between real phase locking and signal mixing

  15. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method, with the velocity components defined over an Eulerian mesh. A system of massless interface markers is defined, where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall, which is important in reactor subchannel analysis

  16. Laser Scanning Systems and Techniques in Rockfall Source Identification and Risk Assessment: A Critical Review

    Science.gov (United States)

    Fanos, Ali Mutar; Pradhan, Biswajeet

    2018-04-01

    Rockfall poses a risk to people, their properties and transportation routes in mountainous and hilly regions. This hazard shows various characteristics, such as wide distribution, sudden occurrence, variable magnitude, high lethality and randomness. Therefore, prediction of the rockfall phenomenon both spatially and temporally is a challenging task. The Digital Terrain Model (DTM) is one of the most significant elements in rockfall source identification and risk assessment. Light detection and ranging (LiDAR) is the most advanced effective technique to derive high-resolution and accurate DTMs. This paper presents a critical overview of the rockfall phenomenon (definition, triggering factors, motion modes and modeling) and of the LiDAR technique in terms of data pre-processing, DTM generation and the factors that can be obtained from this technique for rockfall source identification and risk assessment. It also reviews the existing methods that are utilized for the evaluation of rockfall trajectories and their characteristics (frequency, velocity, bouncing height and kinetic energy), probability, susceptibility, hazard and risk. Detailed consideration is given to quantitative methodologies in addition to qualitative ones. Various methods are demonstrated with respect to their application scales (local and regional). Additionally, attention is given to the latest improvements, particularly the consideration of the intensity of the phenomena and the magnitude of the events at chosen sites.

  17. Source modelling in seismic risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Yucemen, M.S.

    1978-12-01

    The proposed probabilistic procedure provides a consistent method for the modelling, analysis and updating of uncertainties that are involved in the seismic risk analysis for nuclear power plants. The potential earthquake activity zones are idealized as point, line or area sources. For these seismic source types, expressions to evaluate their contribution to seismic risk are derived, considering all the possible site-source configurations. The seismic risk at a site is found to depend not only on the inherent randomness of the earthquake occurrences with respect to magnitude, time and space, but also on the uncertainties associated with the predicted values of the seismic and geometric parameters, as well as the uncertainty in the attenuation model. The uncertainty due to the attenuation equation is incorporated into the analysis through the use of random correction factors. The influence of the uncertainty resulting from insufficient information on the seismic parameters and source geometry is introduced into the analysis by computing a mean risk curve averaged over the various alternative assumptions on the parameters and source geometry. A seismic risk analysis is carried out for the city of Denizli, which is located in the seismically most active zone of Turkey. The second analysis is for Akkuyu
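
    The backbone of such an analysis is usually a Poisson occurrence model combined with a magnitude-recurrence law; a minimal generic sketch — the Gutenberg-Richter parameters below are invented, not values from this study:

    ```python
    # Minimal probabilistic seismic hazard sketch: Poisson occurrences with a
    # truncated Gutenberg-Richter magnitude law. Parameter values are invented.
    import math

    a, b = 4.0, 1.0          # G-R law: log10 N(>=m) = a - b*m  (per year)
    m_max = 7.5              # upper magnitude bound for the source zone

    def annual_rate_above(m):
        """Annual rate of events with magnitude >= m (truncated G-R)."""
        if m >= m_max:
            return 0.0
        n = lambda x: 10.0 ** (a - b * x)
        return n(m) - n(m_max)

    def exceedance_probability(m, years):
        """P(at least one event >= m within the window), Poisson assumption."""
        return 1.0 - math.exp(-annual_rate_above(m) * years)

    for m in (5.5, 6.0, 6.5, 7.0):
        print(f"M>={m}: 50-yr exceedance probability = "
              f"{exceedance_probability(m, 50):.2%}")
    ```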

  18. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative method to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  19. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
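
    Both protocols compared in that study reduce to simple arithmetic on paired ratings; a generic sketch with invented survey items and scores:

    ```python
    # Importance-performance (IP) and gap-score (GA) analysis on paired ratings.
    # Each item has mean importance and performance scores on the same scale;
    # the data below are invented survey results.
    items = {
        "cleanliness": {"importance": 4.6, "performance": 3.9},
        "signage":     {"importance": 3.8, "performance": 4.2},
        "staff":       {"importance": 4.4, "performance": 4.5},
    }

    imp_mid = sum(v["importance"] for v in items.values()) / len(items)
    perf_mid = sum(v["performance"] for v in items.values()) / len(items)

    for name, v in items.items():
        gap = v["performance"] - v["importance"]   # GA: negative = shortfall
        if v["importance"] >= imp_mid and v["performance"] < perf_mid:
            quadrant = "concentrate here"          # classic IP grid labels
        elif v["importance"] >= imp_mid:
            quadrant = "keep up the good work"
        elif v["performance"] < perf_mid:
            quadrant = "low priority"
        else:
            quadrant = "possible overkill"
        print(f"{name:12s} gap={gap:+.1f}  IP quadrant: {quadrant}")
    ```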

  20. A Versatile Integrated Ambient Ionization Source Platform

    Science.gov (United States)

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-01

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines the flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption/ionization (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples. [Figure not available: see fulltext.]

  1. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  2. A new technique for thick source alpha counting determination of U and Th

    CERN Document Server

    Michael, C T

    2000-01-01

    A new technique for the calculation of U and Th concentrations is presented, based on the alpha particle spectrum taken from a thick sample using a silicon detector. Four approaches to the analysis of the experimental data are presented, one being an improvement on the known pairs technique. With the proposed technique it is possible to calculate the concentrations of certain daughter nuclides in the two series, or the sum of the activity concentrations of others. This allows the detection of secular disequilibrium in the samples. The technique also has the advantage of being more accurate, and it provides the opportunity to cross-check the results derived from the different approaches.

  3. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  4. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
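
    The kind of experiment described here — measuring decryption error as keys are perturbed — can be mocked up for a one-dimensional double random phase encoding; a toy sketch, not the authors' 2D optical simulation:

    ```python
    # Toy 1D double random phase encoding (DRPE): encrypt with two random phase
    # masks, then measure decryption error as the second key is perturbed.
    # Illustrates the key-space idea only.
    import numpy as np

    rng = np.random.default_rng(4)
    N = 256
    img = np.zeros(N)
    img[96:160] = 1.0                                # simple 1D test "image"

    phi1, phi2 = rng.random(N), rng.random(N)        # secret phase keys in [0, 1)
    encrypt = lambda f: np.fft.ifft(np.fft.fft(f * np.exp(2j * np.pi * phi1))
                                    * np.exp(2j * np.pi * phi2))

    def decrypt(cipher, key2):
        spectrum = np.fft.fft(cipher) * np.exp(-2j * np.pi * key2)
        return np.abs(np.fft.ifft(spectrum))         # |f * m1| = f for real f >= 0

    cipher = encrypt(img)
    for noise in (0.0, 0.01, 0.05, 0.2):
        wrong_key = phi2 + noise * rng.standard_normal(N)
        err = np.linalg.norm(decrypt(cipher, wrong_key) - img) / np.linalg.norm(img)
        print(f"key perturbation {noise:4.2f} -> relative error {err:.3f}")
    ```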

  5. Noise source identification for ducted fan systems

    OpenAIRE

    BENNETT, GARETH; FITZPATRICK, JOHN AIDAN

    2008-01-01

    Coherence based source analysis techniques can be used to identify the contribution of combustion noise in the exhaust of a jet engine and hence enable the design of noise reduction devices. However, when the combustion noise propagates in a non-linear fashion, the identified contribution using ordinary coherence methods will be inaccurate. In this paper, an analysis technique to enable the contribution of linear and non-linear mechanisms to the propagated sound ...

  6. Optimization of H.E.S.S. instrumental performances for the analysis of weak gamma-ray sources: Application to the study of HESS J1832-092

    International Nuclear Information System (INIS)

    Laffon, H.

    2012-01-01

    H.E.S.S. (High Energy Stereoscopic System) is an array of very-high-energy gamma-ray telescopes located in Namibia. These telescopes take advantage of the atmospheric Cherenkov technique using stereoscopy, allowing the detection of gamma rays between 100 GeV and a few tens of TeV. The location of the H.E.S.S. telescopes in the Southern hemisphere allows observation of the central parts of our galaxy, the Milky Way. Tens of new gamma-ray sources were thereby discovered thanks to the galactic plane survey strategy. After ten years of fruitful observations with many detections, it is now necessary to improve the detector performance in order to detect new sources, by increasing the sensitivity and improving the angular resolution. The aim of this thesis is the development of advanced analysis techniques allowing sharper analyses. An automatic tool to search for new sources and to improve the subtraction of the background noise is presented. It is optimized for the study of weak sources, which requires a very rigorous analysis. A combined reconstruction method is built in order to improve the angular resolution without reducing the statistics, which is critical for weak sources. These advanced methods are applied to the analysis of a complex region of the galactic plane near the supernova remnant G22.7-0.2, leading to the detection of a new source, HESS J1832-092. Multi-wavelength counterparts are shown and several scenarios are considered to explain the origin of the gamma-ray signal from this astrophysical object. (author)

  7. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    Full Text Available The article presents a technique for the statistical analysis of the investment appeal of a region with respect to foreign direct investment. A definition of the technique of statistical analysis is given, the stages of the analysis are described, and the mathematical-statistical tools are considered.

  8. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  9. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques were established for PWR and CANDU NPPs (Nuclear Power Plants), by which the dynamic characteristics of reactor internals and SPND instrumentations can be identified, and a noise database corresponding to each plant (both Korean and foreign ones) was constructed and compared. Also, the changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulation, can be used to establish a knowledge-based expert system to diagnose abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  10. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques were established for PWR and CANDU NPPs (Nuclear Power Plants), by which the dynamic characteristics of reactor internals and SPND instrumentations can be identified, and a noise database corresponding to each plant (both Korean and foreign ones) was constructed and compared. Also, the changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulation, can be used to establish a knowledge-based expert system to diagnose abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)
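
    Reactor noise analysis of this kind rests on estimating power spectral densities of detector signals and watching resonance peaks drift; a generic sketch with a synthetic signal, not the project's plant data:

    ```python
    # Generic reactor-noise style diagnostic: estimate the power spectral
    # density of a fluctuating detector signal with Welch's method and locate
    # the dominant structural resonance. Signal is synthetic (8 Hz mode + noise).
    import numpy as np
    from scipy.signal import welch

    fs = 200.0                              # sampling rate [Hz]
    t = np.arange(0, 600, 1 / fs)           # 10 minutes of signal
    rng = np.random.default_rng(5)
    signal = 0.4 * np.sin(2 * np.pi * 8.0 * t) + rng.normal(size=t.size)

    f, psd = welch(signal, fs=fs, nperseg=4096)
    peak = f[np.argmax(psd)]
    print(f"dominant resonance near {peak:.2f} Hz")  # a shift here could flag degradation
    ```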

  11. CHEMICAL PROFILES OF HONEYS ORIGINATING FROM DIFFERENT FLORAL SOURCES AND GEOGRAPHIC LOCATIONS EXAMINED BY A COMBINATION OF THREE EXTRACTION AND ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    D. M. Meloncelli

    2015-05-01

    Full Text Available The chemical profiles of Tasmanian Leatherwood and Manuka honeys from Tasmania and New Zealand have been compared by a combination of GC-MS analysis of volatiles and semi-volatiles, RP-HPLC-DAD analysis of phenolics and flavonoids and HPLC-DAD analysis of derivatised dihydroxyacetone, hydroxymethylfurfural and methylglyoxal. This study found that Tasmanian and New Zealand Manuka honeys have high concentrations of methylglyoxal. However, syringic acid was only detected in Manuka honeys grown in New Zealand. The Tasmanian honeys can be distinguished by the higher concentration of 3-phenyllactic acid in Manuka compared to Leatherwood floral sources.

  12. CHEMICAL PROFILES OF HONEYS ORIGINATING FROM DIFFERENT FLORAL SOURCES AND GEOGRAPHIC LOCATIONS EXAMINED BY A COMBINATION OF THREE EXTRACTION AND ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    D. M. Meloncelli

    2015-02-01

    Full Text Available The chemical profiles of Tasmanian Leatherwood and Manuka honeys from Tasmania and New Zealand have been compared by a combination of GC-MS analysis of volatiles and semi-volatiles, RP-HPLC-DAD analysis of phenolics and flavonoids and HPLC-DAD analysis of derivatised dihydroxyacetone, hydroxymethylfurfural and methylglyoxal. This study found that Tasmanian and New Zealand Manuka honeys have high concentrations of methylglyoxal. However, syringic acid was only detected in Manuka honeys grown in New Zealand. The Tasmanian honeys can be distinguished by the higher concentration of 3-phenyllactic acid in Manuka compared to Leatherwood floral sources.

  13. Mechanistic facility safety and source term analysis

    International Nuclear Information System (INIS)

    PLYS, M.G.

    1999-01-01

    A PC-based computer program was created for facility safety and source term analysis at Hanford. The program has been successfully applied to mechanistic prediction of source terms from chemical reactions in underground storage tanks, hydrogen combustion in double-contained receiver tanks, and process evaluation, including the potential for runaway reactions in spent nuclear fuel processing. Model features include user-defined facility rooms, flow path geometry, and heat conductors; user-defined non-ideal vapor and aerosol species; pressure- and density-driven gas flows; aerosol transport and deposition; and structure to accommodate facility-specific source terms. Example applications are presented here

  14. Chemical analysis by nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system.

  15. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y.

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method, and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as the mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, pretreatment of samples, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of NIPS, and applications of the NIPS system

  16. Characterization of Deficiencies in the Frequency Domain Forced Response Analysis Technique for Supersonic Turbine Bladed Disks

    Science.gov (United States)

    Brown, Andrew M.; Schmauch, Preston

    2011-01-01

    Turbine blades in rocket and jet engine turbomachinery experience enormous harmonic loading conditions. These loads result from the integer number of upstream and downstream stator vanes as well as from the other turbine stages. Assessing blade structural integrity is a complex task, requiring an initial characterization of whether resonance is possible and then a forced response analysis if that condition is met. The standard technique for forced response analysis in rocket engines is to decompose a CFD-generated flow field into its harmonic components and to then perform a frequency response analysis at the problematic natural frequencies. Recent CFD analysis and water-flow testing at NASA/MSFC, though, indicate that this technique may miss substantial harmonic and non-harmonic excitation sources that become present in complex flows. A substantial effort has been made to account for this denser spatial Fourier content in frequency response analysis (described in another paper by the author), but the question still remains whether the frequency response analysis itself is capable of capturing the excitation content sufficiently. Two studies have therefore been performed comparing frequency response analysis with transient response analysis of bladed disks undergoing this complex flow environment. The first is of a bladed disk with each blade modeled by simple beam elements. Six loading cases were generated by varying a baseline harmonic excitation in different ways, based upon cold-flow testing from the Heritage Fuel Air Turbine Test. It was hypothesized that the randomness and other variation from the standard harmonic excitation would reduce the blade structural response, but the results showed little reduction. The second study was of a realistic model of a bladed disk excited by the same CFD used in the J2X engine program. It was hypothesized that enforcing periodicity in the CFD (inherent in the frequency response technique) would overestimate the
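
    The frequency response step at the heart of this comparison can be illustrated with a single-mode sketch; a toy model with invented modal parameters, not the NASA analysis:

    ```python
    # Single-degree-of-freedom frequency response sketch for a blade mode
    # forced at an engine-order harmonic: X(w) = F0 / (k - m*w^2 + i*c*w).
    # Modal parameters and forcing amplitude are invented.
    import numpy as np

    m, k = 0.02, 8.0e6                    # modal mass [kg], stiffness [N/m]
    zeta = 0.002                          # damping ratio (lightly damped blade)
    wn = np.sqrt(k / m)                   # natural frequency [rad/s]
    c = 2.0 * zeta * np.sqrt(k * m)
    F0 = 10.0                             # harmonic force amplitude [N]

    w = np.linspace(0.8 * wn, 1.2 * wn, 2000)
    X = F0 / (k - m * w**2 + 1j * c * w)  # complex displacement response

    i_pk = np.argmax(np.abs(X))
    print(f"natural frequency : {wn / (2 * np.pi):.0f} Hz")
    print(f"peak response     : {abs(X[i_pk]) * 1e3:.3f} mm "
          f"at {w[i_pk] / (2 * np.pi):.0f} Hz")
    ```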

  17. OSSMETER D3.4 – Language-Specific Source Code Quality Analysis

    NARCIS (Netherlands)

    J.J. Vinju (Jurgen); A. Shahi (Ashim); H.J.S. Basten (Bas)

    2014-01-01

    This deliverable is part of WP3: Source Code Quality and Activity Analysis. It provides descriptions and prototypes of the tools that are needed for source code quality analysis in open source software projects. It builds upon the results of: • Deliverable 3.1 where infrastructure and

  18. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature is described. The vision instruments for food analysis as well as datasets of the food items… used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm… (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, dairies, fruits…

  19. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limits their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  20. Galvanically Isolated Quasi-Z-Source DC–DC Converter With a Novel ZVS and ZCS Technique

    DEFF Research Database (Denmark)

    Husev, Oleksandr; Liivik, Liisa; Blaabjerg, Frede

    2015-01-01

    This paper focuses on the galvanically isolated quasi-Z-source dc-dc converter with a novel zero voltage and zero current switching technique. The unique feature of the impedance network lies in combining the buck-boost operation capability with the short- and open-circuit immunity of transistors…; at the same time, it can perform zero voltage and zero current switching on the primary side. The boundary conduction mode of the current in the second inductor of the quasi-Z-source network was used along with snubber capacitors in two of the four transistors and a special control algorithm to achieve…

  1. Relationship of Source Selection Methods to Contract Outcomes: an Analysis of Air Force Source Selection

    Science.gov (United States)

    2015-12-01

    Relationship of Source Selection Methods to Contract Outcomes: An Analysis of Air Force Source Selection, December 2015, Capt Jacques Lamoureux, USAF. The study focuses on the contract management process, with special emphasis on the source selection methods of tradeoff and lowest price technically acceptable (LPTA). … On some occasions, performance is terminated early; this can occur due to either mutual agreement or a breach of contract by one of the parties (Garrett…).

  2. Determination of volatile organic compounds pollution sources in malaysian drinking water using multivariate analysis.

    Science.gov (United States)

    Soh, Shiau-Chian; Abdullah, Md Pauzi

    2007-01-01

    A field investigation was conducted at all water treatment plants throughout 11 states and Federal Territory in Peninsular Malaysia. The sampling points in this study include treatment plant operation, service reservoir outlet and auxiliary outlet point at the water pipelines. Analysis was performed by solid phase micro-extraction technique with a 100 microm polydimethylsiloxane fibre using gas chromatography with mass spectrometry detection to analyse 54 volatile organic compounds (VOCs) of different chemical families in drinking water. The concentration of VOCs ranged from undetectable to 230.2 microg/l. Among all of the VOCs species, chloroform has the highest concentration and was detected in all drinking water samples. Average concentrations of total trihalomethanes (THMs) were almost similar among all states which were in the range of 28.4-33.0 microg/l. Apart from THMs, other abundant compounds detected were cis and trans-1,2-dichloroethylene, trichloroethylene, 1,2-dibromoethane, benzene, toluene, ethylbenzene, chlorobenzene, 1,4-dichlorobenzene and 1,2-dichlorobenzene. Principal component analysis (PCA) with the aid of varimax rotation, and parallel factor analysis (PARAFAC) method were used to statistically verify the correlation between VOCs and the source of pollution. The multivariate analysis pointed out that the maintenance of auxiliary pipelines in the distribution systems is vital as it can become significant point source pollution to Malaysian drinking water.
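
    A minimal sketch of the PCA step described above, using plain numpy on a hypothetical (sampling points x VOC species) concentration matrix; the data, species count and interpretation comments are assumptions, not the study's values.

      import numpy as np

      rng = np.random.default_rng(0)
      # Hypothetical matrix: rows = sampling points, columns = VOC species (ug/L)
      X = rng.lognormal(mean=1.0, sigma=0.5, size=(40, 10))

      # Standardize each species, then PCA via SVD of the centered data
      Z = (X - X.mean(axis=0)) / X.std(axis=0)
      U, s, Vt = np.linalg.svd(Z, full_matrices=False)

      explained = s**2 / np.sum(s**2)            # variance explained per component
      loadings = Vt.T * s / np.sqrt(len(Z) - 1)  # species loadings on each PC
      scores = U * s                             # sample scores

      print("variance explained:", np.round(explained[:3], 2))
      # High loadings of the THM species on one component would point to a common
      # (chlorination) source; spatial patterns in the scores help locate it.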

  3. Novel technique for addressing streak artifact in gated dual-source MDCT angiography utilizing ECG-editing

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Laura T.; Boll, Daniel T. [Duke University Medical Center, Department of Radiology, Box 3808, Durham, NC (United States)

    2008-11-15

    Streak artifact is an important source of image degradation in computed tomographic imaging. In coronary MDCT angiography, streak artifact from pacemaker leads in the SVC can render segments of the right coronary artery uninterpretable. With current technology in clinical practice, there is no effective way to eliminate streak artifact in coronary MDCT angiography entirely. We propose a technique to minimize the impact of streak artifact in retrospectively gated coronary MDCT angiography by utilizing small shifts in the reconstruction window. In our experience, previously degraded portions of the coronary vasculature were able to be well evaluated using this technique. (orig.)

  4. Use of the Drawing-Writing Technique to Determine the Level of Knowledge of Pre-Service Teachers Regarding Renewable Energy Sources

    Science.gov (United States)

    Kara, Filiz

    2015-01-01

    The aim of this study was to determine the level of knowledge of pre-service science teachers in Turkey regarding the different types of renewable energy sources, the methods used for obtaining energy from these sources, and the areas of use for these energy sources. Within the context of the study, the drawing-writing technique was used in order…

  5. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in analysis, preservation, and documentation of art works and artifacts are described with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included

  6. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.

  7. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

    Full Text Available For the problem of separating and reconstructing source signals from observed signals, the physical significance of the blind source separation model and independent component analysis is not very clear, and its solution is not unique. To address these disadvantages, a new linear instantaneous mixing model and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA) are put forward. The assumption of this new model is that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concepts of the linear separation matrix and uncorrelatedness of source signals. Based on this theoretical link, the problem of source signal separation and reconstruction is then changed into PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, both waveform and amplitude information of an uncorrelated source signal can be separated and reconstructed by PCA when the linear mixing matrix is column-orthogonal and normalized; only waveform information can be separated and reconstructed when the mixing matrix is column-orthogonal but not normalized; and uncorrelated source signals cannot be separated and reconstructed by PCA when the mixing matrix is not column-orthogonal or not linear.
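
    The paper's favourable case (column-orthogonal, normalized mixing of uncorrelated sources) can be reproduced in a few lines of numpy; the sketch below mixes two toy sources with a rotation matrix and recovers them, up to sign and ordering, as the principal components of the observations.

      import numpy as np

      rng = np.random.default_rng(1)
      t = np.linspace(0, 1, 2000)
      # Two zero-mean, (nearly) uncorrelated source signals with distinct variances
      S = np.vstack([np.sin(2 * np.pi * 5 * t),
                     np.sign(np.sin(2 * np.pi * 11 * t))])

      # Column-orthogonal, normalized mixing matrix (a rotation)
      theta = 0.6
      A = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
      X = A @ S + 0.01 * rng.standard_normal((2, t.size))   # observed mixtures + noise

      # PCA of the observations: eigenvectors of the sample covariance matrix
      Xc = X - X.mean(axis=1, keepdims=True)
      w, V = np.linalg.eigh(Xc @ Xc.T / t.size)
      S_hat = V.T @ Xc                       # principal components = recovered sources

      # Recovery check: correlation with the true sources (up to sign/permutation)
      C = np.corrcoef(np.vstack([S, S_hat]))[:2, 2:]
      print(np.round(np.abs(C), 2))          # near-identity or permuted pattern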

  8. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed in each building system against each criterion. The defect severity score of each building system was identified, then multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning; however, the results of this study indicate that it could also be used to prioritize building systems for maintenance planning.
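
    A minimal sketch of the weighted-scoring step described above; the criteria weights and severity scores are hypothetical stand-ins, not the paper's data.

      import numpy as np

      criteria = ["PC", "EA", "EO", "MC"]
      weights = np.array([0.40, 0.25, 0.20, 0.15])   # hypothetical AHP-style weights
      systems = ["electrical", "roof", "wall", "ceiling"]

      # Hypothetical defect-severity scores per system against each criterion (0..1)
      scores = np.array([
          [0.8, 0.7, 0.9, 0.6],   # electrical
          [0.5, 0.6, 0.4, 0.7],   # roof
          [0.4, 0.3, 0.3, 0.5],   # wall
          [0.2, 0.3, 0.2, 0.4],   # ceiling
      ])

      risk = scores @ weights                        # weighted criticality per system
      for name, r in sorted(zip(systems, risk), key=lambda p: -p[1]):
          print(f"{name:<10} {r:.3f}")               # descending maintenance priority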

  9. Characterization techniques for the high-brightness particle beams of the Advanced Photon Source (APS)

    International Nuclear Information System (INIS)

    Lumpkin, A.H.

    1993-01-01

    The Advanced Photon Source (APS) will be a third-generation synchrotron radiation (SR) user facility in the hard x-ray regime (10-100 keV). The design objectives for the 7-GeV storage ring include a positron beam natural emittance of 8 × 10^-9 m-rad at an average current of 100 mA. Proposed methods for measuring the transverse and longitudinal profiles will be described. Additionally, a research and development effort using an rf gun as a low-emittance source of electrons for injection into the 200- to 650-MeV linac subsystem is underway. This latter system is projected to produce electron beams with a normalized rms emittance of ~2π mm-mrad at peak currents of nearly one hundred amperes. This interesting characterization problem will also be briefly discussed. The combination of both source types within one laboratory facility will stimulate the development of diagnostic techniques in these parameter spaces

  10. Chemometric techniques in distribution, characterisation and source apportionment of polycyclic aromatic hydrocarbons (PAHS) in aquaculture sediments in Malaysia.

    Science.gov (United States)

    Retnam, Ananthy; Zakaria, Mohamad Pauzi; Juahir, Hafizan; Aris, Ahmad Zaharin; Zali, Munirah Abdul; Kasim, Mohd Fadhil

    2013-04-15

    This study investigated polycyclic aromatic hydrocarbons (PAHs) pollution in surface sediments within aquaculture areas in Peninsular Malaysia using chemometric techniques, forensics and univariate methods. The samples were analysed using soxhlet extraction, silica gel column clean-up and gas chromatography mass spectrometry. The total PAH concentrations ranged from 20 to 1841 ng/g with a mean of 363 ng/g dw. The application of chemometric techniques enabled clustering and discrimination of the aquaculture sediments into four groups according to the contamination levels. A combination of chemometric and molecular indices was used to identify the sources of PAHs, which could be attributed to vehicle emissions, oil combustion and biomass combustion. Source apportionment using absolute principal component scores-multiple linear regression showed that the main sources of PAHs are vehicle emissions (54%), oil combustion (37%) and biomass combustion (9%). Land-based pollution from vehicle emissions is the predominant contributor of PAHs in the aquaculture sediments of Peninsular Malaysia. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  12. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head remains freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  13. Fissile mass estimation by pulsed neutron source interrogation

    Energy Technology Data Exchange (ETDEWEB)

    Israelashvili, I., E-mail: israelashvili@gmail.com [Nuclear Research Center of the Negev, P.O.B 9001, Beer Sheva 84190 (Israel); Dubi, C.; Ettedgui, H.; Ocherashvili, A. [Nuclear Research Center of the Negev, P.O.B 9001, Beer Sheva 84190 (Israel); Pedersen, B. [Nuclear Security Unit, Institute for Transuranium Elements, Joint Research Centre, Via E. Fermi, 2749, 21027 Ispra (Italy); Beck, A. [Nuclear Research Center of the Negev, P.O.B 9001, Beer Sheva 84190 (Israel); Roesgen, E.; Crochmore, J.M. [Nuclear Security Unit, Institute for Transuranium Elements, Joint Research Centre, Via E. Fermi, 2749, 21027 Ispra (Italy); Ridnik, T.; Yaar, I. [Nuclear Research Center of the Negev, P.O.B 9001, Beer Sheva 84190 (Israel)

    2015-06-11

    Passive methods for detecting correlated neutrons from spontaneous fissions (e.g. multiplicity and SVM) are widely used for fissile mass estimation. These methods can be used for fissile materials that emit a significant amount of fission neutrons (like plutonium). Active interrogation, in which fissions are induced in the tested material by an external continuous source or by a pulsed neutron source, has the potential advantages of fast measurement and independence from the spontaneous fissions of the tested fissile material, thus enabling uranium measurement. Until recently, using the multiplicity method for uranium mass estimation was possible only for active interrogation with a continuous neutron source. Pulsed active neutron interrogation measurements were analyzed with techniques, e.g. differential die-away analysis (DDA), which ignore or only implicitly include the multiplicity effect (self-induced fission chains). Recently, both the multiplicity and the SVM techniques were theoretically extended for analyzing active fissile mass measurements made with a pulsed neutron source. In this study the SVM technique for a pulsed neutron source is experimentally examined for the first time. The measurements were conducted at the PUNITA facility of the Joint Research Centre in Ispra, Italy. First promising results of mass estimation by the SVM technique using a pulsed neutron source are presented.

  14. Polarisation modulated crosscorrelation spectroscopy on a pulsed neutron source

    International Nuclear Information System (INIS)

    Cywinski, R.; Williams, W.G.

    1984-07-01

    A crosscorrelation technique is introduced by which a total scattering polarisation analysis spectrometer on a pulsed neutron source can be modified to give full neutron polarisation and energy analysis without changing the physical configuration of the instrument. Its implementation on the proposed POLARIS spectrometer at the Rutherford Appleton Laboratory Spallation Neutron Source is described, and the expected dynamic (Q, ω) range and resolution evaluated. (author)

  15. Nuclear techniques and cross-correlation methods for spectral analysis in two-phase flow measurements in mineral pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Brandao, Luis E.B.; Salgado, Cesar M., E-mail: brandaos@ien.gov.br, E-mail: otero@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Divisao de Radiofarmacos; Sicilliano, Umberto C.C.S., E-mail: umberto.cassara@poli.ufrj.br [Universidade Federal do Rio de Janeiro (UFRJ), Rio de Janeiro, RJ (Brazil). Dept. de Metalurgia

    2013-07-01

    In the mineral industry it is common to use water to transport pellets inside pipes. In these units, correct measurement of the flow of both solid and liquid phases is important to guarantee safe operation. Cross-correlation flow meters are devices especially suited to dual-phase flow; they are based on measuring the transit time of disturbances registered between two points, in our case gamma attenuation from radioactive sources. The emphasis of this work is the application of gamma transmission and scattering techniques, associated with spectral analysis methods, to measure the flow of the solid phase in a liquid inside the pipe. The detectors and the sources are outside the tube and are positioned 10.0 cm from one another. The photons of transmitted/scattered gamma radiation were registered, a cross-correlation method was applied to measure the flow, and spectral analysis was used to study the flow profile inside the pipe. (author)
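
    The transit-time idea behind a cross-correlation flow meter can be sketched as follows: two synthetic detector signals separated by the 10.0 cm spacing are cross-correlated, and the lag of the correlation peak gives the transit time and hence the velocity. The signal model and flow speed are hypothetical.

      import numpy as np

      fs, L = 1000.0, 0.10                 # sampling rate (Hz), detector spacing (m)
      rng = np.random.default_rng(2)

      # Synthetic upstream signal: random density fluctuations seen by detector 1
      x = rng.standard_normal(5000)
      true_delay = 0.025                   # s  -> 4 m/s flow (hypothetical)
      shift = int(true_delay * fs)
      # Detector 2 sees a delayed, noisier copy (circular wrap ignored here)
      y = np.roll(x, shift) + 0.3 * rng.standard_normal(x.size)

      # Cross-correlation; the lag of its maximum estimates the transit time
      corr = np.correlate(y - y.mean(), x - x.mean(), mode="full")
      lag = np.argmax(corr) - (x.size - 1)
      tau = lag / fs
      print(f"transit time {tau*1e3:.1f} ms -> velocity {L/tau:.2f} m/s")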

  16. Comparative analysis of traditional and alternative energy sources

    Directory of Open Access Journals (Sweden)

    Adriana Csikósová

    2008-11-01

    Full Text Available The presented thesis with designation of Comparing analysis of traditional and alternative energy resources includes, on basisof theoretical information source, research in firm, internal data, trends in company development and market, descriptionof the problem and its application. Theoretical information source is dedicated to the traditional and alternative energy resources,reserves of it, trends in using and development, the balance of it in the world, EU and in Slovakia as well. Analysis of the thesisis reflecting profile of the company and the thermal pump market evaluation using General Electric method. While the companyis implementing, except other products, the thermal pumps on geothermal energy base and surround energy base (air, the missionof the comparing analysis is to compare traditional energy resources with thermal pump from the ecological, utility and economic sideof it. The results of the comparing analysis are resumed in to the SWOT analysis. The part of the thesis includes he questionnaire offerfor effectiveness improvement and customer satisfaction analysis, and expected possibilities of alternative energy resources assistance(benefits from the government and EU funds.

  17. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species and decoupled direct method (DDM). A simple theoretical example to compare these approaches is used highlighting differences and potential implications for policy. When the relationships between concentration and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
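
    A toy numeric example of the paper's central point, assuming a deliberately nonlinear (interaction-term) concentration-emission relationship: brute-force impacts no longer sum to the total concentration, so they cannot be read as source contributions.

      # Hypothetical nonlinear chemistry: an interaction term between two sources
      def conc(e1, e2):
          return 2.0 * e1 + 1.0 * e2 + 0.8 * e1 * e2

      e1, e2 = 1.0, 1.0
      c_base = conc(e1, e2)

      # Brute-force impacts: zero out one source at a time (sensitivity analysis)
      impact1 = c_base - conc(0.0, e2)
      impact2 = c_base - conc(e1, 0.0)

      print(f"total {c_base:.1f} vs summed impacts {impact1 + impact2:.1f}")
      # 3.8 vs 4.6: the interaction term is double-counted, so sensitivity results
      # cannot be read as source contributions when the chemistry is nonlinear.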

  18. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  19. Source-system windowing for speech analysis

    NARCIS (Netherlands)

    Yegnanarayana, B.; Satyanarayana Murthy, P.; Eggen, J.H.

    1993-01-01

    In this paper we propose a speech-analysis method to bring out characteristics of the vocal tract system in short segments which are much less than a pitch period. The method performs windowing in the source and system components of the speech signal and recombines them to obtain a signal reflecting

  20. Detecting people of interest from internet data sources

    Science.gov (United States)

    Cardillo, Raymond A.; Salerno, John J.

    2006-04-01

    In previous papers, we have documented success in determining the key people of interest from a large corpus of real-world evidence. Our recent efforts focus on exploring additional domains and data sources. Internet data sources such as email, web pages, and news feeds make it easier to gather a large corpus of documents for various domains, but detecting people of interest in these sources introduces new challenges. Analyzing these massive sources magnifies entity resolution problems, and demands a storage management strategy that supports efficient algorithmic analysis and visualization techniques. This paper discusses the techniques we used in order to analyze the ENRON email repository, which are also applicable to analyzing web pages returned from our "Buddy" meta-search engine.

  1. Identification of Sparse Audio Tampering Using Distributed Source Coding and Compressive Sensing Techniques

    Directory of Open Access Journals (Sweden)

    Valenzise G

    2009-01-01

    Full Text Available In the past few years, a large number of techniques have been proposed to identify whether a multimedia content has been illegally tampered with or not. Nevertheless, very few efforts have been devoted to identifying which kind of attack has been carried out, especially due to the large amount of data required for this task. We propose a novel hashing scheme which exploits the paradigms of compressive sensing and distributed source coding to generate a compact hash signature, and we apply it to the case of audio content protection. The audio content provider produces a small hash signature by computing a limited number of random projections of a perceptual, time-frequency representation of the original audio stream; the audio hash is given by the syndrome bits of an LDPC code applied to the projections. At the content user side, the hash is decoded using distributed source coding tools. If the tampering is sparsifiable or compressible in some orthonormal basis or redundant dictionary, it is possible to identify the time-frequency position of the attack, with a hash size as small as 200 bits/second; the bit saving obtained by introducing distributed source coding ranges between 20% and 70%.
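
    The random-projection half of the scheme can be sketched as below; the perceptual representation, the LDPC syndrome coding and the distributed-source decoding are omitted, and all sizes and thresholds are hypothetical.

      import numpy as np

      rng = np.random.default_rng(3)

      def hash_bits(spec, P):
          """1-bit quantized random projections of a time-frequency matrix."""
          return (P @ spec.ravel() >= 0.0).astype(np.uint8)

      # Hypothetical time-frequency representation (freq x time)
      original = rng.standard_normal((32, 64))
      received = original.copy()
      received[8:12, 40:48] += 3.0           # sparse, localized tampering

      P = rng.standard_normal((200, original.size))   # shared projection matrix
      h_src = hash_bits(original, P)
      h_usr = hash_bits(received, P)

      dist = np.count_nonzero(h_src != h_usr)
      print(f"Hamming distance {dist}/200")  # near zero if untouched, larger if tampered
      # In the paper the projections are syndrome-coded with an LDPC code and decoded
      # with distributed source coding; a sparse reconstruction over the disagreeing
      # projections then localizes the attack in time-frequency.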

  2. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade and now predictions from this type analysis are important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from the routine hardware-oriented fault tree construction, as well as eliminates logic errors and errors of oversight in this part of the analysis. Automated analysis then affords the analyst a powerful tool to allow his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system

  3. Nonlinear analysis techniques of block masonry walls in nuclear power plants

    International Nuclear Information System (INIS)

    Hamid, A.A.; Harris, H.G.

    1986-01-01

    Concrete masonry walls have been used extensively in nuclear power plants as non-load bearing partitions serving as pipe supports, fire walls, radiation shielding barriers, and similar heavy construction separations. When subjected to earthquake loads, these walls should maintain their structural integrity. However, some of the walls do not meet design requirements based on working stress allowables. Consequently, utilities have used non-linear analysis techniques, such as the arching theory and the energy balance technique, to qualify such walls. This paper presents a critical review of the applicability of non-linear analysis techniques for both unreinforced and reinforced block masonry walls under seismic loading. These techniques are critically assessed in light of the performance of walls from limited available test data. It is concluded that additional test data are needed to justify the use of nonlinear analysis techniques to qualify block walls in nuclear power plants. (orig.)

  4. Authentication techniques for smart cards

    International Nuclear Information System (INIS)

    Nelson, R.A.

    1994-02-01

    Smart card systems are most cost efficient when implemented as a distributed system, which is a system without central host interaction or a local database of card numbers for verifying transaction approval. A distributed system, as such, presents special card and user authentication problems. Fortunately, smart cards offer processing capabilities that provide solutions to authentication problems, provided the system is designed with proper data integrity measures. Smart card systems maintain data integrity through a security design that controls data sources and limits data changes. A good security design is usually a result of a system analysis that provides a thorough understanding of the application needs. Once designers understand the application, they may specify authentication techniques that mitigate the risk of system compromise or failure. Current authentication techniques include cryptography, passwords, challenge/response protocols, and biometrics. The security design includes these techniques to help prevent counterfeit cards, unauthorized use, or information compromise. This paper discusses card authentication and user identity techniques that enhance security for microprocessor card systems. It also describes the analysis process used for determining proper authentication techniques for a system

  5. Nuclear techniques in marine metal exploration

    International Nuclear Information System (INIS)

    Michaelis, W.

    1979-01-01

    The growing concern about the future availability of raw materials has increasingly drawn attention to the extensive marine metalliferous mineral deposits. Nuclear techniques can provide powerful analytical tools for exploring these resources. The measurement of natural gamma radiation, X-ray fluorescence analysis and a variety of neutron techniques based on 252 Cf, (α,n) and (d,n) sources are now in use or appear to make progress. Improvement of the relevant cross sections could considerably advance the technical development both in the field and in the laboratory. Particular consideration should be given to a number of energy-dependent cross sections pertaining to neutron and gamma transport in field application of activation analysis or radiative capture, to neutron cross sections for production of gamma rays from inelastic collisions, to cross sections of threshold reactions which either ensure elemental selectivity or are the source of elemental interferences and, finally, to cross sections for quasi-prompt activation with 14 MeV neutrons. (orig.)

  6. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this end the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources on import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, inter alia: a) search through a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) selection of items of interest to specific verifications; and c) mapping of these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating a ''Nuclear Security Media Monitor'' (NSMM), a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining the IAEA's process of open source information monitoring. The first part of the paper will recall the trade data sources relevant for non-proliferation and will then illustrate the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. The second part will present the main aspects of the NSMM, also by illustrating some of the uses made at JRC. (author)

  7. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  8. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  9. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse circumstances of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries

  10. Magnetic separation techniques in sample preparation for biological analysis: a review.

    Science.gov (United States)

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all the analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with advantages of superparamagnetic property, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of protein, nucleic acid, cell, bioactive compound and immobilization of enzyme were described. Finally, the existed problems and possible trends of magnetic separation techniques for biological analysis in the future were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Locating non-volcanic tremor along the San Andreas Fault using a multiple array source imaging technique

    Science.gov (United States)

    Ryberg, T.; Haberland, C.H.; Fuis, G.S.; Ellsworth, W.L.; Shelly, D.R.

    2010-01-01

    Non-volcanic tremor (NVT) has been observed at several subduction zones and at the San Andreas Fault (SAF). Tremor locations are commonly derived by cross-correlating envelope-transformed seismic traces in combination with source-scanning techniques. Recently, they have also been located by using relative relocations with master events, that is, low-frequency earthquakes that are part of the tremor; locations are derived by conventional traveltime-based methods. Here we present a method to locate the sources of NVT using an imaging approach for multiple array data. The performance of the method is checked with synthetic tests and the relocation of earthquakes. We also applied the method to tremor occurring near Cholame, California. A set of small-aperture arrays (i.e. an array consisting of arrays) installed around Cholame provided the data set for this study. We observed several tremor episodes and located tremor sources in the vicinity of the SAF. During individual tremor episodes, we observed a systematic change of source location, indicating rapid migration of the tremor source along the SAF. © 2010 The Authors, Geophysical Journal International © 2010 RAS.
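
    A schematic back-projection/stacking sketch of the multiple-array imaging idea: envelope-like traces are shifted by predicted traveltimes over a grid of candidate source positions and stacked. The geometry, velocity and noise model are hypothetical and far simpler than the actual method.

      import numpy as np

      rng = np.random.default_rng(4)
      fs, v = 100.0, 3.5            # sampling rate (Hz), S-wave speed (km/s), hypothetical

      stations = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)   # km
      true_src = np.array([6.0, 3.0])

      # Synthetic envelope traces: a common tremor burst delayed by traveltime
      n = 2000
      burst = np.exp(-0.5 * ((np.arange(n) / fs - 5.0) / 0.5) ** 2)
      traces = []
      for st in stations:
          dt = np.linalg.norm(st - true_src) / v
          traces.append(np.roll(burst, int(dt * fs)) + 0.05 * rng.random(n))
      traces = np.array(traces)

      # Grid search: shift each trace back by its predicted traveltime and stack
      xs = ys = np.arange(0.0, 10.5, 0.5)
      best, best_val = None, -np.inf
      for x in xs:
          for y in ys:
              stack = np.zeros(n)
              for st, tr in zip(stations, traces):
                  dt = np.linalg.norm(st - np.array([x, y])) / v
                  stack += np.roll(tr, -int(dt * fs))
              if stack.max() > best_val:
                  best, best_val = (x, y), stack.max()
      print("located tremor near", best)    # close to (6.0, 3.0)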

  12. Antioxidants: Characterization, natural sources, extraction and analysis.

    Science.gov (United States)

    Oroian, Mircea; Escriche, Isabel

    2015-08-01

    Recently many review papers regarding antioxidants from different sources and different extraction and quantification procedures have been published. However none of them has all the information regarding antioxidants (chemistry, sources, extraction and quantification). This article tries to take a different perspective on antioxidants for the new researcher involved in this field. Antioxidants from fruit, vegetables and beverages play an important role in human health, for example preventing cancer and cardiovascular diseases, and lowering the incidence of different diseases. In this paper the main classes of antioxidants are presented: vitamins, carotenoids and polyphenols. Recently, many analytical methodologies involving diverse instrumental techniques have been developed for the extraction, separation, identification and quantification of these compounds. Antioxidants have been quantified by different researchers using one or more of these methods: in vivo, in vitro, electrochemical, chemiluminescent, electron spin resonance, chromatography, capillary electrophoresis, nuclear magnetic resonance, near infrared spectroscopy and mass spectrometry methods. Copyright © 2015. Published by Elsevier Ltd.

  13. Performance analysis of clustering techniques over microarray data: A case study

    Science.gov (United States)

    Dash, Rasmita; Misra, Bijan Bihari

    2018-03-01

    Handling big data is one of the major issues in the field of statistical data analysis. In such investigations cluster analysis plays a vital role in dealing with large-scale data. There are many clustering techniques with different cluster analysis approaches, but which approach suits a particular dataset is difficult to predict. To deal with this problem a grading approach is introduced over many clustering techniques to identify a stable technique. But the grading approach depends on the characteristics of the dataset as well as on the validity indices, so a two-stage grading approach is implemented. In this study the grading approach is implemented over five clustering techniques: hybrid swarm based clustering (HSC), k-means, partitioning around medoids (PAM), vector quantization (VQ) and agglomerative nesting (AGNES). The experimentation is conducted over five microarray datasets with seven validity indices. The finding of the grading approach that a clustering technique is significant is also established by the Nemenyi post-hoc hypothesis test.
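
    A minimal sketch of the grading idea for a single dataset and a single validity index, ranking two off-the-shelf techniques by silhouette score with scikit-learn; the paper's own methods (HSC, VQ, etc.), its microarray datasets and its seven indices are not reproduced here.

      import numpy as np
      from sklearn.cluster import AgglomerativeClustering, KMeans
      from sklearn.datasets import make_blobs
      from sklearn.metrics import silhouette_score

      # Synthetic stand-in for a microarray dataset
      X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

      techniques = {
          "k-means": KMeans(n_clusters=4, n_init=10, random_state=0),
          "AGNES": AgglomerativeClustering(n_clusters=4),
      }
      for name, model in techniques.items():
          labels = model.fit_predict(X)
          print(f"{name:8s} silhouette = {silhouette_score(X, labels):.3f}")
      # The grading approach repeats this over several datasets and several validity
      # indices, ranks the techniques, and tests significance (Nemenyi post-hoc).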

  14. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  15. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  16. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  17. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting in 1989, a new technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over the Fourier transform for signal analysis. A review of the use of this technique across different fields of elemental analysis is presented
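
    A minimal wavelet-denoising sketch on a synthetic gamma-ray spectrum, with a one-level Haar transform written directly in numpy so that no assumption about a wavelet library's API is needed; the threshold rule is a generic MAD-based choice, not one taken from any reviewed paper.

      import numpy as np

      def haar_fwd(x):
          s = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (smooth) coefficients
          d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
          return s, d

      def haar_inv(s, d):
          x = np.empty(2 * s.size)
          x[0::2] = (s + d) / np.sqrt(2)
          x[1::2] = (s - d) / np.sqrt(2)
          return x

      rng = np.random.default_rng(5)
      ch = np.arange(1024)
      peak = 400.0 * np.exp(-0.5 * ((ch - 512) / 8.0) ** 2)     # photopeak
      spectrum = rng.poisson(peak + 20.0).astype(float)         # counts + flat background

      s, d = haar_fwd(spectrum)
      sigma = np.median(np.abs(d)) / 0.6745                     # MAD noise estimate
      d[np.abs(d) < 3.0 * sigma] = 0.0                          # hard-threshold details
      denoised = haar_inv(s, d)
      print("residual rms:", np.std(denoised - peak - 20.0))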

  18. Analysis technique for controlling system wavefront error with active/adaptive optics

    Science.gov (United States)

    Genberg, Victor L.; Michels, Gregory J.

    2017-08-01

    The ultimate goal of an active mirror system is to control system level wavefront error (WFE). In the past, the use of this technique was limited by the difficulty of obtaining a linear optics model. In this paper, an automated method for controlling system level WFE using a linear optics model is presented. An error estimate is included in the analysis output for both surface error disturbance fitting and actuator influence function fitting. To control adaptive optics, the technique has been extended to write system WFE in state space matrix form. The technique is demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.
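
    The core step of controlling system WFE with a linear optics model is an influence-function least-squares fit; the sketch below, with a random stand-in for the actuator influence matrix, fits actuator commands to a measured wavefront and reports the residual (correctability) floor. All sizes are hypothetical.

      import numpy as np

      rng = np.random.default_rng(6)
      n_pts, n_act = 500, 12                  # wavefront samples, actuators (hypothetical)

      A = rng.standard_normal((n_pts, n_act)) # actuator influence functions
      # Measured WFE: an actuator-correctable part plus an uncorrectable residual
      w = A @ rng.standard_normal(n_act) + 0.01 * rng.standard_normal(n_pts)

      # Least-squares fit of actuator commands to the disturbance; commanding -x
      # removes the fitted part, and the fit residual is the correctability floor
      x, res, rank, _ = np.linalg.lstsq(A, w, rcond=None)
      corrected = w - A @ x
      print("rms before:", np.std(w), "rms after:", np.std(corrected))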

  19. MEG source imaging method using fast L1 minimum-norm and its applications to signals with brain noise and human resting-state source amplitude images.

    Science.gov (United States)

    Huang, Ming-Xiong; Huang, Charles W; Robb, Ashley; Angeles, AnneMarie; Nichols, Sharon L; Baker, Dewleen G; Song, Tao; Harrington, Deborah L; Theilmann, Rebecca J; Srinivasan, Ramesh; Heister, David; Diwakar, Mithun; Canive, Jose M; Edgar, J Christopher; Chen, Yu-Han; Ji, Zhengwei; Shen, Max; El-Gabalawy, Fady; Levy, Michael; McLay, Robert; Webb-Murphy, Jennifer; Liu, Thomas T; Drake, Angela; Lee, Roland R

    2014-01-01

    The present study developed a fast MEG source imaging technique based on Fast Vector-based Spatio-Temporal Analysis using an L1-minimum-norm (Fast-VESTAL) and then used the method to obtain the source amplitude images of resting-state magnetoencephalography (MEG) signals for different frequency bands. The Fast-VESTAL technique consists of two steps. First, L1-minimum-norm MEG source images were obtained for the dominant spatial modes of the sensor-waveform covariance matrix. Next, accurate source time-courses with millisecond temporal resolution were obtained using an inverse operator constructed from the spatial source images of Step 1. Using simulations, Fast-VESTAL's performance was assessed for its 1) ability to localize multiple correlated sources; 2) ability to faithfully recover source time-courses; 3) robustness to different SNR conditions including SNR with negative dB levels; 4) capability to handle correlated brain noise; and 5) statistical maps of MEG source images. An objective pre-whitening method was also developed and integrated with Fast-VESTAL to remove correlated brain noise. Fast-VESTAL's performance was then examined in the analysis of human median-nerve MEG responses. The results demonstrated that this method easily distinguished sources in the entire somatosensory network. Next, Fast-VESTAL was applied to obtain the first whole-head MEG source-amplitude images from resting-state signals in 41 healthy control subjects, for all standard frequency bands. Comparisons between resting-state MEG source images and known neurophysiology were provided. Additionally, in simulations and cases with MEG human responses, the results obtained from using the conventional beamformer technique were compared with those from Fast-VESTAL, which highlighted the beamformer's problems of signal leakage and distorted source time-courses. © 2013.
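
    Fast-VESTAL itself is not reproduced here; as a generic illustration of the L1-minimum-norm idea it builds on, the sketch below runs ISTA on a toy column-normalized leadfield and recovers two focal sources. All sizes and the regularization weight are assumptions.

      import numpy as np

      rng = np.random.default_rng(7)
      n_sensors, n_sources = 64, 500
      L = rng.standard_normal((n_sensors, n_sources))    # toy leadfield matrix
      L /= np.linalg.norm(L, axis=0)                     # column-normalized

      s_true = np.zeros(n_sources)
      s_true[[50, 300]] = [1.0, -0.8]                    # two focal sources
      b = L @ s_true + 0.01 * rng.standard_normal(n_sensors)   # sensor data

      # ISTA for min 0.5*||L s - b||^2 + lam*||s||_1
      lam = 0.05
      step = 1.0 / np.linalg.norm(L, 2) ** 2             # 1/(spectral norm)^2
      s = np.zeros(n_sources)
      for _ in range(500):
          g = s - step * (L.T @ (L @ s - b))             # gradient step
          s = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft threshold

      print("recovered support:", np.nonzero(np.abs(s) > 0.1)[0])  # ~ [50, 300]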

  20. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, poles and zeros identification. The preliminary general scheme of digital MCA is discussed, as well as some other important techniques about its engineering design. All these lay the foundation of developing homemade digital nuclear spectrometers. (authors)
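
    One of the key algorithms mentioned, trapezoidal shaping, has a standard recursive form (in the style of Jordanov and Knoll); below is a sketch applied to a synthetic exponential-decay pulse, with all pulse parameters hypothetical.

      import numpy as np

      def trapezoidal(v, k, m, M):
          """Recursive trapezoidal shaper.
          k: rise length, m: flat-top length, M: decay-compensation factor."""
          l = k + m
          n = v.size
          vp = np.concatenate([np.zeros(k + l), v])      # zero-pad for lagged terms
          d = vp[k + l:] - vp[l:l + n] - vp[k:k + n] + vp[:n]
          p = np.cumsum(d)
          s = np.cumsum(p + M * d)
          return s / (M * k)        # normalize so the flat top ~= input amplitude

      tau = 50.0                                    # decay constant in samples
      M = 1.0 / np.expm1(1.0 / tau)                 # compensation factor for that decay
      t = np.arange(1024, dtype=float)
      pulse = np.zeros(1024)
      pulse[100:] = np.exp(-(t[100:] - 100.0) / tau)   # step with exponential decay

      out = trapezoidal(pulse, k=20, m=10, M=M)
      print("flat-top height ~", round(out.max(), 3))  # ~1.0: amplitude recovered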

  1. Forensic analysis of explosives using isotope ratio mass spectrometry (IRMS)--discrimination of ammonium nitrate sources.

    Science.gov (United States)

    Benson, Sarah J; Lennard, Christopher J; Maynard, Philip; Hill, David M; Andrew, Anita S; Roux, Claude

    2009-06-01

    An evaluation was undertaken to determine if isotope ratio mass spectrometry (IRMS) could assist in the investigation of complex forensic cases by providing a level of discrimination not achievable utilising traditional forensic techniques. The focus of the research was on ammonium nitrate (AN), a common oxidiser used in improvised explosive mixtures. The potential value of IRMS to attribute Australian AN samples to the manufacturing source was demonstrated through the development of a preliminary AN classification scheme based on nitrogen isotopes. Although the discrimination utilising nitrogen isotopes alone was limited and only relevant to samples from the three Australian manufacturers during the evaluated time period, the classification scheme has potential as an investigative aid. Combining oxygen and hydrogen stable isotope values permitted the differentiation of AN prills from three different Australian manufacturers. Samples from five different overseas sources could be differentiated utilising a combination of the nitrogen, oxygen and hydrogen isotope values. Limited differentiation between Australian and overseas prills was achieved for the samples analysed. The comparison of nitrogen isotope values from intact AN prill samples with those from post-blast AN prill residues highlighted that the nitrogen isotopic composition of the prills was not maintained post-blast; hence, limiting the technique to analysis of un-reacted explosive material.

  2. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation…

  3. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  4. Preconditioned conjugate gradient technique for the analysis of symmetric anisotropic structures

    Science.gov (United States)

    Noor, Ahmed K.; Peters, Jeanne M.

    1987-01-01

    An efficient preconditioned conjugate gradient (PCG) technique and a computational procedure are presented for the analysis of symmetric anisotropic structures. The technique is based on selecting the preconditioning matrix as the orthotropic part of the global stiffness matrix of the structure, with all the nonorthotropic terms set equal to zero. This particular choice of the preconditioning matrix results in reducing the size of the analysis model of the anisotropic structure to that of the corresponding orthotropic structure. The similarities between the proposed PCG technique and a reduction technique previously presented by the authors are identified and exploited to generate from the PCG technique direct measures for the sensitivity of the different response quantities to the nonorthotropic (anisotropic) material coefficients of the structure. The effectiveness of the PCG technique is demonstrated by means of a numerical example of an anisotropic cylindrical panel.
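
    A generic PCG sketch of the paper's idea: the preconditioner is the "orthotropic part" of the stiffness matrix, so the anisotropic system converges in a few iterations. The matrices here are random SPD stand-ins, not structural stiffness matrices.

      import numpy as np

      def pcg(K, f, M_inv, tol=1e-10, max_iter=200):
          """Preconditioned conjugate gradients for K u = f with preconditioner M."""
          u = np.zeros_like(f)
          r = f - K @ u
          z = M_inv(r)
          p = z.copy()
          rz = r @ z
          for it in range(max_iter):
              Kp = K @ p
              alpha = rz / (p @ Kp)
              u += alpha * p
              r -= alpha * Kp
              if np.linalg.norm(r) < tol * np.linalg.norm(f):
                  return u, it + 1
              z = M_inv(r)
              rz_new = r @ z
              p = z + (rz_new / rz) * p
              rz = rz_new
          return u, max_iter

      rng = np.random.default_rng(8)
      n = 200
      B = rng.standard_normal((n, n))
      K_ortho = B @ B.T + n * np.eye(n)      # stand-in for the orthotropic stiffness
      K = K_ortho + 0.1 * (B + B.T)          # small nonorthotropic (anisotropic) terms
      f = rng.standard_normal(n)

      # Preconditioner = orthotropic part, mirroring the paper's choice
      M_inv = lambda r: np.linalg.solve(K_ortho, r)
      u, iters = pcg(K, f, M_inv)
      print("converged in", iters, "iterations")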

  5. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  6. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses the TLD 700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the readings of the detectors; one of these was observed for doses below 0.5 mSv. In this work different techniques for analysing the TLD response were investigated and compared, involving dose values in this interval. These techniques include thermal pre-treatment, and different glow curve analysis methods were also investigated. The results obtained showed the necessity of developing specific software that permits automatic background subtraction from the glow curve of each dosemeter. This software was developed and has been tested. Preliminary results showed that the software increases the response reproducibility. (author)
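
    The record does not spell out the algorithm, but automatic background subtraction of the kind described can be illustrated as fitting a smooth baseline on assumed peak-free channels and removing it before integrating the peaks; the channel ranges and exponential baseline below are illustrative assumptions only.

        # Sketch: subtract a fitted baseline from a toy glow curve.
        import numpy as np

        channels = np.arange(200)
        # toy glow curve: one TL peak on an exponential (infrared) background
        glow = (50 * np.exp(-0.5 * ((channels - 120) / 12.0) ** 2)
                + 5 * np.exp(channels / 180.0))

        bg = np.r_[0:60, 180:200]                    # channels assumed peak-free
        coef = np.polyfit(channels[bg], np.log(glow[bg]), 1)
        baseline = np.exp(np.polyval(coef, channels))

        net = glow - baseline                        # background-corrected curve
        print("net TL signal:", net.sum())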

  7. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    Science.gov (United States)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. 18 invited speakers presented key topics on the universe in computer, Computing in Earth Sciences, multivariate data analysis, automated computation in Quantum Field Theory, as well as computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round table discussions on open-source, knowledge sharing and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS) and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all the activities of the workshop. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Science. Details of committees and sponsors are available in the PDF

  8. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  9. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on an optimization technique using the PNET (Probabilistic Network Evaluation Technique) method for highly redundant structures having a large number of collapse modes. The approach exploits the merits of the optimization technique while incorporating the idea of the PNET method. The analytical process involves minimizing the safety index of the representative mode, subject to satisfaction of the mechanism condition and of positive external work. The procedure entails the sequential solution of a series of NLP (Nonlinear Programming) problems, where the correlation condition of the PNET method pertaining to the representative mode is taken as an additional constraint in the next analysis. Upon succeeding iterations, the final analysis is reached when the collapse probability of the subsequent mode is much smaller than the value of the first mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, the conventional Monte Carlo simulation is also revised by using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)
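
    As a generic illustration of the safety-index minimization at the heart of such procedures (a first-order reliability sketch for a single toy collapse mode, not the paper's PNET formulation), the most probable failure point can be searched for in standard normal space:

        # Sketch: safety index as the minimum distance to the limit state g(u)=0.
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        def g(u):
            # toy limit state: resistance minus load effect, standard normal space
            return 6.0 - u[0] - 0.5 * u[1]

        res = minimize(lambda u: np.dot(u, u), x0=np.array([1.0, 1.0]),
                       constraints={"type": "eq", "fun": g})
        beta = np.sqrt(res.fun)                      # safety index of this mode
        print("beta = %.2f, P_f ~ %.1e" % (beta, norm.cdf(-beta)))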

  10. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D N; Prawer, S; Gonon, P; Walker, R; Dooley, S; Bettiol, A; Pearce, J [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1997-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  11. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  12. Intra-urban biomonitoring: Source apportionment using tree barks to identify air pollution sources.

    Science.gov (United States)

    Moreira, Tiana Carla Lopes; de Oliveira, Regiani Carvalho; Amato, Luís Fernando Lourenço; Kang, Choong-Min; Saldiva, Paulo Hilário Nascimento; Saiki, Mitiko

    2016-05-01

    It is of great interest to evaluate whether there is a relationship between possible sources and trace elements using biomonitoring techniques. In this study, tree bark samples from 171 trees were collected using a biomonitoring technique in the inner city of São Paulo. The trace elements (Al, Ba, Ca, Cl, Cu, Fe, K, Mg, Mn, Na, P, Rb, S, Sr and Zn) were determined by energy dispersive X-ray fluorescence (EDXRF) spectrometry. Principal Component Analysis (PCA) was applied to identify the plausible sources associated with the tree bark measurements. The greatest source was vehicle-induced non-tailpipe emissions, derived mainly from brake and tire wear and road dust resuspension (characterized by Al, Ba, Cu, Fe, Mn and Zn), which explained 27.1% of the variance, followed by cement (14.8%), sea salt (11.6%), biomass burning (10%), and fossil fuel combustion (9.8%). We also verified that the elements related to vehicular emission showed different concentrations at different sites of the same street, which might be helpful for a new street classification according to the emission source. Spatial distribution maps of element concentrations were obtained to evaluate the different levels of pollution in streets and avenues. Results indicated that biomonitoring techniques using tree bark can be applied to evaluate the dispersion of air pollution and provide reliable data for further epidemiological studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Analysis of the tuning characteristics of microwave plasma source

    International Nuclear Information System (INIS)

    Miotk, Robert; Jasiński, Mariusz; Mizeraczyk, Jerzy

    2016-01-01

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled estimation of the electron concentration n_e and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n_e and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are useless. The presented plasma source is currently used in research on hydrogen production from liquids.

  14. How Many Separable Sources? Model Selection In Independent Components Analysis

    Science.gov (United States)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988
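
    A toy sketch of the underlying issue (not the mixed ICA/PCA implementation itself): FastICA unmixes two non-Gaussian sources and one Gaussian source, and excess kurtosis flags the recovered component that higher-order methods cannot meaningfully separate from other Gaussians.

        # Sketch: unmix a toy mixture and screen components by excess kurtosis.
        import numpy as np
        from scipy.stats import kurtosis
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(42)
        n = 20000
        S = np.column_stack([rng.laplace(size=n),      # non-Gaussian source
                             rng.uniform(-1, 1, n),    # non-Gaussian source
                             rng.normal(size=n)])      # Gaussian source
        X = S @ rng.normal(size=(3, 3)).T              # observed mixtures

        S_hat = FastICA(n_components=3, random_state=0).fit_transform(X)
        # ~0 marks the Gaussian component; clearly nonzero marks separable ones
        print("excess kurtosis:", kurtosis(S_hat, axis=0))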

  15. MorphoTester: An Open Source Application for Morphological Topographic Analysis.

    Directory of Open Access Journals (Sweden)

    Julia M Winchester

    Full Text Available The increased prevalence and affordability of 3D scanning technology is beginning to have significant effects on the research questions and approaches available for studies of morphology. As the current trend of larger and more precise 3D datasets is unlikely to slow in the future, there is a need for efficient and capable tools for high-throughput quantitative analysis of biological shape. The promise and the challenge of implementing relatively automated methods for characterizing surface shape can be seen in the example of dental topographic analysis. Dental topographic analysis comprises a suite of techniques for quantifying tooth surfaces and component features. Topographic techniques have provided insight on mammalian molar form-function relationships, and these methods could be applied to address other topics and questions. At the same time, implementing multiple complementary topographic methods can have high time and labor costs, and comparability of data formats and approaches is difficult to predict. To address these challenges I present MorphoTester, an open source application for visualizing and quantifying topography from 3D triangulated polygon meshes. This application is Python-based and is free to use. MorphoTester implements three commonly used dental topographic metrics: Dirichlet normal energy, relief index, and orientation patch count rotated (OPCR). Previous OPCR algorithms have used raster-based grid data, which is not directly interchangeable with vector-based triangulated polygon meshes. A 3D-OPCR algorithm is provided here for quantifying complexity from polygon meshes. The efficacy of this metric is tested in a sample of mandibular second molars belonging to four species of cercopithecoid primates. Results suggest that 3D-OPCR is at least as effective for quantifying complexity as previous approaches, and may be more effective due to the finer resolution of surface data considered here. MorphoTester represents an advancement
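
    For intuition, the raster-based predecessor of 3D-OPCR can be sketched by binning downslope directions of a gridded surface into eight orientations and counting contiguous patches (an illustrative toy, not MorphoTester code):

        # Sketch: orientation patch count (OPC) on a toy raster surface.
        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(3)
        z = ndimage.gaussian_filter(rng.normal(size=(64, 64)), 4)  # toy surface

        dy, dx = np.gradient(z)
        aspect = np.arctan2(dy, dx)                          # downslope direction
        bins = ((aspect + np.pi) / (2 * np.pi) * 8).astype(int) % 8

        opc = 0
        for b in range(8):
            labels, num = ndimage.label(bins == b)
            sizes = ndimage.sum(bins == b, labels, range(1, num + 1))
            opc += int((sizes >= 3).sum())                   # ignore tiny patches
        print("orientation patch count:", opc)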

  16. Optimal Measurement Conditions for Spatiotemporal EEG/MEG Source Analysis.

    Science.gov (United States)

    Huizenga, Hilde M.; Heslenfeld, Dirk J.; Molenaar, Peter C. M.

    2002-01-01

    Developed a method to determine the required number and position of sensors for human brain electromagnetic source analysis. Studied the method through a simulation study and an empirical study on visual evoked potentials in one adult male. Results indicate the method is fast and reliable and improves source precision. (SLD)

  17. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e. the intervals between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data vary widely depending on characteristics such as plant condition, plant, etc. The existing motion analysis therefore has an accuracy issue when a single analysis technique is applied across all plant conditions, plants, etc. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
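
    A minimal sketch of the supervised-learning step (the features, data and labels below are invented placeholders, not MHI's): train and cross-validate a random forest on timing-related waveform features.

        # Sketch: random forest classification of armature events.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(7)
        n = 500
        X = rng.normal(size=(n, 3))   # e.g. current-dip depth, dip time, condition
        y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(int)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        print("CV accuracy: %.2f" % cross_val_score(clf, X, y, cv=5).mean())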

  18. Technique of sample preparation for analysis of gasoline and lubricating oils by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Avila P, P.

    1990-03-01

    The X-ray fluorescence laboratory of the National Institute of Nuclear Research, lacking a technique for the analysis of oils, set out in this work to develop a sample preparation technique for the analysis of the metals Pb, Cr, Ni, V and Mo in gasolines and lubricating oils by X-ray fluorescence spectrometry. The results obtained will be of great utility for that laboratory. (Author)

  19. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Directory of Open Access Journals (Sweden)

    Jianjun Tang

    2014-01-01

    Full Text Available Assembly precision optimization of complex products has a huge benefit in improving product quality. Due to the coupling of a variety of deviation sources, the goal of assembly precision optimization is difficult to determine accurately. In order to optimize assembly precision accurately and rapidly, sensitivity analysis of deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of assembly dimension variation to deviation source dimension variation. Second, according to assembly constraint relations, assembly sequences and locating, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created using the deviation transmission paths, and the corresponding scalar equations for each dimension are established. Then, assembly deviation source sensitivity is calculated using a first-order Taylor expansion and a matrix transformation method. Finally, taking the assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.
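
    The definition translates directly into a first-order estimate: perturb each deviation source and take the ratio of the assembly-dimension change to the perturbation. The assembly response function below is a made-up stand-in for the scalar equations derived from the vector loops.

        # Sketch: finite-difference deviation source sensitivities.
        import numpy as np

        def assembly_dimension(d):
            # toy closing dimension as a function of three source dimensions
            return d[0] + 0.8 * d[1] - 0.3 * d[2]

        d0 = np.array([10.0, 5.0, 2.0])        # nominal source dimensions
        h = 1e-6
        sens = np.array([(assembly_dimension(d0 + h * e)
                          - assembly_dimension(d0)) / h for e in np.eye(3)])
        print("sensitivities:", sens)          # ~ [1.0, 0.8, -0.3]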

  20. Investigation and Applications of In-Source Oxidation in Liquid Sampling-Atmospheric Pressure Afterglow Microplasma Ionization (LS-APAG) Source.

    Science.gov (United States)

    Xie, Xiaobo; Wang, Zhenpeng; Li, Yafeng; Zhan, Lingpeng; Nie, Zongxiu

    2017-06-01

    A liquid sampling-atmospheric pressure afterglow microplasma ionization (LS-APAG) source is presented for the first time, which embeds both electrospray ionization (ESI) and atmospheric pressure afterglow microplasma ionization (APAG) techniques. This ion source is capable of analyzing compounds with diverse molecular weights and polarities. An unseparated mixture sample was detected as a proof-of-concept, giving complementary information (both polar and nonpolar species) with the two ionization modes. It should also be noted that molecular mass can be quickly identified by ESI with clean and simple spectra, while the structure can be directly studied using APAG with in-source oxidation. The ionization/oxidation mechanism and applications of the LS-APAG source have been further explored in the analysis of nonpolar alkanes and unsaturated fatty acids/esters. A unique [M + O - 3H]+ ion was observed in the case of individual alkanes (C5-C19) and a complex hydrocarbon mixture under optimized conditions. Moreover, branched alkanes generated significant in-source fragments, which could be further applied to the discrimination of isomeric alkanes. The technique also facilitates facile determination of double bond positions in unsaturated fatty acids/esters due to diagnostic fragments (the acid/ester-containing aldehyde and acid oxidation products) generated by on-line ozonolysis in APAG mode. Finally, some examples of in situ APAG analysis by gas sampling and surface sampling are given as well.

  1. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pannek, Kerstin [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, School of Medicine, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); Guzzetta, Andrea [IRCCS Stella Maris, Department of Developmental Neuroscience, Calambrone Pisa (Italy); Colditz, Paul B. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Perinatal Research Centre, Brisbane (Australia); Rose, Stephen E. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); University of Queensland Centre for Clinical Research, Royal Brisbane and Women' s Hospital, Brisbane (Australia)

    2012-10-15

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. (orig.)

  2. Performance Analysis of the Microsoft Kinect Sensor for 2D Simultaneous Localization and Mapping (SLAM Techniques

    Directory of Open Access Journals (Sweden)

    Kamarulzaman Kamarudin

    2014-12-01

    Full Text Available This paper presents a performance analysis of two open-source, laser scanner-based Simultaneous Localization and Mapping (SLAM) techniques (i.e., Gmapping and Hector SLAM) using a Microsoft Kinect to replace the laser sensor. Furthermore, the paper proposes a new system integration approach whereby a Linux virtual machine is used to run the open source SLAM algorithms. The experiments were conducted in two different environments: a small room with no features and a typical office corridor with desks and chairs. Using the data logged from real-time experiments, each SLAM technique was simulated and tested with different parameter settings. The results show that the system is able to achieve real time SLAM operation. The system implementation offers a simple and reliable way to compare the performance of Windows-based SLAM algorithms with the algorithms typically implemented in a Robot Operating System (ROS). The results also indicate that certain modifications to the default laser scanner-based parameters are able to improve the map accuracy. However, the limited field of view and range of the Kinect's depth sensor often cause the map to be inaccurate, especially in featureless areas; therefore the Kinect sensor is not a direct replacement for a laser scanner, but rather offers a feasible alternative for 2D SLAM tasks.

  3. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
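
    A toy numeric sketch of the two basic measurements described, a normalized contrast evolution and a half-max width, with fabricated response curves standing in for flash thermography data:

        # Sketch: normalized contrast evolution and half-max width estimate.
        import numpy as np

        t = np.linspace(0.05, 5.0, 200)                  # time after flash, s
        sound = 1.0 / np.sqrt(t)                         # idealized sound-area cooling
        defect = sound * (1 + 0.3 * np.exp(-1.0 / t) * np.exp(-t / 2.0))
        contrast = (defect - sound) / sound              # normalized contrast
        print("peak contrast %.3f at t = %.2f s"
              % (contrast.max(), t[contrast.argmax()]))

        # half-max width across a line profile through the anomaly image
        x = np.linspace(-10, 10, 201)                    # mm
        profile = contrast.max() * np.exp(-0.5 * (x / 2.5) ** 2)
        above = x[profile >= profile.max() / 2]
        print("half-max width: %.1f mm" % (above[-1] - above[0]))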

  4. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening universities and national research centers to their local, national and international environments. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand, and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  5. Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies

    Science.gov (United States)

    2016-06-15

    Contract Source Selection: An Analysis of Lowest Price Technically Acceptable and Tradeoff Strategies. Acquisition Research Program Sponsored Report Series, 15 June 2016. LCDR Jamal M. Osman, USN. Cited references include: Lamoureux, J., Murrow, M., & Walls, C. (2015). Relationship of source selection methods to contract outcomes: an analysis.

  6. Quantitative analysis of occluded gases in uranium dioxide pellets by the mass spectrometry technique

    International Nuclear Information System (INIS)

    Vega Bustillos, J.O.W.; Rodrigues, C.; Iyer, S.S.

    1981-05-01

    A quantitative analysis of the different components of the occluded gases, except water, in uranium dioxide pellets is attempted here. A high temperature vacuum extraction system is employed for the liberation and the determination of the total volume of the occluded gases. A mass spectrometric technique is employed for the qualitative and quantitative analysis of these gases. The UO2 pellets are placed in a graphite crucible and subjected to varying temperatures (1000 °C - 1700 °C). The liberated gases are dehydrated and transferred to a measuring unit consisting essentially of a Toepler pump and a McLeod gauge. In this system the total volume of the gases liberated at N.T.P. is determined with a sensitivity of 0.002 cm3/g of UO2. An aliquot of the liberated gas is introduced into a quadrupole mass spectrometer (VGA-100, Varian Corp.) for the determination of the different components of the gas. On the basis of the analysis, suggestions are made for the possible sources of these gas components. (Author) [pt

  7. Rapid nuclear forensics analysis via laser based microphotonic techniques coupled with chemometrics

    International Nuclear Information System (INIS)

    Bhatta, B.; Kalambuka, H.A.; Dehayem-Kamadjeu, A.

    2017-01-01

    Nuclear forensics (NF) is an important tool for the analysis and attribution of nuclear and radiological materials (NRM) in support of nuclear security. The critical challenge in NF currently is the lack of suitable microanalytical methodologies for direct, rapid and minimally-invasive detection and quantification of NF signatures. Microphotonic techniques can achieve this task, particularly when the materials are of limited size and under concealed conditions. The purpose of this paper is to demonstrate the combined potential of chemometrics-enabled LIBS and laser Raman spectromicroscopy (LRS) for rapid NF analysis and attribution. Using LIBS, uranium lines at 385.464 nm, 385.957 nm and 386.592 nm were identified as NF signatures in uranium ore surrogates. A multivariate calibration strategy using an artificial neural network was developed for quantification of trace uranium. Principal component analysis (PCA) of the LIBS spectra achieved source attribution of the ores. LRS studies on UCl3, UO2(NO3)2·6H2O, UO2SO4·3H2O and UO3 in the pellet state identified the bands associated with the different uranium molecules as varying in the range of (840 to 867) ± 15 cm-1. Using this signature, we have demonstrated spectral imaging of uranium under concealed conditions (author)

  8. Operational analysis and comparative evaluation of embedded Z-Source inverters

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Gao, F.; Loh, P.C.

    2008-01-01

    This paper presents various embedded Z-source (EZ-source) inverters, broadly classified as shunt or parallel embedded Z-source inverters. Differing from the traditional Z-source inverter, EZ-source inverters are constructed by inserting dc sources into the X-shaped impedance network so that the dc input current flows smoothly during the whole switching period, unlike in the traditional Z-source inverter. This feature is interesting when PV panels or fuel cells power the load, since the continuous input current flow reduces the control complexity of the dc source and the system design burden... circuitry connected instead of the generic voltage source inverter (VSI) circuitry. Further proceeding to the topological variation, parallel embedded Z-source inverters are presented with a detailed analysis of the topological configuration and operational principles, showing that they are the superior...

  9. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  10. Development of a hydrogen analysis using a small neutron source

    International Nuclear Information System (INIS)

    Ishikawa, I.; Tachikawa, N.; Tominaga, H.

    1998-01-01

    Most industrial nuclear gauges are based on the use of radiation transmission through matter. This document presents new techniques to measure hydrogen using a small neutron source. A new technique has been developed for measuring the thickness of a thin layer of 30-200 μm thick plastic, which is sandwiched between two sheets of 0.6-4.2 mm in total thickness. Another technique allows monitoring of residual moisture in wet refractory newly coated on the inner surface of a steel vessel, from the outside, through a thick steel plate. To save coke and to strictly control furnace heating in the iron-making process, a new type of moisture gauge was developed using simultaneous measurement of the transmission rates of both fast neutrons and gamma rays from 252Cf.

  11. High order statistical signatures from source-driven measurements of subcritical fissile systems

    International Nuclear Information System (INIS)

    Mattingly, J.K.

    1998-01-01

    This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements
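
    One classical low-order member of this family of signatures is the Feynman excess variance-to-mean ratio of detector counts in fixed time gates, which is zero for a pure Poisson stream and grows with the correlated (chain) component; the counting process below is fabricated for illustration.

        # Sketch: Feynman-Y statistic on toy gated counts.
        import numpy as np

        rng = np.random.default_rng(11)
        gates = 100000
        bursts = rng.poisson(0.8, size=gates)          # correlated burst events
        counts = rng.poisson(1.0, size=gates) + 2 * bursts

        m = counts.mean()
        y = counts.var() / m - 1.0                     # 0 for pure Poisson counts
        print("mean %.2f, Feynman-Y %.2f" % (m, y))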

  12. Radioisotope sources for X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Leonowich, J.; Pandian, S.; Preiss, I.L.

    1977-01-01

    Problems involved in developing radioisotope sources and the characteristics of potentially useful radioisotopes for X-ray fluorescence analysis are presented. These include the following. The isotope must be evaluated for the physical and chemical forms available, purity, half-life, specific activity, toxicity, and cost. The radiation hazards of the source must be considered. The type and amount of radiation output of the source must be evaluated. The source construction must be planned. The source should also present an advance over those currently available in order to justify its development. Some of the isotopes which are not yet in use but look very promising are indicated, and their data are tabulated. A more or less 'perfect' source within a given range of interest would exhibit the following characteristics: (1) decay by an isomeric transition with little or no internal conversion; (2) have an intense gamma transition near the absorption edge of the element(s) of interest, with no high energy gammas; (3) have a sufficiently long half-life (of the order of years) for both economic and calibration reasons; (4) have a sufficiently large cross-section for production in a reasonable amount of time. If there are competing reactions, the interfering isotopes should be reasonably short-lived or, if not, easy to separate chemically from the isotope of interest. (T.G.)

  13. Fiscal 1998 development report on the high-accuracy quantitative analysis technique of catalyst surfaces by electron spectroscopy; 1998 nendo denshi bunkoho ni yoru shokubai hyomen koseido teiryo bunseki gijutsu no kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    This project aims at the development of a high-accuracy quantitative analysis technique based on electron spectroscopy for the surface analysis of catalysts and semiconductors. Since the conventional analysis technique using an energy-fixed X-ray excitation source is inadequate for obtaining satisfactory surface sensitivity and quantitative accuracy for catalysts, the project performs experiments using energy-variable synchrotron radiation to refine the parameters describing the motion of low-speed electrons in solids obtained by Monte Carlo calculation. For the establishment of a high-accuracy quantitative analysis technique for the surface composition of materials such as catalysts, whose performance is dominated by the outermost surface, the project studies the attenuation length of electrons in solids by electron spectroscopy using soft X-rays from synchrotron radiation. In this fiscal year, the project established the equipment and technique for high-accuracy quantitative analysis of the thickness and electron attenuation length of silicon oxide films on silicon wafers by electron spectroscopy. (NEDO)

  14. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)]

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co{sup III} porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  15. Topics: in vivo measurement of thyroidal iodine content by x-ray fluorescent technique

    International Nuclear Information System (INIS)

    Imamura, Keiko

    1979-01-01

    Thyroidal iodine content gives useful information in the fields of physiology, clinical medicine, health physics, etc. Iodine content has previously been determined mainly for resected thyroids. Recently, X-ray fluorescent analysis has been extended as an in vivo technique, first in clinical medicine. Exciting sources used for the analysis of the thyroid are Am-241 or an X-ray tube. Am-241 has a half-life of 438 years and emits γ-rays of 60 keV. The thyroid can be imaged by fluorescent scanning utilizing a strong (10 - 15 Ci) Am-241 source. The examination time is about 15 min and the radiation dose to the gland is about 15 - 60 mrad. Iodine content is determined by a static fluorescent technique equipped with a weaker source of less than 1 Ci. Thyroidal iodine content in normal subjects was analysed by this technique and the results were in good accordance with those obtained by in vitro analysis. The difference in thyroidal iodine content between the Japanese population and other countries is not clear. Application to pathological cases has provided many findings about the iodine content and its distribution which could not be obtained by in vitro analysis. This in vivo technique can be safely performed on infants and pregnant women, and the relatively compact apparatus could be widely used in the study of health physics and environmental problems. (author)

  16. Analysis of the tuning characteristics of microwave plasma source

    Energy Technology Data Exchange (ETDEWEB)

    Miotk, Robert, E-mail: rmiotk@imp.gda.pl; Jasiński, Mariusz [Centre for Plasma and Laser Engineering, The Szewalski Institute of Fluid-Flow Machinery, Polish Academy of Sciences, Fiszera 14, 80-231 Gdańsk (Poland); Mizeraczyk, Jerzy [Department of Marine Electronics, Gdynia Maritime University, Morska 81-87, 81-225 Gdynia (Poland)

    2016-04-15

    In this paper, we present an analysis of the tuning characteristics of a waveguide-supplied metal-cylinder-based nozzleless microwave plasma source. This analysis has enabled estimation of the electron concentration n{sub e} and the electron collision frequency ν in the plasma generated in nitrogen and in a mixture of nitrogen and ethanol vapour. The parameters n{sub e} and ν are the basic quantities that characterize the plasma. The presented new plasma diagnostic method is particularly useful when spectroscopic methods are useless. The presented plasma source is currently used in research on hydrogen production from liquids.

  17. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% systems-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas both in the government and the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it?, What does it do?, What does it cost?, What else will do the task?, and What would that cost? Using logic and a disciplined approach, the result of the Value Analysis performs the necessary functions at a high quality and the lowest overall cost

  18. Screening of oil sources by using comprehensive two-dimensional gas chromatography/time-of-flight mass spectrometry and multivariate statistical analysis.

    Science.gov (United States)

    Zhang, Wanfeng; Zhu, Shukui; He, Sheng; Wang, Yanxin

    2015-02-06

    Using comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry (GC×GC/TOFMS), volatile and semi-volatile organic compounds in crude oil samples from different reservoirs or regions were analyzed for the development of a molecular fingerprint database. Based on the GC×GC/TOFMS fingerprints of the crude oils, principal component analysis (PCA) and cluster analysis were used to distinguish the oil sources and find biomarkers. In a supervised step, the geological characteristics of the crude oils, including thermal maturity, sedimentary environment, etc., are assigned to the principal components. The results show that the tri-aromatic steroid (TAS) series are suitable marker compounds in crude oils for oil screening, and the relative abundances of individual TAS compounds correlate excellently with oil sources. In order to correct for the effects of external factors other than oil source, the variables were defined as content ratios of target compounds, and 13 parameters were proposed for the screening of oil sources. With the developed model, the crude oils were easily discriminated, and the result is in good agreement with the practical geological setting. Copyright © 2014 Elsevier B.V. All rights reserved.

  19. Use of Atomic and Nuclear Techniques in Elemental and Isotopic Analysis

    International Nuclear Information System (INIS)

    2008-01-01

    This book is divided into four chapters presented by six authors, among the best Arab specialists, who have long used atomic and nuclear techniques and recognized their importance and capabilities in scientific research. Atomic and nuclear techniques are very successful in the field of analysis because they are often the only way to carry out the analysis with the required accuracy, while being the cheapest at the same time. A number of these techniques were collected in this book on the basis of their accuracy and how widely they are used in the analysis of material components, especially when the elements of interest are present in insignificant proportions, as in the cases of toxicology, archaeology, nutrition, medicine and other applications.

  20. An overview of the RCA/IAEA activities in the Australasian region using nuclear analysis techniques for monitoring air pollution

    International Nuclear Information System (INIS)

    Markwitz, Andreas

    2005-01-01

    The International Atomic Energy Agency (IAEA), via the Regional Co-operative Agreement (RCA), has identified air particulate matter pollution as a major transboundary environmental issue in the Australasian region. Sixteen countries in the region, spanning from Pakistan to the Philippines and from China to New Zealand, are participating in the regional programme RAS/7/013 'Improved information on urban air quality management in the RCA region', which started in 1997. New Zealand is the lead country for this programme, in which nuclear analytical techniques, such as particle induced X-ray emission (PIXE), neutron activation analysis (NAA) and X-ray fluorescence spectrometry (XRF), are used to measure key elements in PM2.5-0 and PM10-2.5 filters from GENT stacked samplers collected twice weekly. Major sources of air particulate matter pollution are identified using statistical source apportionment techniques. To identify transboundary air particulate matter pollution events, the data are collated in a large database. Additionally, the data are used by end-users in the participating countries in the programme. An overview is presented. (author)

  1. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  2. Authentication of pure L-leucine products manufactured in China by discriminating between plant and animal sources using nitrogen stable isotope technique.

    Science.gov (United States)

    Huang, Jingyu; Nkrumah, Philip N; Appiah-Sefah, Gloria; Tang, Shijiang

    2013-03-01

    L-leucine products, among other branched chain amino acid supplements, are highly susceptible to economically motivated adulteration. Curbing this menace is critical and timely. Hence, the δ15N composition of L-leucine derived from plant and animal sources was estimated. The trophic enrichment phenomenon of the δ15N composition was utilized to elucidate the sources, and we finally established the distinction between the respective sources. Samples of plant sources (maize and soybean) and of animal sources (pig fur and duck feather) were analyzed for δ15N isotopic signatures. An elemental analyzer connected to an isotope ratio mass spectrometer operated in continuous flow mode was utilized. The raw materials were obtained from China. Statistical analysis was performed using descriptive statistics and one-way analysis of variance. The results indicated lower δ15N values in the ranges -0.7344‰ to 2.384‰ and 1.032‰ to 2.064‰ for the maize and soybean samples, respectively, whereas ranges of 3.860‰ to 6.011‰ and 5.875‰ to 6.011‰ were detected in the pig fur and duck feather samples, respectively. The δ15N difference between plant and animal samples was significant (F = 165.0, P = 1.675E-10 for maize and pig fur samples; F = 212.8, P = 0.0001284 for soybean and duck feather samples). It was observed that δ15N trophic enrichment is helpful in elucidating the respective sources. The authors can emphatically assert that the range of δ15N composition of L-leucine derived from plant sources within the study area is -1.000‰ to 3.000‰ whereas the range for animal sources is 4.000‰ to 9.000‰. Practical Application: This study provides a reliable approach for verifying the authenticity of not only L-leucine products but also other branched chain amino acid supplements, and thereby would help in fraud detection of any economically motivated adulteration and mislabeling of these products. When coupled with H and O stable
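
    For reference, the delta notation used above has a simple worked form: δ15N is the per-mil deviation of the sample 15N/14N ratio from the atmospheric N2 (AIR) standard. The sample ratios below are invented to land in the reported plant and animal ranges.

        # Sketch: computing delta-15N (per mil) from isotope ratios.
        R_AIR = 0.0036765                  # 15N/14N of atmospheric N2

        def delta15N(r_sample):
            return (r_sample / R_AIR - 1.0) * 1000.0

        print(delta15N(0.0036801))         # ~ +1 per mil, plant-like
        print(delta15N(0.0036985))         # ~ +6 per mil, animal-like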

  3. A SOUND SOURCE LOCALIZATION TECHNIQUE TO SUPPORT SEARCH AND RESCUE IN LOUD NOISE ENVIRONMENTS

    Science.gov (United States)

    Yoshinaga, Hiroshi; Mizutani, Koichi; Wakatsuki, Naoto

    At some sites of earthquakes and other disasters, rescuers search for people buried under rubble by listening for the sounds they make, so a technique to localize sound sources amidst loud noise will support such search and rescue operations. In this paper, we discuss an experiment performed to test an array signal processing technique which searches for imperceptible sound in loud noise environments. Two loudspeakers simultaneously played a generator noise and a voice 20 dB weaker (1/100 of the power) than the generator noise, at an outdoor space where cicadas were making noise. The sound was received by a horizontally set linear microphone array, 1.05 m in length and consisting of 15 microphones. The direction and distance of the voice were computed, and the voice was extracted and played back as an audible sound by array signal processing.
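
    A minimal delay-and-sum sketch of the direction-finding step is given below; the array geometry mirrors the paper's setup, but the processing is a generic textbook beamformer with assumed sampling parameters, not the authors' algorithm:

        import numpy as np

        fs, c = 16000, 343.0          # sample rate (Hz) and speed of sound (m/s); assumed values
        n_mics, spacing = 15, 0.075   # uniform linear array spanning 1.05 m

        def steered_power(signals, theta):
            """Phase-align the mic signals for a far-field arrival angle theta (rad), then sum."""
            n = signals.shape[1]
            freqs = np.fft.rfftfreq(n, d=1.0 / fs)
            total = np.zeros(freqs.size, dtype=complex)
            for m in range(n_mics):
                delay = m * spacing * np.sin(theta) / c            # inter-mic delay in seconds
                total += np.fft.rfft(signals[m]) * np.exp(2j * np.pi * freqs * delay)
            return np.sum(np.abs(total) ** 2)

        def estimate_direction(signals):                           # signals: (n_mics, n_samples)
            angles = np.deg2rad(np.linspace(-90, 90, 181))
            powers = [steered_power(signals, a) for a in angles]
            return np.rad2deg(angles[int(np.argmax(powers))])      # angle of maximum output power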

  4. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...
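
    The core arithmetic of such a quantitative survey can be sketched in a few lines; the estimates below are invented, and fixed-effect inverse-variance pooling is used purely as an illustration:

        import numpy as np

        estimates = np.array([0.42, 0.35, 0.51, 0.28])   # reported estimates of the same parameter
        std_errs  = np.array([0.10, 0.08, 0.15, 0.12])   # their standard errors

        weights   = 1.0 / std_errs**2                    # inverse-variance weights
        pooled    = np.sum(weights * estimates) / np.sum(weights)
        pooled_se = np.sqrt(1.0 / np.sum(weights))
        print(f"pooled estimate = {pooled:.3f} +/- {pooled_se:.3f}")
        # Plotting precision (1/std_err) against the estimates gives the funnel mentioned above.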

  5. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. Initially, it reviews the most widespread techniques in the field of Takagi-Sugeno fuzzy systems, as well as the most relevant results on polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models obtained by Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...

  6. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of the separation of wine constituents by GC, HPLC and CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to the regulatory laws governing the international market, as well as in understanding the fundamental aspects of grape and wine production so as to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  7. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of the separation of wine constituents by GC, HPLC and CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to the regulatory laws governing the international market, as well as in understanding the fundamental aspects of grape and wine production so as to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  8. Blind source separation problem in GPS time series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2016-04-01

    A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered data-driven methods. A widely used technique is principal component analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well in solving the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly because PCA minimizes a misfit calculated with an L2 norm (χ2), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mix of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA, and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
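
    The PCA-versus-ICA contrast at the heart of this abstract can be reproduced on toy data; in the sketch below the synthetic signals and mixing matrix are invented, and standard FastICA stands in for the paper's vbICA method:

        import numpy as np
        from sklearn.decomposition import PCA, FastICA

        # Two synthetic "deformation" sources mixed into three "station" records.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 8, 2000)
        s1 = np.sign(np.sin(3 * t))              # step-like (co-seismic flavour)
        s2 = np.sin(2 * np.pi * t)               # seasonal flavour
        S = np.c_[s1, s2] + 0.05 * rng.standard_normal((t.size, 2))

        A = np.array([[1.0, 0.5], [0.4, 1.2], [0.8, 0.9]])   # mixing matrix (invented)
        X = S @ A.T

        ica_sources = FastICA(n_components=2, random_state=0).fit_transform(X)
        pca_scores  = PCA(n_components=2).fit_transform(X)
        # ICA typically recovers s1 and s2 (up to sign/scale); PCA returns
        # uncorrelated but generally still mixed components.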

  9. Extracting functional components of neural dynamics with Independent Component Analysis and inverse Current Source Density.

    Science.gov (United States)

    Lęski, Szymon; Kublik, Ewa; Swiejkowski, Daniel A; Wróbel, Andrzej; Wójcik, Daniel K

    2010-12-01

    Local field potentials have good temporal resolution but are blurred due to the slow spatial decay of the electric field. For simultaneous recordings on regular grids, one can efficiently reconstruct the current sources (CSD) using the inverse Current Source Density method (iCSD). It is then possible to decompose the resultant spatiotemporal information about the current dynamics into functional components using Independent Component Analysis (ICA). We show, on test data modeling recordings of evoked potentials on a grid of 4 × 5 × 7 points, that meaningful results are obtained with spatial ICA decomposition of the reconstructed CSD. The components obtained through decomposition of the CSD are better defined and allow easier physiological interpretation than the results of a similar analysis of the corresponding evoked potentials in the thalamus. We show that spatiotemporal ICA decompositions can perform better for certain types of sources, but this does not seem to be the case for the experimental data studied. Having found the appropriate approach to decomposing neural dynamics into functional components, we use the technique to study the somatosensory evoked potentials recorded on a grid spanning a large part of the forebrain. We discuss two example components associated with the first waves of activation of the somatosensory thalamus. We show that the proposed method brings up new, more detailed information on the time and spatial location of specific activity conveyed through various parts of the somatosensory thalamus in the rat.

  10. Earthquake source imaging by high-resolution array analysis at regional distances: the 2010 M7 Haiti earthquake as seen by the Venezuela National Seismic Network

    Science.gov (United States)

    Meng, L.; Ampuero, J. P.; Rendon, H.

    2010-12-01

    Back projection of teleseismic waves based on array processing has become a popular technique for earthquake source imaging, in particular to track the areas of the source that generate the strongest high-frequency radiation. The technique has previously been applied to study the rupture process of the Sumatra earthquake and the supershear rupture of the Kunlun earthquake. Here we attempt to image the Haiti earthquake using data recorded by the Venezuela National Seismic Network (VNSN). The network is composed of 22 broad-band stations with an East-West oriented geometry, located approximately 10 degrees away from Haiti in the direction perpendicular to the Enriquillo fault strike. This is the first opportunity to exploit the privileged position of the VNSN to study large earthquake ruptures in the Caribbean region, and also a great opportunity to explore back projection of the crustal Pn phase at regional distances, which provides unique complementary insights to teleseismic source inversions. The challenge in the analysis of the 2010 M7.0 Haiti earthquake is its very compact source region, possibly shorter than 30 km, which is below the resolution limit of standard back projection techniques based on beamforming. Results of back projection analysis using the teleseismic USArray data reveal few details of the rupture process. To overcome the classical resolution limit we explored the MUltiple SIgnal Classification (MUSIC) method, a high-resolution array processing technique based on the signal-noise orthogonality in the eigenspace of the data covariance, which achieves both enhanced resolution and a better ability to resolve closely spaced sources. We experiment with various synthetic earthquake scenarios to test the resolution, and find that MUSIC provides at least 3 times higher resolution than beamforming. We also study the inherent bias due to the interference of coherent Green's functions, which leads to a potential quantification
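
    For readers unfamiliar with MUSIC, a narrowband uniform-linear-array version fits in one short function; the sketch below is a generic textbook implementation with assumed parameters, not the seismic array configuration used in the study:

        import numpy as np

        def music_spectrum(X, n_sources, d_over_lambda=0.5, angles_deg=None):
            """X: (n_sensors, n_snapshots) complex baseband data; returns a pseudo-spectrum."""
            if angles_deg is None:
                angles_deg = np.linspace(-90, 90, 361)
            n_sensors = X.shape[0]
            R = X @ X.conj().T / X.shape[1]            # sample covariance matrix
            eigvals, eigvecs = np.linalg.eigh(R)       # eigenvalues in ascending order
            En = eigvecs[:, : n_sensors - n_sources]   # noise subspace (smallest eigenvalues)
            spectrum = []
            for theta in np.deg2rad(angles_deg):
                a = np.exp(-2j * np.pi * d_over_lambda *
                           np.arange(n_sensors) * np.sin(theta))      # steering vector
                spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
            return np.asarray(angles_deg), np.asarray(spectrum)       # peaks mark source directions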

  11. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)
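
    The core probability bookkeeping that such a package automates can be illustrated with independent basic events and AND/OR gates; the event names and probabilities below are invented:

        # Toy fault-tree evaluation for independent basic events.
        def p_or(*probs):   # output event occurs if any input event occurs
            q = 1.0
            for p in probs:
                q *= (1.0 - p)
            return 1.0 - q

        def p_and(*probs):  # output event occurs only if all input events occur
            result = 1.0
            for p in probs:
                result *= p
            return result

        valve_fails, sensor_fails, operator_error = 1e-3, 5e-4, 1e-2
        top = p_or(p_and(valve_fails, sensor_fails), operator_error)
        print(f"top event probability = {top:.3e}")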

  12. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  13. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and therefore carry provenance information and age characteristics. Analyzing and researching ancient ceramics with modern analytical methods is the scientific foundation of the study of Chinese porcelain. According to the properties of nuclear analysis techniques, their functions and applications are discussed. (authors)

  14. A review of second law techniques applicable to basic thermal science research

    Science.gov (United States)

    Drost, M. Kevin; Zamorski, Joseph R.

    1988-11-01

    This paper reports the results of a review of second law analysis techniques which can contribute to basic research in the thermal sciences. The review demonstrated that second law analysis has a role in basic thermal science research. Unlike traditional techniques, second law analysis accurately identifies the sources and location of thermodynamic losses. This allows the development of innovative solutions to thermal science problems by directing research to the key technical issues. Two classes of second law techniques were identified as being particularly useful. First, system and component investigations can provide information of the source and nature of irreversibilities on a macroscopic scale. This information will help to identify new research topics and will support the evaluation of current research efforts. Second, the differential approach can provide information on the causes and spatial and temporal distribution of local irreversibilities. This information enhances the understanding of fluid mechanics, thermodynamics, and heat and mass transfer, and may suggest innovative methods for reducing irreversibilities.
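
    As a concrete instance of locating a thermodynamic loss, the following back-of-envelope sketch computes the entropy generated by heat transfer across a finite temperature difference and the corresponding lost work (Gouy-Stodola theorem); all numbers are illustrative:

        Q   = 1000.0   # W, heat duty
        T_h = 600.0    # K, hot side
        T_c = 300.0    # K, cold side
        T_0 = 298.15   # K, ambient (dead-state) temperature

        S_gen  = Q * (1.0 / T_c - 1.0 / T_h)     # W/K, entropy generation rate
        W_lost = T_0 * S_gen                     # W, lost work (Gouy-Stodola)
        print(f"S_gen = {S_gen:.2f} W/K, lost work = {W_lost:.1f} W")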

  15. Characterisation of air particulate matter in Klang Valley by neutron activation analysis technique

    International Nuclear Information System (INIS)

    Mohd Suhaimi Hamzah; Shamsiah Abd Rahman; Mohd Khalid Matori; Abd Khalik Wood

    2000-01-01

    Air particulate matter is known to affect human health, impair visibility and contribute to climate change. Studying air particulate matter in terms of particle size and chemical content is very important for indicating the quality of air in a sampling area. Information on the concentrations of important constituents in air particles can be used to identify some of the emission sources which contribute to the pollution problem. The data collected may also be used as a basis for designing a strategy to overcome the air pollution problem in the area. The study involved sampling of airborne dust at two stations, one in Bangi and the other in Kuala Lumpur, using Gent Stack Sampler units. Each sampler is capable of collecting particles smaller than 2.5 micron (PM 2.5) and between 2.5-10 micron on two different filters simultaneously. The filters were measured for mass, elemental carbon and elemental concentrations using analytical equipment and techniques including reflectometry and Neutron Activation Analysis. The results of the analysis of samples collected in 1997-1998 are discussed. (author)

  16. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  17. Electron-volt spectroscopy at a pulsed neutron source using a resonance detector technique

    CERN Document Server

    Andreani, C; Senesi, R; Gorini, G; Tardocchi, M; Bracco, A; Rhodes, N; Schooneveld, E M

    2002-01-01

    The effectiveness of the neutron resonance detector spectrometer for deep inelastic neutron scattering measurements has been assessed by measuring Pb scattering on the eVS spectrometer at the ISIS pulsed neutron source, using natural U foils as (n,gamma) resonance converters. A conventional NaI scintillator with massive shielding was used as the gamma detector. A neutron energy window up to 90 eV, including four distinct resonance peaks, has been assessed. A net decrease in the intrinsic width of the 6.6 eV resonance peak has also been demonstrated by employing the double-difference spectrum technique with two uranium foils of different thickness.

  18. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    NoC-specific parameters have a huge impact on the performance and implementation costs of NoCs. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas more accurate techniques are required in subsequent design stages.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators, and an FPGA-based emulator. By conducting NoC experiments with NoC sizes from 9 to 36 functional units and various traffic patterns, the characteristics of these techniques concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design stages in an automated NoC design flow.

  19. Transient thermal stress analysis of a near-edge elliptical defect in a semi-infinite plate subjected to a moving heat source

    International Nuclear Information System (INIS)

    Mingjong Wang; Weichung Wang

    1994-01-01

    In this paper, the maximum transient thermal stresses on the boundary of a near-edge elliptical defect in a semi-infinite thin plate, whose edge is subjected to a moving heat source, were determined by the digital photoelastic technique. The relationships between the maximum transient thermal stresses and the size and inclination of the elliptical defect, the minimum distance from the elliptical defect to the plate edge, and the speed of the moving heat source were also studied. Finally, using a statistical analysis package, the variations of the maximum transient thermal stresses were correlated with the time, the minimum distance between the edge and the elliptical defect, the temperature difference, and the speed of the moving heat source. (author)

  20. Analysis of deployment techniques for webbased applications in SMEs

    OpenAIRE

    Browne, Cathal

    2011-01-01

    The Internet is no longer just a source for accessing information; it has become a valuable medium for social networking and software services. Web browsers can now access entire software systems available online to provide the user with a range of services. The concept of software as a service (SaaS) was born out of this. The number of development techniques and frameworks for such web applications has grown rapidly, and much research and development has been carried out on adva...

  1. An investigation of tungsten by neutron activation techniques

    International Nuclear Information System (INIS)

    Svetsreni, R.

    1978-01-01

    This investigation used neutrons from a 5 Ci plutonium-beryllium source to analyse the amount of tungsten in tungsten oxide extracted from tungsten ores, in slag, and in a tungsten alloy of tungsten, iron and carbon. The technique was neutron activation analysis with a 3'' x 3'' NaI(Tl) gamma detector and a 1024-channel multichannel analyzer. A dilution technique was used, mixing Fe2O3 or pure sand into the sample before irradiation. In this study the self-shielding effect in the analysis of tungsten was overcome, and the detection limit for tungsten in the sample was about 0.5%

  2. The quantitative analysis of 163Ho source by PIXE

    International Nuclear Information System (INIS)

    Sera, K.; Ishii, K.; Fujioka, M.; Izawa, G.; Omori, T.

    1984-01-01

    We have been studying electron capture in 163Ho as a method for determining the mass of the electron neutrino. The 163Ho sources were produced with the 164Dy(p,2n) reaction by means of a method of internal irradiation. We applied the PIXE method to determine the total number of 163Ho atoms in the source. Proton beams of 3 MeV and an "external standard" method were employed for nondestructive analysis of the 163Ho source, complemented by an additional "internal standard" method. (author)

  3. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in MENT has been developed and tested. The possibilities of MENT are demonstrated by the doublet structure analysis of noisy experimental data, and the MENT results are compared with those of the Fourier algorithm technique without regularization. The tolerable noise level is 30% for MENT, but only 0.1% for the Fourier algorithm

  4. Motor current and leakage flux signature analysis technique for condition monitoring

    International Nuclear Information System (INIS)

    Pillai, M.V.; Moorthy, R.I.K.; Mahajan, S.C.

    1994-01-01

    Until recently, analysis of vibration signals was the only means available to predict the state of health of plant equipment. Motor current and leakage magnetic flux signature analysis is acquiring importance as a technique for detecting incipient damage in electrical machines, and as a supplementary technique for diagnostics of driven equipment such as centrifugal and reciprocating pumps. The state of health of the driven equipment is assessed by analysing the time signal, the frequency spectrum and trends. For example, the pump vane frequency, piston stroke frequency, gear frequency and bearing frequencies show up in the current and flux spectra. By maintaining a periodic record of the amplitudes of the various frequency lines in the spectra, it is possible to follow the trend of deterioration of parts and components of the pump. Problems arising out of inappropriate mechanical alignment of vertical pumps are easily identified by a combined analysis of current, flux and vibration signals. It is found that the current signature analysis technique is in itself a sufficient method for analysing the state of health of reciprocating pumps and compressors. (author). 10 refs., 4 figs
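
    The spectral step is easy to demonstrate: the sketch below synthesizes a toy motor current containing the supply line plus a weak fault line, and recovers both from the FFT (all frequencies and amplitudes are invented):

        import numpy as np

        fs = 5000.0                                           # sample rate (Hz)
        t = np.arange(0, 2.0, 1 / fs)
        current = (10.0 * np.sin(2 * np.pi * 50 * t)          # 50 Hz supply component
                   + 0.2 * np.sin(2 * np.pi * 137 * t))       # weak fault line (e.g. vane/bearing)
        current += 0.05 * np.random.default_rng(4).standard_normal(t.size)

        spectrum = np.abs(np.fft.rfft(current)) / t.size      # ~A/2 at each sinusoid's bin
        freqs = np.fft.rfftfreq(t.size, d=1 / fs)
        peak_freqs = freqs[spectrum > 0.05]                   # crude line detection
        print(peak_freqs)                                     # expect lines near 50 Hz and 137 Hz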

  5. Model Predictive Control techniques with application to photovoltaic, DC Microgrid, and a multi-sourced hybrid energy system

    Science.gov (United States)

    Shadmand, Mohammad Bagher

    Renewable energy sources continue to gain popularity. However, two major limitations prevent their widespread adoption: the availability and variability of the electricity generated, and the cost of the equipment. The focus of this dissertation is Model Predictive Control (MPC) for optimally sized photovoltaic (PV), DC microgrid, and multi-sourced hybrid energy systems. The main applications considered are: maximum power point tracking (MPPT) by MPC, droop predictive control of a DC microgrid, MPC of a grid-interaction inverter, and MPC of a capacitor-less VAR compensator based on a matrix converter (MC). This dissertation first investigates a multi-objective optimization technique for a hybrid distribution system. The variability of a high-penetration PV scenario is also studied when incorporated into the microgrid concept. Emerging PV technologies have enabled the creation of contoured and conformal PV surfaces; the effect of using non-planar PV modules on variability is also analyzed. The proposed predictive control to achieve the maximum power point for isolated and grid-tied PV systems speeds up the control loop, since it predicts the error before the switching signal is applied to the converter. The low conversion efficiency of PV cells means the system must operate at the maximum possible power point at all times to be economical; the proposed MPPT technique can thus capture more energy than conventional MPPT techniques from the same amount of installed solar panels. Because of the MPPT requirement, the output voltage of the converter may vary; therefore a droop control is needed to feed multiple arrays of photovoltaic systems to a DC bus in a microgrid community. Development of a droop control technique by means of predictive control is another application of this dissertation. Reactive power, denoted as Volt Ampere Reactive (VAR), has several undesirable consequences on AC power system networks, such as reduction in power transfer capability and increase in

  6. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia); Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)

    1997-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
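
    The age equation stated above is simple enough to write down directly; the values in this sketch are hypothetical:

        # Luminescence age = accumulated dose / annual dose rate (values invented).
        accumulated_dose = 25.0      # Gy, equivalent dose since the dated event
        annual_dose_rate = 2.5e-3    # Gy per year, from minor/trace element analyses
        age_years = accumulated_dose / annual_dose_rate
        print(f"luminescence age = {age_years:.0f} years")   # 10000 years here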

  7. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  8. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
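
    A minimal unsupervised pipeline of the kind described can be sketched with PCA followed by clustering; the synthetic profiles below are invented, and k-means stands in for the fuzzy c-means and self-organizing-map methods named above:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        # 150 synthetic current profiles (20 depth bins each) from three regimes.
        rng = np.random.default_rng(1)
        profiles = np.vstack([rng.normal(loc=m, scale=0.3, size=(50, 20))
                              for m in (0.2, 0.8, 1.5)])

        scores = PCA(n_components=2).fit_transform(profiles)          # dimensionality reduction
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
        # Cluster-validity indices (e.g. silhouette) can guide the choice of cluster count.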

  9. Isotopic neutron sources for neutron activation analysis

    International Nuclear Information System (INIS)

    Hoste, J.

    1988-06-01

    This User's Manual is an attempt to provide, for teaching and training purposes, a series of well-thought-out demonstration experiments in neutron activation analysis based on the utilization of an isotopic neutron source. In some cases, these ideas can be applied to solve practical analytical problems. 19 refs, figs and tabs

  10. Air pollution studies in Tianjing city using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    Ni Bangfa; Tian Weizhi; Nie Nuiling; Wang Pingsheng

    1999-01-01

    Airborne samples were collected at two sites, in industrial and residential areas of Tianjing city, during February and June using a PM-10 sampler and analyzed by NAA techniques. A comparison of air pollution between urban and rural areas of Tianjing city was made using neutron activation analysis and other data analysis techniques. (author)

  11. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    An equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite difference time domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. The method is used for estimating the crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects; both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results: the crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
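
    The flavour of such an FDTD scheme is captured by the single-line (uncoupled) telegrapher's equations below; the per-unit-length values are invented, and the bundled model in the paper would add mutual inductance and capacitance terms:

        import numpy as np

        # Bare-bones 1-D FDTD update for a single lossy RLC line (G neglected).
        nz, nt = 200, 1000
        L, C, R = 2.5e-7, 1.0e-10, 10.0        # per-unit-length values (illustrative)
        dz = 1.0e-3
        dt = 0.9 * dz * np.sqrt(L * C)         # time step kept inside the Courant limit

        V = np.zeros(nz)                       # node voltages
        I = np.zeros(nz - 1)                   # branch currents (staggered grid)
        for n in range(nt):
            V[0] = 1.0 if n * dt < 50e-12 else 0.0            # simple pulse source
            I += -(dt / L) * ((V[1:] - V[:-1]) / dz + R * I)  # dI/dt = -(dV/dz + R*I)/L
            V[1:-1] += -(dt / C) * (I[1:] - I[:-1]) / dz      # dV/dt = -(dI/dz)/C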

  12. Colour and shape analysis techniques for weed detection in cereal fields

    DEFF Research Database (Denmark)

    Pérez, A.J; López, F; Benlloch, J.V.

    2000-01-01

    Information on weed distribution within the field is necessary to implement spatially variable herbicide application. This paper deals with the development of near-ground image capture and processing techniques in order to detect broad-leaved weeds in cereal crops under actual field conditions. The proposed methods use colour information to discriminate between vegetation and background, whilst shape analysis techniques are applied to distinguish between crop and weeds. The determination of crop row position helps to reduce the number of objects to which shape analysis techniques are applied. The performance of the algorithms was assessed by comparing the results with a human classification, providing an acceptable success rate. The study has shown that, despite the difficulties in accurately determining the number of seedlings (as in visual surveys), it is feasible to use image processing techniques

  13. Source apportionment of toxic chemical pollutants at Trombay region

    International Nuclear Information System (INIS)

    Sahu, S.K.; Pandit, G.G.; Puranik, V.D.

    2007-05-01

    Through anthropogenic activities such as industrial production and transportation, a wide range of chemical pollutants, including trace and toxic metals, pesticides and polycyclic aromatic hydrocarbons, eventually find their way into various environmental compartments. One of the main issues of environmental pollution is the chemical composition of aerosols and their sources; in spite of all efforts, a considerable part of the atmospheric aerosol mass is still not accounted for. This report describes some of the activities of the Environmental Assessment Division which have direct relevance to public health and regulatory bodies. Extensive studies were carried out in our laboratories over the years for the Trombay site, on organic as well as inorganic pollution in the environment, to understand the inter-compartmental behaviour of these chemical pollutants. In this report an attempt has been made to collect different size-fractionated ambient aerosols and to quantify the percentage contribution of each size fraction to the total aerosol mass. Subsequently, an effort has been made at chemical characterization (inorganic, organic and carbon content) of this particulate matter using different analytical techniques. The comprehensive data set on the chemical characterization of particulate matter thus generated is being used with receptor modeling techniques to identify the possible sources contributing to the observed concentrations of the measured pollutants, and its use in receptor modeling has been helpful in distinguishing the source types in a better way. Receptor modeling techniques are powerful tools that can be used to locate sources of pollutants to the atmosphere; their major advantage is that actual ambient data are used to apportion source contributions, negating the need for dispersion calculations. Pollution sources affecting the sampling site were statistically identified using varimax rotated factor analysis of
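
    The varimax-rotated factor analysis mentioned at the end can be sketched on synthetic data; the two invented source profiles below (a "traffic-like" and a "soil-like" signature over four species) are placeholders for real ambient measurements:

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(2)
        traffic = rng.lognormal(size=(100, 1)) * [1.0, 0.8, 0.1, 0.05]   # Pb/Zn-heavy profile
        soil    = rng.lognormal(size=(100, 1)) * [0.1, 0.05, 1.0, 0.9]   # Al/Fe-heavy profile
        X = traffic + soil + 0.05 * rng.standard_normal((100, 4))

        fa = FactorAnalysis(n_components=2, rotation="varimax").fit(X)
        print(fa.components_)   # loadings show which species co-vary -> candidate sources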

  14. Bioimaging of cells and tissues using accelerator-based sources.

    Science.gov (United States)

    Petibois, Cyril; Cestelli Guidi, Mariangela

    2008-07-01

    A variety of techniques exist that provide chemical information in the form of a spatially resolved image: electron microprobe analysis, nuclear microprobe analysis, synchrotron radiation microprobe analysis, secondary ion mass spectrometry, and confocal fluorescence microscopy. Linear (LINAC) and circular (synchrotron) particle accelerators have been constructed worldwide to provide the scientific community with unprecedented analytical performance. These facilities match at least one of the three analytical features required for the biological field, the first being a spatial resolution sufficient for single-cell work. The pros and cons of the most popular techniques that have been implemented on accelerator-based sources to address analytical issues on biological specimens are discussed.

  15. Data analysis and source modelling for LISA

    International Nuclear Information System (INIS)

    Shang, Yu

    2014-01-01

    Gravitational waves (GWs) are one of the most important predictions of general relativity. Beyond the direct proof of their existence, there are already several ground-based detectors (such as LIGO and GEO) and a planned future space mission (LISA) which aim to detect GWs directly. A GW carries a large amount of information about its source; extracting this information can help us uncover the physical properties of the source, and even open a new window for understanding the Universe. Hence, GW data analysis will be a challenging task in seeking GWs. In this thesis, I present two works on data analysis for LISA. In the first work, we introduce an extended multimodal genetic algorithm which utilizes the properties of the signal and the detector response function to analyze the data from the third round of the Mock LISA Data Challenge. We found all five sources present in the data and recovered the coalescence time, chirp mass, mass ratio and sky location with reasonable accuracy. As for the orbital angular momentum and the two spins of the black holes, we found a large number of widely separated modes in the parameter space with similar maximum likelihood values. The performance of this method is comparable, if not superior, to already existing algorithms. In the second work, we introduce a new phenomenological waveform model for the extreme mass ratio inspiral (EMRI) system. This waveform consists of a set of harmonics with constant amplitude and slowly evolving phase, which we decompose in a Taylor series. We use these phenomenological templates to detect the signal in the simulated data and then, assuming a particular EMRI model, estimate the physical parameters of the binary with high precision. The results show that our phenomenological waveform is very effective in the data analysis of EMRI signals.

  16. Nuclear techniques for on-line analysis in the mineral and energy industries

    International Nuclear Information System (INIS)

    Sowerby, B.D.; Watt, J.S.

    1994-01-01

    Nuclear techniques are the basis of many on-line analysis systems which are now widely used in the mineral and energy industries. Some of the systems developed by the CSIRO depend entirely on nuclear techniques; others use a combination of nuclear techniques and microwave, capacitance, or ultrasonic techniques. The continuous analysis and rapid response of these CSIRO systems have led to improved control of mining, processing and blending operations, with increased productivity valued at A$50 million per year to Australia and $90 million per year worldwide. This paper reviews developments in nuclear on-line analysis systems by the On-Line Analysis Group in CSIRO at Lucas Heights. Commercialised systems based on this work analyse mineral and coal slurries and determine the ash and moisture contents of coal and coke on conveyors. This paper also reviews two on-line nuclear analysis systems recently developed and licensed to industry: one for determining the mass flow rates of oil/water/gas mixtures in pipelines, and one for determining the moisture, specific energy, ash and fouling index of low-rank coals. 8 refs., 3 tabs., 4 figs

  17. Quality-assurance techniques used with automated analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Killian, E.W.; Koeppen, L.D.; Femec, D.A.

    1994-01-01

    In the course of developing gamma-ray spectrum analysis algorithms for use by the Radiation Measurements Laboratory at the Idaho National Engineering Laboratory (INEL), several techniques have been developed that enhance and verify the quality of the analytical results. The use of these quality-assurance techniques is critical when gamma-ray analysis results from low-level environmental samples are used in risk assessment or site restoration and cleanup decisions. This paper describes four of the quality-assurance techniques that are in routine use at the laboratory. They are used for all types of samples, from reactor effluents to environmental samples. The techniques include: (1) the use of precision pulsers (with subsequent removal) to validate the correct operation of the spectrometer electronics for each and every spectrum acquired, (2) the use of naturally occurring and cosmically induced radionuclides in samples to help verify that the data acquisition and analysis were performed properly, (3) the use of an ambient background correction technique that involves superimposing ("mapping") sample photopeak fitting parameters onto multiple background spectra for accurate and more consistent quantification of the background activities, (4) the use of interactive, computer-driven graphics to review the automated locating and fitting of photopeaks and to allow for manual fitting of photopeaks

  18. Water quality assessment and apportionment of pollution sources using APCS-MLR and PMF receptor modeling techniques in three major rivers of South Florida.

    Science.gov (United States)

    Haji Gholizadeh, Mohammad; Melesse, Assefa M; Reddi, Lakshmi

    2016-10-01

    In this study, principal component analysis (PCA), factor analysis (FA), and the absolute principal component score-multiple linear regression (APCS-MLR) receptor modeling technique were used to assess the water quality and to identify and quantify the potential pollution sources affecting the water quality of three major rivers of South Florida. For this purpose, a 15-year (2000-2014) dataset of 12 water quality variables, covering 16 monitoring stations and comprising approximately 35,000 observations, was used. The PCA/FA method identified five potential pollution sources in the wet season and four in the dry season, and the effective mechanisms, rules and causes were explained. APCS-MLR apportioned their contributions to each water quality variable. Results showed that point-source discharges of agricultural waste and domestic and industrial wastewater were the major sources of river water contamination. The studied variables were categorized into three groups, namely nutrients (total Kjeldahl nitrogen, total phosphorus, total phosphate, and ammonia-N), water murkiness conducive parameters (total suspended solids, turbidity, and chlorophyll-a), and salt ions (magnesium, chloride, and sodium), and the average contributions of the different potential pollution sources to these categories were considered separately. The data matrix was also subjected to the PMF receptor model using the EPA PMF-5.0 program, and the described two-way model was performed for the PMF analyses. Comparison of the results of the PMF and APCS-MLR models showed some significant differences in the estimated contribution of each potential pollution source, especially in the wet season. It was concluded that the APCS-MLR receptor modeling approach appears to be more physically plausible for the current study. It is believed that the results of the apportionment could be very useful to the local authorities for the control and
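
    A schematic outline of the APCS-MLR steps is sketched below; it follows the usual recipe (standardize, extract components, subtract the score of an artificial zero-concentration sample, then regress each variable on the absolute scores) and is not the study's validated implementation:

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression

        def apcs_mlr(X, n_factors):
            """X: (samples x water-quality variables) matrix; returns per-variable contributions."""
            scaler = StandardScaler().fit(X)
            Z = scaler.transform(X)
            pca = PCA(n_components=n_factors).fit(Z)
            scores = pca.transform(Z)
            z0 = scaler.transform(np.zeros((1, X.shape[1])))   # artificial zero-concentration sample
            apcs = scores - pca.transform(z0)                  # absolute principal component scores
            contributions = {}
            for j in range(X.shape[1]):
                reg = LinearRegression().fit(apcs, X[:, j])
                contributions[j] = reg.coef_ * apcs.mean(axis=0)   # mean contribution per factor
            return contributions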

  19. I. Forensic data analysis by pattern recognition. Categorization of white bond papers by elemental composition. II. Source identification of oil spills by pattern recognition analysis of natural elemental composition. III. Improving the reliability of factor analysis of chemical measured analytical data by utilizing the measured analytical uncertainity. IV. Elucidating the structure of some clinical data

    International Nuclear Information System (INIS)

    Duewer, D.L.

    1977-01-01

    Pattern recognition techniques are applied to the analysis of white bond papers and the problem of determining the source of an oil spill. In each case, an elemental analysis by neutron activation is employed. For the determination of source of oil spills, the field sample was weathered prior to activation analysis. A procedure for including measured analytical uncertainty into data analysis methodology is discussed, with particular reference to factor analysis. The suitability of various dispersion matrices and matrix rank determination criteria for data having analytical uncertainty is investigated. A criterion useful for judging the number of factors insensitive to analytical uncertainty is presented. A model data structure for investigating the behavior of factor analysis techniques in a known, controlled manner is described and analyzed. A chemically interesting test data base having analytical uncertainty is analyzed and compared with the model data. The data structure of 22 blood constituents in three categories of liver disease (viral or toxic hepatitis, alcoholic liver diseases and obstructive processes) is studied using various statistical and pattern recognition techniques. Comparison of classification results on the original data, in combination with principal component analysis, suggests a possible underlying structure for the data. This model structure is tested by the application of two simple data transformations. Analysis of the transformed data appears to confirm that some basic understanding of the studied data has been achieved

  20. Comparative Study of Radon Concentration with Two Techniques and Elemental Analysis in Drinking Water Samples of the Jammu District, Jammu and Kashmir, India.

    Science.gov (United States)

    Kumar, Ajay; Kaur, Manpreet; Mehra, Rohit; Sharma, Dinesh Kumar; Mishra, Rosaline

    2017-10-01

    The level of radon concentration has been assessed using the Advanced SMART RnDuo technique in 30 drinking water samples from Jammu district, Jammu and Kashmir, India. The water samples were collected from wells, hand pumps, submersible pumps, and stored waters. Fourteen randomly chosen values of radon concentration in water sources obtained with the SMART RnDuo technique were compared and cross-checked against a RAD7 device, and a good positive correlation (R = 0.88) was observed between the two techniques. The overall radon concentration in the various water sources ranged from 2.45 to 18.43 Bq L⁻¹, with a mean value of 8.24 ± 4.04 Bq L⁻¹, which agrees well with the recommended limit suggested by the European Commission and UNSCEAR. However, higher mean radon concentrations were found in groundwater drawn from wells and hand and submersible pumps than in stored water. The total annual effective dose due to radon inhalation and ingestion ranged from 6.69 to 50.31 μSv y⁻¹ with a mean value of 22.48 ± 11.03 μSv y⁻¹, which lies within the safe limit (100 μSv y⁻¹) suggested by WHO. Heavy metal analysis was also carried out on the various water sources using an atomic absorption spectrophotometer (AAS), and the highest values of heavy metals were found mostly in groundwater samples. The results were compared with those of Indian and international organizations such as WHO and the EU Council. Among all the samples, the elemental analysis did not exceed the permissible limits.

  1. Method for assessing the probability of accumulated doses from an intermittent source using the convolution technique

    International Nuclear Information System (INIS)

    Coleman, J.H.

    1980-10-01

    A technique is discussed for computing the probability distribution of the accumulated dose received by an arbitrary receptor as a result of several single releases from an intermittent source. The probability density of the accumulated dose is the convolution of the probability densities of the doses from the individual releases. Emissions are not assumed to be constant over the brief release period. The fast Fourier transform is used in the calculation of the convolution
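
    The FFT-based convolution described here is a few lines of numpy; the single-release density below is invented, and zero-padding (omitted for brevity) would normally be added to avoid circular wrap-around:

        import numpy as np

        dose = np.linspace(0.0, 10.0, 1024)
        dd = dose[1] - dose[0]
        pdf_single = np.exp(-dose)            # density of the dose from one release (made up)
        pdf_single /= pdf_single.sum() * dd   # normalize to unit area

        n_releases = 3
        spectrum = np.fft.rfft(pdf_single) ** n_releases    # convolution = product in Fourier space
        pdf_total = np.fft.irfft(spectrum, n=dose.size) * dd ** (n_releases - 1)
        # pdf_total approximates the density of the accumulated dose from 3 independent releases.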

  2. Overcomplete Blind Source Separation by Combining ICA and Binary Time-Frequency Masking

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan

    2005-01-01

    a novel method for over-complete blind source separation. Two powerful source separation techniques have been combined, independent component analysis and binary time-frequency masking. Hereby, it is possible to iteratively extract each speech signal from the mixture. By using merely two microphones we...

  3. Analysis of rocks involving the x-ray diffraction, infrared and thermal gravimetric techniques

    International Nuclear Information System (INIS)

    Ikram, M.; Rauf, M.A.; Munir, N.

    1998-01-01

    Chemical analyses of rocks and minerals are usually obtained by a number of analytical techniques. The purpose of the present work is to investigate the chemical composition of the rock samples and also to find out how closely the results obtained by different instrumental methods are related. Chemical tests were performed before using the instrumental techniques in order to determine the nature of these rocks; the chemical analysis indicated mainly the presence of carbonate, and hence the carbonate nature of the rocks. The X-ray diffraction, infrared spectroscopy and thermal gravimetric analysis techniques were used for determining the chemical composition of the samples, and the results obtained with these techniques have shown a great deal of similarity. (author)

  4. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainties affecting performance assessments, as well as their propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses, and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as the input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site
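
    A toy Monte Carlo propagation run makes the point about output statistics concrete; the dose model and input distributions below are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 10_000
        k_d  = rng.lognormal(mean=0.0, sigma=1.0, size=n)   # sorption coefficient (order-of-magnitude spread)
        flow = rng.uniform(0.1, 2.0, size=n)                # groundwater flux
        dose = 1.0e-4 * flow / (1.0 + k_d)                  # stand-in dose model

        print("mean           :", dose.mean())
        print("median         :", np.median(dose))
        print("90th percentile:", np.percentile(dose, 90))
        # The spread between these statistics illustrates why a robust percentile may be
        # preferred over the mean when comparing with acceptance criteria.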

  5. Radionuclide and electric accelerator sources for food irradiation

    International Nuclear Information System (INIS)

    Lagunas-Solar, M.C.; Matthews, S.M.

    1985-01-01

    Radiation processing of food requires radiation sources with high intensity, penetrability, reliability, and the flexibility to be adapted to current food processing techniques. Current proposed regulations limit the radiation sources which can be utilized to radionuclides or electrically-driven accelerators. Therefore, the power, throughput, and use efficiency of these sources are important factors affecting the design, installation, operation, and economics of large-scale food-processing facilities. An analysis of the advantages and disadvantages of these sources is presented here, with special attention to the current status of both technologies, and with emphasis on the needs of the food-processing industry. (author)

  6. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  7. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  8. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies to meet reactor operating requirements, e.g., thermal and reactivity margin constraints, while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor at both rated and uprated power conditions.

  9. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting recent advances in the field of Big Data analysis as well as the techniques and tools used to perform it. The book includes 10 distinct chapters providing a concise introduction to Big Data analysis and to recent techniques and environments for it. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in Big Data analysis by adopting parallel, grid, and cloud computing environments.

  10. Chromium Fractions Changes Compared With Total-Cr As Determined by Neutron Activation Analysis Technique

    International Nuclear Information System (INIS)

    Abdel-Sabour, M.F.; Abdou, F.M.; Elwan, I.M.; Al-Salama, Y.J.

    2003-01-01

    Fifteen soil samples were chosen from five different locations in north greater Cairo, Egypt, to represent different soils (alluvial and sandy) as well as different sources of contaminated wastewater (sewage and industrial effluent). Using a sequential extraction technique (extracting the soil with different solutions designed to separate metal fractions), Cr was separated into six operationally defined fractions: water soluble, exchangeable, carbonate bound, Fe-Mn oxide bound, organically bound, and residual. Results for total soil Cr indicated serious accumulation of Cr in soils subjected to prolonged irrigation with contaminated wastewater: total Cr in the tested contaminated soils exceeds the permissible level (75-100 ppm Cr) by several orders of magnitude, particularly in the surface and subsurface layers. The highest accumulation of total Cr down to a depth of 60 cm was observed in soil E. The data showed that values of total Cr determined by the NAA method were always higher than the corresponding values determined either by AAS or calculated after the sequential extraction (SUM) method; T-test analysis showed significant differences between NAA and both the AAS and sequential extraction methods. Although T-test analysis showed significant differences between the total contents determined by the destructive (AAS or SUM) and non-destructive (NAA) analytical techniques, a strong linear relation between NAA and the other tested methods was obtained. Chromium distribution between the different extractants shows that the greatest amounts are found in the residual and Fe-Mn oxide occluded fractions, followed by the carbonate or organic fractions. In most cases the proportion of all tested Cr forms increased in contaminated soil layers, with higher enrichment in the organically bound, Fe-Mn oxide occluded, carbonate, exchangeable and soluble fractions. Results indicate that soil properties have a

  11. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...
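
    A comparative benchmark of this kind can be sketched in a few lines. The snippet below is a hypothetical stand-in using synthetic, imbalanced data and two of the classifier families such studies commonly compare; it is not the paper's experimental setup.

```python
# Hypothetical comparative benchmark: two classifiers, cross-validated on
# synthetic imbalanced data standing in for a public defect data set.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, weights=[0.8, 0.2],
                           random_state=0)  # most modules are not buggy
for clf in (GaussianNB(), DecisionTreeClassifier(random_state=0)):
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{type(clf).__name__}: mean CV accuracy = {acc:.3f}")
```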

  12. Chemometric Analysis for Pollution Source Assessment of Harbour Sediments in Arctic Locations

    DEFF Research Database (Denmark)

    Pedersen, Kristine B.; Lejon, Tore; Jensen, Pernille Erland

    2015-01-01

    Pollution levels, pollutant distribution and potential source assessments based on multivariate analysis (chemometrics) were made for harbour sediments from two Arctic locations: Hammerfest in Norway and Sisimiut in Greenland. High levels of heavy metals were detected in addition to organic pollutants. Preliminary assessments based on principal component analysis (PCA) revealed different sources and pollutant distribution in the sediments of the two harbours. Tributyltin (TBT) was, however, found to originate from point source(s), and the highest concentrations of TBT in both harbours were […] indicated relation primarily to German, Russian and American mixtures in Hammerfest, and American, Russian and Japanese mixtures in Sisimiut. PCA was shown to be an important tool for identifying pollutant sources and differences in pollutant composition in relation to sediment characteristics.
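
    The PCA step described here can be illustrated compactly. The sketch below autoscales a hypothetical sample-by-analyte concentration matrix and projects it onto two principal components; the numbers are invented, not the harbour data.

```python
# Minimal PCA sketch of the chemometric source assessment described above.
# Hypothetical data: rows = sediment samples, columns = analytes.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

analytes = ["Cu", "Pb", "Zn", "TBT", "PAH"]
X = np.array([
    [35.0, 42.0, 110.0,  12.0, 0.8],
    [60.0, 55.0, 180.0,  95.0, 1.1],   # near a suspected TBT point source
    [28.0, 38.0, 100.0,   9.0, 0.7],
    [75.0, 60.0, 210.0, 130.0, 1.4],
])

# Autoscale (zero mean, unit variance) so each analyte weighs equally,
# then project onto the first two principal components.
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
print(scores)  # samples that score together likely share a pollution source
```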

  13. Case studies in the application of probabilistic safety assessment techniques to radiation sources. Final report of a coordinated research project 2001-2003

    International Nuclear Information System (INIS)

    2006-04-01

    Radiation sources are used worldwide in many industrial and medical applications. In general, the safety record associated with their use has been very good. However, accidents involving these sources have occasionally resulted in unplanned exposures to individuals. When assessed prospectively, this type of exposure is termed a 'potential exposure'. The International Commission on Radiological Protection (ICRP) has recommended the assessment of potential exposures that may result from radiation sources and has suggested that probabilistic safety assessment (PSA) techniques may be used in this process. Also, Paragraph 2.13 of the International Basic Safety Standards for Protection against Ionizing Radiation and for the Safety of Radiation Sources (BSS) requires that the authorization process for radiation sources include an assessment of all exposures, including potential exposures, which may result from the use of a radiation source. In light of the ICRP's work described above, and the possibility that PSA techniques could be used in exposure assessments that are required by the BSS, the IAEA initiated a coordinated research project (CRP) to study the benefits and limitations of the application of PSA techniques to radiation sources. The results of this CRP are presented in this publication. It should be noted that these results are based solely on the work performed, and the conclusions drawn, by the research teams involved in this CRP. It is intended that international organizations involved in radiation protection will review the information in this report and will take account of it during the development of guidance and requirements related to the assessment of potential exposures from radiation sources. Also, it is anticipated that the risk insights obtained through the studies will be considered by medical practitioners, facility staff and management, equipment designers, and regulators in their safety management and risk evaluation activities. A draft

  14. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Due to the complexity of the motion, shot put technique is described in phases for easier analysis, easier learning of the technique, and error correction. The technique is continuous, so that in its execution the transition from phase to phase is not noticed. In the previously described phases of the O'Brian spinal shot put technique, a large distance, emptiness and disconnection appear between the initial-position phase and the phase of overtaking the device, which represents a major problem for connecting, training and advancing the technique in primary and secondary education, as well as for students and athletes who are beginners in shot put. Therefore, this work aims to facilitate the methods of training the shot put technique by extending the analysis from four to six phases, which are described and cover the complete O'Brian technique.

  15. Neutronics of the IFMIF neutron source: development and analysis

    International Nuclear Information System (INIS)

    Wilson, P.P.H.

    1999-01-01

    The accurate analysis of this system required the development of a code system and methodology capable of modelling the various physical processes. A generic code system for the neutronics analysis of neutron sources has been created by loosely integrating existing components with new developments: the data processing code NJOY, the Monte Carlo neutron transport code MCNP, and the activation code ALARA were supplemented by a damage data processing program, damChar, and integrated with a number of flexible and extensible modules for the Perl scripting language. Specific advances were required to apply this code system to IFMIF. Based on the ENDF-6 data format requirements of this system, new data evaluations have been implemented for neutron transport and activation. Extensive analysis of the Li(d,xn) reaction has led to a new MCNP source function module, McDeLi, based on physical reaction models and capable of accurate and flexible modelling of the IFMIF neutron source term. In-depth analyses of the neutron flux spectra and spatial distribution throughout the high flux test region permitted a basic validation of the tools and data. The understanding of the features of the neutron flux provided a foundation for the analyses of the other neutron responses. (orig./DGE)

  16. Investigation of neutron guide systems: Analysis techniques and an experiment

    International Nuclear Information System (INIS)

    Kudryashev, V.A.

    1991-01-01

    This paper discusses the in-depth study of the specific characteristics of the physical processes associated with the total reflection of neutrons from actual reflective coatings; the study of the process whereby neutrons transit a non-ideal image channel, with allowance for the aforementioned characteristics; and the development of physical criteria and techniques for calculating the optimum geometry of a neutron guide source system based on the laws found to govern this transit process.

  17. Fast neutron and gamma-ray transmission technique in mixed samples. MCNP calculations

    International Nuclear Information System (INIS)

    Perez, N.; Padron, I.

    2001-01-01

    In this paper the moisture content of sand and the sulfur content of toluene have been determined using the simultaneous fast neutron/gamma transmission technique (FNGT). Monte Carlo calculations show that it is possible to apply this technique with accelerator-based and isotopic neutron sources in on-line analysis for product quality control, specifically in the building materials and petroleum industries. Particles from a 14 MeV neutron generator and from an Am-Be neutron source were used. Optimal system parameters such as efficiency, detection time, hazards and costs were estimated in order to compare both neutron sources.
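
    Transmission techniques of this kind rest on the Beer-Lambert attenuation law. Below is a minimal sketch of that step; the counts, attenuation coefficient and thickness are assumed values for illustration only.

```python
# Beer-Lambert step behind a transmission measurement: I = I0 * exp(-mu_m * rho * x),
# so the areal density follows from the measured count ratio (all values assumed).
import numpy as np

I0, I = 1.0e5, 6.1e4   # incident and transmitted counts (illustrative)
mu_m = 0.035           # mass attenuation coefficient, cm^2/g (assumed)
x = 10.0               # sample thickness, cm (assumed)

areal_density = -np.log(I / I0) / mu_m   # rho * x, in g/cm^2
print(f"bulk density = {areal_density / x:.3f} g/cm^3")
```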

  18. Polycyclic aromatic hydrocarbons in urban air : concentration levels and patterns and source analysis in Nairobi, Kenya

    Energy Technology Data Exchange (ETDEWEB)

    Muthini, M.; Yoshimichi, H.; Yutaka, K.; Shigeki, M. [Yokohama National Univ., Yokohama (Japan). Graduate School of Environment and Information Sciences

    2005-07-01

    Polycyclic aromatic hydrocarbons (PAHs) present in the environment are often the result of incomplete combustion processes. This paper reported concentration levels and patterns of high molecular weight PAHs in Nairobi, Kenya. Daily air samples for 30 different PAHs were collected at residential, industrial and business sites within the city. Samples were then extracted using deuterated PAH with an automated Soxhlet device. Gas chromatography and mass spectrometry (GC-MS) with a capillary column was used to analyze the extracts using a selected ion monitoring (SIM) mode. Statistical analyses were then performed. PAH concentration levels were reported for average, median, standard deviation, range, and Pearson's correlation coefficients. Data were then analyzed for sources using a principal component analysis (PCA) technique and isomer ratio analysis. Nonparametric testing was then conducted to detect inherent differences in PAH concentration data obtained from the different sites. Results showed that pyrene was the most abundant PAH. Carcinogenic PAHs were higher in high-traffic areas. The correlation coefficient between coronene and benzo(ghi)pyrene was high. The PAH isomer ratio analysis demonstrated that PAHs in Nairobi are the product of traffic emissions and oil combustion. Results also showed that PAH profiles were not well separated. It was concluded that source distinction methods must be improved in order to better evaluate PAH emissions in the city. 9 refs., 2 tabs., 1 fig.
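
    Isomer-ratio diagnostics of the kind applied here reduce to simple arithmetic on measured concentrations. The sketch below uses two widely cited ratio pairs; the concentrations are invented, and the interpretive thresholds are commonly used literature values, not figures from this paper.

```python
# Sketch of PAH isomer-ratio source diagnostics. Concentrations (ng/m3) are
# illustrative; thresholds follow commonly cited literature diagnostics
# (an assumption, not taken from this paper).
conc = {"fluoranthene": 2.1, "pyrene": 3.4,
        "indeno(1,2,3-cd)pyrene": 0.9, "benzo(ghi)perylene": 1.8}

flt_pyr = conc["fluoranthene"] / (conc["fluoranthene"] + conc["pyrene"])
icdp_bghip = conc["indeno(1,2,3-cd)pyrene"] / (
    conc["indeno(1,2,3-cd)pyrene"] + conc["benzo(ghi)perylene"])

# Flt/(Flt+Pyr) < 0.4 and IcdP/(IcdP+BghiP) < 0.2 point toward petroleum;
# higher values point toward combustion (e.g. traffic, oil burning).
print(f"Flt/(Flt+Pyr) = {flt_pyr:.2f}, IcdP/(IcdP+BghiP) = {icdp_bghip:.2f}")
```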

  19. Activation analysis of stainless steel flux monitors using 252Cf neutron sources

    International Nuclear Information System (INIS)

    Williams, J.G.; Newton, T.H. Jr.; Cogburn, C.O.

    1984-01-01

    Activation analysis was performed on stainless steel beads from a chain which is used in reactor pressure vessel surveillance experiments at the Arkansas Power and Light Company reactors. The beads allow monitoring of two fast and three thermal neutron induced reactions: 58Ni(n,p)58Co, 54Fe(n,p)54Mn, 58Fe(n,γ)59Fe, 59Co(n,γ)60Co and 50Cr(n,γ)51Cr. The analysis was performed using 12 beads from various positions along 5 different batches of chain, together with standard materials, in an H2O moderator tank using two intense californium sources with a total neutron emission rate of 3.97 × 10^10 n/s. Semiconductor gamma spectrometers were used to count the products of the above reactions in the specimens. The percentages by weight of iron, chromium and cobalt in the beads were found to be 62.1%, 20.2% and 0.120%, respectively. The excellent uniformity found in the bead compositions demonstrates the reproducibility of the experimental techniques and considerably enhances the value of the beads as neutron flux monitors.

  20. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H S; Kim, J H; Lee, J C; Choi, Y R; Moon, S S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and a few that are suitable for our robot system were selected: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems that distinguish them from other systems and that are important to the analysis. 3. A survey of the nuclear environmental factors that affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.

  1. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    International Nuclear Information System (INIS)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S.

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. A survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and a few that are suitable for our robot system were selected: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, the combinational method, and the simulation method. 2. A survey of the characteristics of robot systems that distinguish them from other systems and that are important to the analysis. 3. A survey of the nuclear environmental factors that affect the reliability and safety analysis of robot systems. 4. A collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.
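
    Among the surveyed methods, the reliability block diagram and combinational method amount to multiplying component reliabilities. A minimal sketch with invented reliability values follows; the camera/manipulator/controller decomposition is hypothetical, not taken from the report.

```python
# Reliability-block sketch of the kind surveyed above (values illustrative).
def series(*r):    # all components must work
    p = 1.0
    for ri in r:
        p *= ri
    return p

def parallel(*r):  # at least one component must work
    q = 1.0
    for ri in r:
        q *= (1.0 - ri)
    return 1.0 - q

# Camera and manipulator in series, with a redundant controller pair.
print(f"system reliability = {series(0.95, 0.90, parallel(0.85, 0.85)):.4f}")
```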

  2. Performance analysis of a full-field and full-range swept-source OCT system

    Science.gov (United States)

    Krauter, J.; Boettcher, T.; Körner, K.; Gronle, M.; Osten, W.; Passilly, N.; Froehly, L.; Perrin, S.; Gorecki, C.

    2015-09-01

    In recent years, optical coherence tomography (OCT) has gained importance in medical disciplines like ophthalmology, due to its noninvasive optical imaging with micrometer resolution and short measurement time. It enables, e.g., the measurement and visualization of the depth structure of the retina. In other medical disciplines like dermatology, histopathological analysis is still the gold standard for skin cancer diagnosis. The EU-funded project VIAMOS (Vertically Integrated Array-type Mirau-based OCT System) proposes a new type of OCT system combined with micro-technologies to provide a hand-held, low-cost and miniaturized OCT system. The concept is a combination of full-field and full-range swept-source OCT (SS-OCT) detection in a multi-channel sensor based on a micro-optical Mirau-interferometer array, which is fabricated by means of wafer fabrication. This paper presents the study of an experimental proof-of-concept OCT system as a one-channel sensor with bulk optics. This sensor is of the Linnik-interferometer type with optical parameters similar to those of the Mirau-interferometer array. A commercial wavelength-tunable light source with a center wavelength of 845 nm and 50 nm spectral bandwidth is used with a camera for parallel OCT A-scan detection. In addition, the reference microscope objective lens of the Linnik interferometer is mounted on a piezo-actuated phase-shifter. Phase-shifting interferometry (PSI) techniques are applied to resolve the complex conjugate artifact and consequently contribute to an increase of image quality and depth range. A suppression ratio of the complex conjugate term of 36 dB is shown, and a system sensitivity greater than 96 dB could be measured.

  3. Comparative study of two drying techniques used in radioactive source preparation: Freeze-drying and evaporation using hot dry nitrogen jets

    International Nuclear Information System (INIS)

    Branger, T.; Bobin, C.; Iroulart, M.-G.; Lepy, M.-C.; Le Garreres, I.; Morelli, S.; Lacour, D.; Plagnard, J.

    2008-01-01

    Quantitative solid sources are used widely in the field of radionuclide metrology. With the aim of improving the detection efficiency for electrons and X-rays, a comparative study of two source drying techniques has been undertaken at LNE-Laboratoire National Henri Becquerel (LNE-LNHB, France). In this paper, freeze-drying using commercial equipment is compared with a drying system using hot jets of nitrogen developed at the Institute for Reference Materials and Measurements (IRMM, Belgium). In order to characterize the influence of self-absorption, the detection efficiencies for 51Cr sources have been measured by coincidence counting and photon spectrometry.

  4. Source Evaluation and Trace Metal Contamination in Benthic Sediments from Equatorial Ecosystems Using Multivariate Statistical Techniques.

    Directory of Open Access Journals (Sweden)

    Nsikak U Benson

    Trace metal (Cd, Cr, Cu, Ni and Pb) concentrations in benthic sediments were analyzed through a multi-step fractionation scheme to assess the levels and sources of contamination in estuarine, riverine and freshwater ecosystems in the Niger Delta (Nigeria). The degree of contamination was assessed using individual contamination factors (ICF) and the global contamination factor (GCF). Multivariate statistical approaches, including principal component analysis (PCA), cluster analysis and correlation tests, were employed to evaluate the interrelationships and associated sources of contamination. The spatial distribution of metal concentrations followed the pattern Pb>Cu>Cr>Cd>Ni. The ecological risk index by ICF showed significant potential mobility and bioavailability for Cu, Cr and Ni. The ICF contamination trend in the benthic sediments at all studied sites was Cu>Cr>Ni>Cd>Pb. The principal component and agglomerative clustering analyses indicate that trace metal contamination in the ecosystems was influenced by multiple pollution sources.
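
    The contamination factors used here can be sketched numerically. A common definition (assumed below, since the abstract does not spell it out) takes ICF as the sum of the non-residual fractions divided by the residual fraction, and GCF as the sum of ICFs over all metals at a site; the concentrations are invented.

```python
# Sketch of individual (ICF) and global (GCF) contamination factors under
# the commonly used definitions (an assumption; values are illustrative).
fractions = {   # mg/kg per sequential-extraction step
    "Cd": {"nonresidual": [0.1, 0.3, 0.2, 0.4], "residual": 0.5},
    "Pb": {"nonresidual": [1.0, 2.5, 3.0, 1.5], "residual": 12.0},
    "Cu": {"nonresidual": [4.0, 6.0, 5.5, 3.5], "residual": 4.0},
}

icf = {m: sum(d["nonresidual"]) / d["residual"] for m, d in fractions.items()}
gcf = sum(icf.values())
print(icf, f"GCF = {gcf:.2f}")   # higher ICF -> greater potential mobility
```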

  5. Pteros: fast and easy to use open-source C++ library for molecular analysis.

    Science.gov (United States)

    Yesylevskyy, Semen O

    2012-07-15

    An open-source library, Pteros, for molecular modeling and analysis of molecular dynamics trajectories in the C++ programming language is introduced. Pteros provides a number of routine analysis operations, ranging from reading and writing trajectory files and geometry transformations to structural alignment and computation of nonbonded interaction energies. The library features asynchronous trajectory reading and parallel execution of several analysis routines, which greatly simplifies the development of computationally intensive trajectory analysis algorithms. The Pteros programming interface is simple and intuitive, while the source code is well documented and easily extendible. Pteros is available for free under the open-source Artistic License from http://sourceforge.net/projects/pteros/.

  6. HPAT: A nondestructive analysis technique for plutonium and uranium solutions

    International Nuclear Information System (INIS)

    Aparo, M.; Mattia, B.; Zeppa, P.; Pagliai, V.; Frazzoli, F.V.

    1989-03-01

    Two experimental approaches for the nondestructive characterization of mixed solutions of plutonium and uranium, developed at ENEA - C.R.E. Casaccia with the goal of measuring low plutonium concentrations (<50 g/l) even in the presence of high uranium content, are described below. Both methods are referred to as HPAT (Hybrid Passive-Active Technique), since they rely on the measurement of plutonium spontaneous emission in the LX-ray energy region as well as the transmission of KX photons from the fluorescence induced by a radioisotopic source on a suitable target. Experimental campaigns for the characterization of both techniques have been carried out at the EUREX Plant Laboratories (C.R.E. Saluggia) and at the Plutonium Plant Laboratories (C.R.E. Casaccia). Experimental results and theoretical values of the errors are reported. (author)

  7. Techniques for enhancing the performance of high charge state ECR ion sources

    International Nuclear Information System (INIS)

    Xie, Z.Q.

    1999-01-01

    The Electron Cyclotron Resonance ion source (ECRIS), which produces singly to highly charged ions, is widely used in heavy ion accelerators and is finding applications in industry. It has progressed significantly in recent years thanks to a few techniques, such as multiple-frequency plasma heating, higher mirror magnetic fields and a better cold electron donor. These techniques greatly enhance the production of highly charged ions. More than 1 emA of He2+ and O6+; hundreds of eμA of O7+, Ne8+ and Ar12+; more than 100 eμA of intermediate heavy ions with charge states up to Ne9+, Ar13+, Ca13+, Fe13+, Co14+ and Kr18+; and tens of eμA of heavy ions with charge states up to Xe28+, Au35+, Bi34+ and U34+ were produced in cw mode operation. At an intensity of about 1 eμA, the charge states for the heavy ions increased up to Xe36+, Au46+, Bi47+ and U48+. More than an order of magnitude enhancement of fully stripped argon ions was achieved (I ≥ 0.1 eμA). Higher charge state ions, up to Kr35+, Xe46+ and U64+, were produced at low intensities for the first time from an ECRIS.

  8. Nuclear analytical techniques and their application to environmental samples

    International Nuclear Information System (INIS)

    Lieser, K.H.

    1986-01-01

    A survey is given on nuclear analytical techniques and their application to environmental samples. Measurement of the inherent radioactivity of elements or radionuclides allows determination of natural radioelements (e.g. Ra), man-made radioelements (e.g. Pu) and radionuclides in the environment. Activation analysis, in particular instrumental neutron activation analysis, is a very reliable and sensitive method for determination of a great number of trace elements in environmental samples, because the most abundant main constituents are not activated. Tracer techniques are very useful for studies of the behaviour and of chemical reactions of trace elements and compounds in the environment. Radioactive sources are mainly applied for excitation of characteristic X-rays (X-ray fluorescence analysis). (author)

  9. A Comparison of seismic instrument noise coherence analysis techniques

    Science.gov (United States)

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
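
    The two-sensor coherence idea can be sketched directly with scipy. Below, two simulated co-located records share a common signal but have independent self-noise; the incoherent part of one sensor's PSD serves as its self-noise estimate. The data and the equal-response assumption are illustrative, not the paper's test configurations.

```python
# Two-sensor incoherent-noise sketch in the spirit of the coherence methods
# compared above (simulated data, not the paper's records).
import numpy as np
from scipy import signal

fs = 100.0                       # sample rate (Hz)
t = np.arange(0, 600, 1 / fs)
common = np.cumsum(np.random.randn(t.size)) * 0.01   # shared ground signal
x = common + 0.05 * np.random.randn(t.size)          # sensor 1 = signal + self-noise
y = common + 0.05 * np.random.randn(t.size)          # sensor 2, independent noise

f, Pxx = signal.welch(x, fs=fs, nperseg=4096)
f, Cxy = signal.coherence(x, y, fs=fs, nperseg=4096)

# Incoherent (self-)noise estimate for sensor 1: the part of its PSD that
# is not coherent with the co-located reference sensor.
Nxx = Pxx * (1.0 - Cxy)
print(Nxx[:5])
```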

  10. GEOSPATIAL ANALYSIS OF ATMOSPHERIC HAZE EFFECT BY SOURCE AND SINK LANDSCAPE

    Directory of Open Access Journals (Sweden)

    T. Yu

    2017-09-01

    Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of source and sink in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into a number of square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the analysis of the correlation between landscape indices and AOD. Next, to make the following analysis more efficient, the indices selected before are screened using the correlation coefficients between them. Finally, due to the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression model are used to analyze the atmospheric haze effect of source and sink landscape at the global and local level. The results show that the source landscape of atmospheric haze pollution is the building class, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of source and sink landscape. Comparing these models, the fit of SLM, SEM and GWR is significantly better than that of the OLS model, and the SLM model is superior to the SEM model in this paper. Although the GWR model fits less well than the SLM, it expresses more clearly how the degree of influence of the factors on atmospheric haze varies geographically. From the analysis results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could cut down aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze.

  11. Geospatial Analysis of Atmospheric Haze Effect by Source and Sink Landscape

    Science.gov (United States)

    Yu, T.; Xu, K.; Yuan, Z.

    2017-09-01

    Based on a geospatial analysis model, this paper analyzes the relationship between the landscape patterns of source and sink in urban areas and atmospheric haze pollution. Firstly, the classification result and aerosol optical thickness (AOD) of Wuhan are divided into a number of square grids with a side length of 6 km, and the category-level landscape indices (PLAND, PD, COHESION, LPI, FRAC_MN) and AOD of each grid are calculated. Then the source and sink landscapes of atmospheric haze pollution are selected based on the analysis of the correlation between landscape indices and AOD. Next, to make the following analysis more efficient, the indices selected before are screened using the correlation coefficients between them. Finally, due to the spatial dependency and spatial heterogeneity of the data used in this paper, a spatial autoregressive model and a geographically weighted regression model are used to analyze the atmospheric haze effect of source and sink landscape at the global and local level. The results show that the source landscape of atmospheric haze pollution is the building class, and the sink landscapes are shrub and woodland. PLAND, PD and COHESION are suitable for describing the atmospheric haze effect of source and sink landscape. Comparing these models, the fit of SLM, SEM and GWR is significantly better than that of the OLS model, and the SLM model is superior to the SEM model in this paper. Although the GWR model fits less well than the SLM, it expresses more clearly how the degree of influence of the factors on atmospheric haze varies geographically. From the analysis results of these models, the following conclusions can be drawn: reducing the proportion of source landscape area and increasing its degree of fragmentation could cut down aerosol optical thickness; and distributing the source and sink landscapes evenly and interspersedly could effectively reduce aerosol optical thickness, which represents atmospheric haze.
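
    As a baseline for the spatial models compared above, the non-spatial OLS fit can be sketched as follows; the grid-cell data are synthetic, and a real analysis would move on to SLM/SEM (e.g., via spreg) and GWR (e.g., via mgwr) to handle spatial dependency.

```python
# Baseline OLS sketch relating grid-cell AOD to landscape indices, the
# starting point against which SLM/SEM/GWR are compared (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200                                    # 6 km x 6 km grid cells
PLAND = rng.uniform(0, 80, n)              # % built-up (source landscape)
PD = rng.uniform(0.1, 5.0, n)              # patch density
COHESION = rng.uniform(80, 100, n)
aod = 0.004 * PLAND - 0.02 * PD - 0.003 * COHESION + 0.6 + rng.normal(0, 0.05, n)

X = sm.add_constant(np.column_stack([PLAND, PD, COHESION]))
ols = sm.OLS(aod, X).fit()
print(ols.params)   # sign of the PLAND coefficient: more built-up, higher AOD
```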

  12. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  13. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  14. Study of uranium mineralization in rock samples from marwat range bannu basin by fission track analysis technique

    International Nuclear Information System (INIS)

    Qureshi, A.Z.; Ullah, K.; Ullah, N.; Akram, M.

    2004-07-01

    The Geophysics Division, Atomic Energy Minerals Centre (AEMC), Lahore has planned a uranium exploration program in the Marwat Range, Bannu Basin. In this connection, 30 thin sections of rock samples, collected from four areas of the Marwat Range (Darra Tang, Simukili, Karkanwal and Sheikhillah) and one from the Salt Range, were provided to the Nuclear Geology Group of the Physics Research Division, PINSTECH, for the study of the nature and mechanism of uranium mineralization. These studies are aimed at helping to design a uranium exploration strategy by providing the loci of uranium sources in the Marwat and Salt Ranges. The samples have been studied using the fission track analysis technique. (author)

  15. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of conditions of a major spill by pouring crude oil on the cells from perforated cans and the in-situ bioremediation of the polluted soils using the techniques that consisted in the manipulation of different variables within the soil environment. The analysis of soil characteristics after a ...

  16. A one-step technique to prepare aligned arrays of carbon nanotubes

    Energy Technology Data Exchange (ETDEWEB)

    Mahanandia, Pitamber [Department of Physics, Indian Institute of Science, Bangalore 560012 (India); Nanda, Karuna Kar [Materials Research Centre, Indian Institute of Science, Bangalore 560012 (India)], E-mail: pitam@physics.iisc.ernet.in

    2008-04-16

    A simple and effective pyrolysis technique has been developed to synthesize aligned arrays of multi-walled carbon nanotubes (MWCNTs) without using any carrier gas, in a single-stage furnace at 700 °C. This technique eliminates nearly all of the complex and expensive machinery associated with other extensively used methods for the preparation of CNTs, such as chemical vapour deposition (CVD) and pyrolysis. Carbon source materials such as xylene, cyclohexane, camphor, hexane, toluene, pyridine and benzene have been pyrolyzed separately with the catalyst source material ferrocene to obtain aligned arrays of MWCNTs. The synthesized CNTs have been characterized by scanning electron microscopy (SEM), X-ray diffraction (XRD), transmission electron microscopy (TEM), thermogravimetric analysis (TGA) and Raman spectroscopy. In this technique, the tedious and time-consuming preparation of metal catalysts and the continuous feeding of carrier gas containing the carbon source material can be avoided. This method is a single-step process in which few parameters need to be monitored in order to prepare aligned MWCNTs. For the production of CNTs, the technique has great advantages such as low cost and easy operation.

  17. The Statistical Analysis Techniques to Support the NGNP Fuel Performance Experiments

    International Nuclear Information System (INIS)

    Pham, Bihn T.; Einerson, Jeffrey J.

    2010-01-01

    This paper describes the development and application of statistical analysis techniques to support the AGR experimental program on NGNP fuel performance. The experiments conducted in the Idaho National Laboratory's Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel/graphite temperature) is regulated by the He-Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the SAS-based NGNP Data Management and Analysis System (NDMAS) for automated processing and qualification of the AGR measured data. The NDMAS also stores daily neutronic (power) and thermal (heat transfer) code simulation results along with the measurement data, allowing for their combined use and comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the target quantity (fuel temperature) within a given range.

  18. The statistical analysis techniques to support the NGNP fuel performance experiments

    Energy Technology Data Exchange (ETDEWEB)

    Pham, Binh T., E-mail: Binh.Pham@inl.gov; Einerson, Jeffrey J.

    2013-10-15

    This paper describes the development and application of statistical analysis techniques to support the Advanced Gas Reactor (AGR) experimental program on Next Generation Nuclear Plant (NGNP) fuel performance. The experiments conducted in the Idaho National Laboratory’s Advanced Test Reactor employ fuel compacts placed in a graphite cylinder shrouded by a steel capsule. The tests are instrumented with thermocouples embedded in graphite blocks and the target quantity (fuel temperature) is regulated by the He–Ne gas mixture that fills the gap volume. Three techniques for statistical analysis, namely control charting, correlation analysis, and regression analysis, are implemented in the NGNP Data Management and Analysis System for automated processing and qualification of the AGR measured data. The neutronic and thermal code simulation results are used for comparative scrutiny. The ultimate objective of this work includes (a) a multi-faceted system for data monitoring and data accuracy testing, (b) identification of possible modes of diagnostics deterioration and changes in experimental conditions, (c) qualification of data for use in code validation, and (d) identification and use of data trends to support effective control of test conditions with respect to the test target. Analysis results and examples given in the paper show the three statistical analysis techniques providing a complementary capability to warn of thermocouple failures. It also suggests that the regression analysis models relating calculated fuel temperatures and thermocouple readings can enable online regulation of experimental parameters (i.e. gas mixture content), to effectively maintain the fuel temperature within a given range.
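
    Of the three techniques, control charting is the simplest to sketch: learn limits from an in-control window, then flag excursions. The readings below are invented, not AGR data.

```python
# Control-charting sketch of the thermocouple-monitoring idea described
# above: flag readings outside mean +/- 3 sigma learned from a stable window.
import numpy as np

readings = np.array([1012, 1015, 1011, 1013, 1014, 1012, 1016, 1013,
                     1015, 1012, 1014, 1013, 1080, 1012, 1013])  # deg C, illustrative

baseline = readings[:10]                  # assumed in-control period
mu, sigma = baseline.mean(), baseline.std(ddof=1)
ucl, lcl = mu + 3 * sigma, mu - 3 * sigma

for i, r in enumerate(readings):
    if not (lcl <= r <= ucl):
        print(f"sample {i}: {r} outside [{lcl:.1f}, {ucl:.1f}] -> possible TC failure")
```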

  19. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers were used on the surface of the structures; these can be removed without damaging existing structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used in any type of structure but is particularly suitable when it is necessary not to damage the surface of structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) have allowed validation of the procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129

  20. Sources and speciation of heavy metals in municipal solid waste (MSW) and its effect on the separation technique

    Energy Technology Data Exchange (ETDEWEB)

    Biollaz, S; Ludwig, Ch; Stucki, S [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    A literature search was carried out to determine sources and speciation of heavy metals in MSW. A combination of thermal and mechanical separation techniques is necessary to achieve the required high degrees of metal separation. Metallic goods should be separated mechanically, chemically bound heavy metals by a thermal process. (author) 1 fig., 1 tab., 6 refs.

  1. Speckle noise reduction technique for Lidar echo signal based on self-adaptive pulse-matching independent component analysis

    Science.gov (United States)

    Xu, Fan; Wang, Jiaxing; Zhu, Daiyin; Tu, Qi

    2018-04-01

    Speckle noise has always been a particularly tricky problem in improving the ranging capability and accuracy of Lidar systems, especially in harsh environments. Effective speckle de-noising techniques are currently scarce and should be further developed. In this study, a speckle noise reduction technique is proposed based on independent component analysis (ICA). Since the shape of the laser pulse itself normally changes little, the authors employed the laser source as a reference pulse and executed the ICA decomposition to find the optimal matching position. In order to make the algorithm self-adaptive, the local mean square error (MSE) is defined as a criterion for evaluating the iteration results. The experimental results demonstrate that the self-adaptive pulse-matching ICA (PM-ICA) method can effectively decrease the speckle noise and recover the useful Lidar echo signal component with high quality. In particular, the proposed method achieves a 4 dB greater improvement in signal-to-noise ratio (SNR) than a traditional homomorphic wavelet method.
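
    A highly simplified stand-in for the pulse-matching idea can be written with scikit-learn's FastICA: decompose an observed channel together with the reference pulse and keep the component best correlated with the pulse shape. The signals are simulated and additive for simplicity; the paper's PM-ICA and its local-MSE criterion are more elaborate.

```python
# Simplified ICA sketch: separate an echo component from noise using the
# laser source pulse as a reference channel (simulated, additive noise).
import numpy as np
from sklearn.decomposition import FastICA

t = np.linspace(0, 1, 2000)
pulse = np.exp(-((t - 0.5) ** 2) / 0.001)          # reference laser pulse shape
echo = 0.6 * np.exp(-((t - 0.62) ** 2) / 0.001)    # delayed, attenuated echo
noise = 0.3 * np.random.randn(t.size)              # speckle stand-in, simplified

obs = np.column_stack([echo + noise, pulse])       # observed channel + reference
sources = FastICA(n_components=2, random_state=0).fit_transform(obs)

# Keep the component that best matches the reference pulse shape.
best = max(range(2), key=lambda k: abs(np.corrcoef(sources[:, k], pulse)[0, 1]))
print("recovered component index:", best)
```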

  2. Study on development and actual application of scientific crime detection technique using small scale neutron radiation source

    International Nuclear Information System (INIS)

    Suzuki, Yasuhiro; Kishi, Toru; Tachikawa, Noboru; Ishikawa, Isamu.

    1997-01-01

    PGA (prompt γ-ray analysis) is a method that analyzes the γ-rays emitted from the atomic nuclei of elements in a specimen immediately after neutron irradiation (within 10^-14 s). Because it uses neutrons, with their excellent transmission, as the exciting source, this method can be used to inspect materials inside closed containers non-destructively, and it can also detect light elements such as boron and nitrogen that are difficult to measure by other non-destructive analyses. In particular, it has been found that this method can detect the high concentrations of nitrogen, chlorine and other elements that are characteristic of explosives. However, as there are a number of limitations at a nuclear reactor site, development of an analytical apparatus based on a small-scale neutron source was begun first. In this fiscal year, analysis of light elements such as nitrogen and chlorine using PGA was attempted with 252Cf, the simplest neutron source to operate. As the 252Cf neutron flux is considerably lower than that of a nuclear reactor, the analytical sensitivity was also investigated. (G.K.)

  3. Nuclear activation techniques in the life sciences

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1978-08-15

    The analysis of the elemental composition of biological materials is presently undertaken on a large scale in many countries around the world. One recent estimate puts the number of such analyses at six thousand million single-element determinations per year, of which about sixteen million are for the so-called trace elements. Since many of these elements are known to play an important role in relation to health and disease, there is considerable interest in learning more about the ways in which they function in living organisms. Nuclear activation techniques, generally referred to collectively as 'activation analysis', constitute an important group of methods for the analysis of the elemental composition of biological materials. Generally they rely on the use of a research nuclear reactor as a source of neutrons for bombarding small samples of biological material, followed by a measurement of the induced radioactivity to provide an estimate of the concentrations of elements. Other methods of activation with Bremsstrahlung and charged particles may also be used, and have their own special applications. These methods of in vitro analysis are particularly suitable for the study of trace elements. Another important group of methods makes use of neutrons from isotopic neutron sources or neutron generators to activate the whole body, or a part of the body, of a living patient. They are generally used for the study of major elements such as Ca, Na and N. All these techniques have previously been the subject of two symposia organised by the IAEA, in 1967 and 1972. The present meeting was held to review some of the more recent developments in this field and also to provide a viewpoint on the current status of nuclear activation techniques vis-a-vis other competing non-nuclear methods of analysis.

  4. Control charts technique - a tool to data analysis for chemical experiments

    International Nuclear Information System (INIS)

    Yadav, M.B.; Venugopal, V.

    1999-01-01

    A procedure using the control charts technique has been developed to analyse data from a chemical experiment conducted to assign a value to the uranium content in Rb2U(SO4)3. A value of (34.164 ± 0.031)% has been assigned, against (34.167 ± 0.042)% already assigned by the analysis of variance (ANOVA) technique. These values do not differ significantly. Merits and demerits of the two techniques are discussed. (author)

  5. Ambiguity of source location in acoustic emission technique

    International Nuclear Information System (INIS)

    Barat, P.; Mukherjee, P.; Kalyanasundaram, P.; Raj, B.

    1996-01-01

    The location of an acoustic emission (AE) source in a plane is determined from the differences in the arrival times of the AE signal at three or more sensors placed on the plane. The detected location may not be unique in all cases. In this paper, the condition for an unambiguous solution for the location of the source has been deduced mathematically in terms of the arrival times of the AE signal, the coordinates of the three sensors and the acoustic velocity. (author)
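
    The geometry described here reduces to intersecting hyperbolas defined by arrival-time differences. The sketch below solves that system numerically for a synthetic case; as the abstract notes, a poor initial guess can land on an ambiguous mirror solution. All values are illustrative.

```python
# Planar arrival-time-difference source location: three sensors, known wave
# speed v, solved by nonlinear least squares (synthetic, illustrative values).
import numpy as np
from scipy.optimize import least_squares

sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])  # sensor coordinates (m)
v = 3000.0                                                  # wave speed (m/s)
src_true = np.array([0.3, 0.55])
t_arr = np.linalg.norm(sensors - src_true, axis=1) / v      # synthetic arrivals

def residuals(p):
    d = np.linalg.norm(sensors - p, axis=1)
    # arrival-time differences relative to sensor 0
    return (d[1:] - d[0]) / v - (t_arr[1:] - t_arr[0])

sol = least_squares(residuals, x0=np.array([0.5, 0.5]))
print(sol.x)   # a poor initial guess may converge to a mirror solution
```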

  6. Meeting on risk and monitoring analysis techniques for food safety - RLA/5/060/ARCAL Project (ARCAL CXXVIII): sampling plans and introduction to chemical risk assessment in food innocuousness

    International Nuclear Information System (INIS)

    2013-03-01

    Some of the Latin American countries participating in the meeting, such as Bolivia, Colombia, Uruguay and Venezuela, gave expositions about risk analysis and monitoring techniques in food safety in their countries. The meeting was carried out with the aim of studying the components of risk analysis: food safety, evaluation of chemical hazards, toxicity, exposure, changes of paradigm in the global food system, data sources, animal and in vitro studies, sensitivity analysis, and health risk assessment.

  7. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  8. Your Personal Analysis Toolkit - An Open Source Solution

    Science.gov (United States)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  9. Java Source Code Analysis for API Migration to Embedded Systems

    Energy Technology Data Exchange (ETDEWEB)

    Winter, Victor [Univ. of Nebraska, Omaha, NE (United States); McCoy, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guerrero, Jonathan [Univ. of Nebraska, Omaha, NE (United States); Reinke, Carl Werner [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Perry, James Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-02-01

    Embedded systems form an integral part of our technological infrastructure and oftentimes play a complex and critical role within larger systems. From the perspective of reliability, security, and safety, strong arguments can be made favoring the use of Java over C in such systems. In part, this argument is based on the assumption that suitable subsets of Java's APIs and extension libraries are available to embedded software developers. In practice, a number of Java-based embedded processors do not support the full features of the JVM. For such processors, source code migration is a mechanism by which key abstractions offered by APIs and extension libraries can be made available to embedded software developers. The analysis required for Java source code-level library migration is based on the ability to correctly resolve element references to their corresponding element declarations. A key challenge in this setting is how to perform analysis for incomplete source-code bases (e.g., subsets of libraries) from which types and packages have been omitted. This article formalizes an approach that can be used to extend code bases targeted for migration in such a manner that the threats associated with the analysis of incomplete code bases are eliminated.

  10. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    Energy Technology Data Exchange (ETDEWEB)

    Klumpp, John [Colorado State University, Department of Environmental and Radiological Health Sciences, Molecular and Radiological Biosciences Building, Colorado State University, Fort Collins, Colorado, 80523 (United States)

    2013-07-01

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
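
    A simplified, empirical stand-in for the proposed 'learning mode / detection mode' split is sketched below: accumulate background counts, then score a new measurement by its tail probability under that learned distribution. The full Bayesian, multi-channel, multi-detector machinery the abstract describes is beyond a few lines; all numbers are invented.

```python
# Empirical sketch of learning a background count distribution and flagging
# measurements with improbably high counts (illustrative, single channel).
import numpy as np

rng = np.random.default_rng(1)
background = rng.poisson(lam=12.0, size=5000)   # 'learning mode': counts/interval

def tail_probability(count, bg):
    # empirical probability of seeing a count at least this large in background
    return (bg >= count).mean()

# 'detection mode': compare new measurements against the learned distribution
for measurement in (14, 19, 27):
    p = tail_probability(measurement, background)
    flag = "ALARM" if p < 0.001 else "background"
    print(f"count={measurement}: p={p:.4f} -> {flag}")
```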

  11. A nuclear source term analysis for spacecraft power systems

    International Nuclear Information System (INIS)

    McCulloch, W.H.

    1998-01-01

    All US space missions involving on board nuclear material must be approved by the Office of the President. To be approved the mission and the hardware systems must undergo evaluations of the associated nuclear health and safety risk. One part of these evaluations is the characterization of the source terms, i.e., the estimate of the amount, physical form, and location of nuclear material, which might be released into the environment in the event of credible accidents. This paper presents a brief overview of the source term analysis by the Interagency Nuclear Safety Review Panel for the NASA Cassini Space Mission launched in October 1997. Included is a description of the Energy Interaction Model, an innovative approach to the analysis of potential releases from high velocity impacts resulting from launch aborts and reentries

  12. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss. However, predicting grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. The NDVI values can be derived from the remote sensing data and used in grasshopper forecasting. Hyper-spectral remote sensing, which can monitor grasshoppers more exactly, has advantages in measuring the degree of damage and classifying areas damaged by grasshoppers, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyper-spectra and leaf area index (LAI), and to indicate the intensity of grasshopper damage. The technology of near-infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring the humidity and nutrients of soil, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that spectral analysis techniques could be used as a quick and exact tool for monitoring and forecasting grasshopper infestations, and will become an important means in such research for their advantages in determining spatial orientation, information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring …
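
    The NDVI quantity mentioned above is a one-line computation per pixel, sketched below on invented two-band reflectance values.

```python
# NDVI sketch for the remote-sensing monitoring discussed above:
# NDVI = (NIR - Red) / (NIR + Red), computed per pixel from two bands.
import numpy as np

nir = np.array([[0.52, 0.60], [0.31, 0.45]])   # near-infrared reflectance (invented)
red = np.array([[0.10, 0.12], [0.20, 0.15]])   # red-band reflectance (invented)

ndvi = (nir - red) / (nir + red)
print(ndvi)   # low or declining NDVI can flag vegetation stress in habitat grids
```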

  13. A Method for the Analysis of Information Use in Source-Based Writing

    Science.gov (United States)

    Sormunen, Eero; Heinstrom, Jannica; Romu, Leena; Turunen, Risto

    2012-01-01

    Introduction: Past research on source-based writing assignments has hesitated to scrutinize how students actually use information afforded by sources. This paper introduces a method for the analysis of text transformations from sources to texts composed. The method is aimed to serve scholars in building a more detailed understanding of how…

  14. Comparison of residual NAPL source removal techniques in 3D metric scale experiments

    Science.gov (United States)

    Atteia, O.; Jousse, F.; Cohen, G.; Höhener, P.

    2017-07-01

    This study compared four treatment techniques for the removal of a toluene/n-decane NAPL (non-aqueous phase liquid) mixture in identical 1 cubic meter tanks filled with different kinds of sand. The four treatment techniques were: oxidation with persulfate, surfactant washing with Tween80®, sparging with air followed by ozone, and thermal treatment at 80 °C. The sources were made of three lenses of 26 × 26 × 6.5 cm, one having a hydraulic conductivity similar to that of the whole tank and the other two a value 10 times smaller. The four techniques were studied after conditioning the tanks with tap water for approximately 80 days. The persulfate treatment tests showed average removal of the contaminants, but a significant flux decrease if density effects are considered. Surfactant flushing did not show a highly significant increase in the flux of toluene, but gave an increased removal rate that could lead to almost complete removal with a longer treatment time. Sparging removed a significant amount, but the results suggest that air was passing through localized gas channels and that removal stagnated after half of the contamination was removed. Thermal treatment reached 100% removal after the target temperature of 80 °C had been maintained for more than 10 days. The experiments emphasized the generation of high spatial heterogeneity in NAPL content. For all the treatments the overall removal was similar for both n-decane and toluene, suggesting that toluene was removed rapidly and n-decane more slowly in some zones, while no removal occurred in other zones. The oxidation and surfactant results were also analyzed for the relation between contaminant fluxes at the outlet and mass removal. For the first time, this approach clearly allowed the differentiation of the treatments. In conclusion, the experiments showed that the most important difference between the tested treatment techniques was not the global mass removal rate but the time required to reach a 99% decrease in …

  15. Techniques of sample attack used in soil and mineral analysis. Phase I

    International Nuclear Information System (INIS)

    Chiu, N.W.; Dean, J.R.; Sill, C.W.

    1984-07-01

    Several techniques of sample attack for the determination of radioisotopes are reviewed. These techniques include: 1) digestion with nitric or hydrochloric acid in a Parr digestion bomb, 2) digestion with a mixture of nitric and hydrochloric acids, 3) digestion with a mixture of hydrofluoric, nitric and perchloric acids, and 4) fusion with sodium carbonate, potassium fluoride or alkali pyrosulfates. The effectiveness of these techniques in decomposing various soils and minerals containing radioisotopes such as lead-210, uranium, thorium and radium-226 is discussed. The combined procedure of potassium fluoride fusion followed by alkali pyrosulfate fusion is recommended for radium-226, uranium and thorium analysis. This technique guarantees the complete dissolution of samples containing refractory materials such as silica, silicates, carbides, oxides and sulfates. For lead-210 analysis, digestion with a mixture of hydrofluoric, nitric and perchloric acids followed by fusion with alkali pyrosulfate is recommended. These two procedures are detailed. Schemes for the sequential separation of the radioisotopes from a dissolved sample solution are outlined, and procedures for radiochemical analysis are suggested.

  16. Survey of Techniques for Deep Web Source Selection and Surfacing the Hidden Web Content

    OpenAIRE

    Khushboo Khurana; M.B. Chandak

    2016-01-01

    Large and continuously growing dynamic web content has created new opportunities for large-scale data analysis in recent years. There is a huge amount of information that traditional web crawlers cannot access, since they rely on link analysis, through which only the surface web can be reached. Traditional search engine crawlers require web pages to be linked to other pages via hyperlinks, causing a large amount of web data to be hidden from the crawlers. Enormous data is available in...

  17. Analysis on the inbound tourist source market in Fujian Province

    Science.gov (United States)

    YU, Tong

    2017-06-01

    The paper analyzes the development and structure of inbound tourism in Fujian Province using Excel and conducts a cluster analysis of the inbound tourism market using SPSS 23.0, based on inbound tourism data for Fujian Province from 2006 to 2015. The results show that the rapid development of inbound tourism in Fujian Province and the diversified inbound tourist source countries indicate a stable inbound tourism market; according to the cluster analysis, the inbound tourist source market of Fujian Province can be divided into four categories, and tourists from the United States, Japan, Malaysia, and Singapore are the key to inbound tourism in Fujian Province.
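
    As a rough illustration of the kind of cluster analysis described (the abstract used SPSS 23.0), the Python sketch below groups source markets by yearly arrival figures with k-means; the arrival numbers and country list are invented placeholders:

      import numpy as np
      from sklearn.cluster import KMeans

      # Hypothetical arrivals (thousands) per year for each source market
      countries = ["USA", "Japan", "Malaysia", "Singapore", "Others"]
      arrivals = np.array([
          [120, 135, 150],   # placeholder time series, one row per country
          [110, 118, 125],
          [95, 102, 108],
          [90, 97, 105],
          [40, 42, 45],
      ])

      # Four clusters, matching the four market categories in the abstract
      labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(arrivals)
      for country, label in zip(countries, labels):
          print(country, "-> cluster", label)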

  18. Paleotempestological chronology developed from gas ion source AMS analysis of carbonates determined through real-time Bayesian statistical approach

    Science.gov (United States)

    Wallace, D. J.; Rosenheim, B. E.; Roberts, M. L.; Burton, J. R.; Donnelly, J. P.; Woodruff, J. D.

    2014-12-01

    Is a small quantity of high-precision ages more robust than a larger quantity of lower-precision ages for sediment core chronologies? AMS radiocarbon ages have been available to researchers for several decades now, and the precision of the technique has continued to improve. Analysis cost and time are high, though, and projects are often limited in the number of dates that can be used to develop a chronology. The Gas Ion Source at the National Ocean Sciences Accelerator Mass Spectrometry Facility (NOSAMS), while providing lower precision (uncertainty on the order of 100 14C y per sample), is significantly less expensive and far less time-consuming than conventional dating and offers the unique opportunity to obtain large numbers of ages. Here we couple two approaches, one analytical and one statistical, to investigate the utility of an age model comprised of these lower-precision ages for paleotempestology. We use a gas ion source interfaced to a gas-bench type device to generate radiocarbon dates approximately every 5 minutes while determining the order of sample analysis using the published Bayesian accumulation histories for deposits (Bacon). During two day-long sessions, several dates were obtained from carbonate shells in living position in a sediment core composed of sapropel gel from Mangrove Lake, Bermuda. Samples were prepared where large shells were available, and the order of analysis was determined by the depth with the highest uncertainty according to Bacon. We present the results of these analyses as well as a prognosis for a future where such age models can be constructed from many dates that are quickly obtained relative to conventional radiocarbon dates. This technique is currently limited to carbonates, but development of a system for organic material dating is underway. We will demonstrate the extent to which sacrificing some analytical precision in favor of more dates improves age models.
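
    The adaptive strategy at the heart of this workflow — always date the depth where the current age model is least certain — can be sketched in a few lines of Python; the depths and uncertainty values below are placeholders for what an age-model package such as Bacon would report:

      import numpy as np

      # Posterior 1-sigma age uncertainty (years) at each candidate depth,
      # as reported by the age model after the dates so far (values invented)
      depths = np.array([10, 20, 30, 40, 50])          # cm
      age_sigma = np.array([80, 150, 240, 190, 120])   # years

      def next_depth_to_date(depths, age_sigma):
          """Pick the depth where a new date reduces uncertainty the most."""
          return depths[np.argmax(age_sigma)]

      print("Date a shell near depth", next_depth_to_date(depths, age_sigma), "cm")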

  19. A Search Technique for Weak and Long-Duration Gamma-Ray Bursts from Background Model Residuals

    Science.gov (United States)

    Skelton, R. T.; Mahoney, W. A.

    1993-01-01

    We report on a planned search technique for Gamma-Ray Bursts too weak to trigger the on-board threshold. The technique is to search residuals from a physically based background model used for analysis of point sources by the Earth occultation method.

  20. Implementation of inter-unit analysis for C and C++ languages in a source-based static code analyzer

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    Full Text Available The proliferation of automated testing capabilities gives rise to a need for thorough testing of large software systems, including system inter-component interfaces. The objective of this research is to build a method for inter-procedural inter-unit analysis, which allows us to analyse large and complex software systems, including multi-architecture projects (like Android OS), as well as to support complex project build systems. Since the selected Clang Static Analyzer uses source code directly as input data, we need to develop a special technique to enable inter-unit analysis for such an analyzer. This problem is of a special nature because of C and C++ language features that assume and encourage the separate compilation of project files. We describe the build and analysis system that was implemented around Clang Static Analyzer to enable inter-unit analysis and consider problems related to the support of complex projects. We also consider the task of merging abstract syntax trees of translation units and its related problems, such as handling conflicting definitions and supporting complex build systems and complex projects, including multi-architecture projects, with examples. We consider both issues related to language design and human-related mistakes (which may be intentional). We describe some heuristics that were used in this work to make the merging process faster. The developed system was tested using Android OS as the input to show that it is applicable even for such complicated projects. This system does not depend on the inter-procedural analysis method and allows its algorithm to be changed arbitrarily.

  1. Gas Source Techniques for Molecular Beam Epitaxy of Highly Mismatched Ge Alloys

    Directory of Open Access Journals (Sweden)

    Chad A. Stephenson

    2016-12-01

    Full Text Available Ge and its alloys are attractive candidates for a laser compatible with silicon integrated circuits. Dilute germanium carbide (Ge1−xCx) offers a particularly interesting prospect. By using a precursor gas with a Ge4C core, C can be preferentially incorporated in substitutional sites, suppressing interstitial and C-cluster defects. We present a method of reproducible and upscalable gas synthesis of tetrakis(germyl)methane, or (H3Ge)4C, followed by the design of a hybrid gas/solid-source molecular beam epitaxy system and subsequent growth of defect-free Ge1−xCx by molecular beam epitaxy (MBE). Secondary ion mass spectrometry, transmission electron microscopy and contactless electroreflectance confirm the presence of carbon with very high crystal quality, resulting in a decrease in the direct bandgap energy. This technique has broad applicability to the growth of highly mismatched alloys by MBE.

  2. EU-FP7-iMARS: analysis of Mars multi-resolution images using auto-coregistration, data mining and crowd source techniques

    Science.gov (United States)

    Ivanov, Anton; Muller, Jan-Peter; Tao, Yu; Kim, Jung-Rack; Gwinner, Klaus; Van Gasselt, Stephan; Morley, Jeremy; Houghton, Robert; Bamford, Steven; Sidiropoulos, Panagiotis; Fanara, Lida; Waenlish, Marita; Walter, Sebastian; Steinkert, Ralf; Schreiner, Bjorn; Cantini, Federico; Wardlaw, Jessica; Sprinks, James; Giordano, Michele; Marsh, Stuart

    2016-07-01

    Understanding planetary atmosphere-surface and extra-terrestrial surface formation processes within our Solar System is one of the fundamental goals of planetary science research. There has been a revolution in planetary surface observations over the last 15 years, especially in 3D imaging of surface shape. This has led to the ability to overlay different epochs, going back to the mid-1970s, to examine time-varying changes such as the recently discovered mass movements, inter-year seasonal changes and occurrences of fresh craters. Within the EU FP-7 iMars project, UCL have developed a fully automated multi-resolution DTM processing chain, called the Co-registration ASP-Gotcha Optimised (CASP-GO), based on the open source NASA Ames Stereo Pipeline (ASP), which is being applied to the production of planetwide DTMs and ORIs (OrthoRectified Images) from CTX and HiRISE. Alongside the production of individual strip CTX & HiRISE DTMs & ORIs, DLR have processed HRSC mosaics of ORIs and DTMs for complete areas in a consistent manner using photogrammetric bundle block adjustment techniques. A novel automated co-registration and orthorectification chain has been developed and is being applied to level-1 EDR images taken by the 4 NASA orbital cameras since 1976, using the HRSC map products (both mosaics and orbital strips) as a map-base. The project has also included Mars radar profiles from the Mars Express and Mars Reconnaissance Orbiter missions. A webGIS has been developed for displaying this time sequence of imagery, and a demonstration will be shown applied to one of the map-sheets. Automated quality control techniques are applied to screen for suitable images, and these are extended to detect temporal changes in features on the surface such as mass movements, streaks, spiders, impact craters, CO2 geysers and Swiss Cheese terrain. These data mining techniques are then being employed within a citizen science project within the Zooniverse family

  3. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor and removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced X-ray emission and synchrotron-produced X-ray fluorescence, are also briefly discussed
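
    The quantitative basis of NAA is the standard activation relation A = N·σ·φ·(1 − e^(−λ·t_irr))·e^(−λ·t_cool); a minimal numerical sketch in Python follows (the flux, cross-section and half-life values are illustrative only, not tied to a particular isotope):

      import numpy as np

      def induced_activity(n_atoms, sigma_cm2, flux, half_life_s, t_irr_s, t_cool_s):
          """Induced activity in Bq after irradiation and a cooling period."""
          lam = np.log(2) / half_life_s
          saturation = 1 - np.exp(-lam * t_irr_s)
          return n_atoms * sigma_cm2 * flux * saturation * np.exp(-lam * t_cool_s)

      # Illustrative numbers: 1e20 target atoms, 1 barn = 1e-24 cm^2 cross-section,
      # thermal flux 1e13 n/cm^2/s, 2.6 h half-life, 1 h irradiation, 10 min decay
      print(induced_activity(1e20, 1e-24, 1e13, 2.6 * 3600, 3600, 600), "Bq")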

  4. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices and were administered as a 'tailored cloze.' 29 references listed.
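
    Item facility and discrimination, the two indices used above to pick the 'best' items, can be computed along the lines of the following Python sketch (the response matrix is invented; discrimination is taken here as the item-total point-biserial correlation):

      import numpy as np

      # Rows = examinees, columns = items; 1 = correct (invented responses)
      X = np.array([
          [1, 1, 0, 1],
          [1, 0, 0, 1],
          [0, 1, 0, 0],
          [1, 1, 1, 1],
          [0, 0, 0, 1],
      ])

      facility = X.mean(axis=0)   # proportion of examinees answering correctly
      total = X.sum(axis=1)       # each examinee's total score
      # Discrimination: correlation of each item with the total score
      discrimination = np.array(
          [np.corrcoef(X[:, j], total)[0, 1] for j in range(X.shape[1])])
      print("facility:", facility)
      print("discrimination:", np.round(discrimination, 2))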

  5. Proposed Sources of Coaching Efficacy: A Meta-Analysis.

    Science.gov (United States)

    Myers, Nicholas D; Park, Sung Eun; Ahn, Soyeon; Lee, Seungmin; Sullivan, Philip J; Feltz, Deborah L

    2017-08-01

    Coaching efficacy refers to the extent to which a coach believes that he or she has the capacity to affect the learning and performance of his or her athletes. The purpose of the current study was to empirically synthesize findings across the extant literature to estimate relationships between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. A literature search yielded 20 studies and 278 effect size estimates that met the inclusion criteria. The overall relationship between the proposed sources of coaching efficacy and each dimension of coaching efficacy was positive and ranged from small to medium in size. Coach gender and level coached moderated the overall relationship between the proposed sources of coaching efficacy and each of the dimensions of coaching efficacy. Results from this meta-analysis provided some evidence for both the utility of, and possible revisions to, the conceptual model of coaching efficacy.

  6. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution

    International Nuclear Information System (INIS)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R.

    2015-01-01

    This work intended to explain the challenges of the fingerprints-based source apportionment method for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF)-determined sources are distinguishable due to the variability of source fingerprints. However, they constitute useful suggestions for inputs to a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). - Highlights: • Fingerprint variability poses challenges in PAH source apportionment analysis. • PCA can be used to group compounds or cluster measurements. • PMF requires results validation but is useful for source suggestion. • Bayesian CMB provides a practical and credible solution. - A Bayesian CMB model combined with PMF is a practical and credible fingerprints-based PAH source apportionment method
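
    At its simplest, a chemical mass balance expresses each measured PAH profile as a non-negative mixture of source fingerprints; the bare-bones Python sketch below uses non-negative least squares (the fingerprints are invented stand-ins, and the Bayesian error treatment described in the abstract is omitted):

      import numpy as np
      from scipy.optimize import nnls

      # Columns = source fingerprints (fraction of each PAH compound per source);
      # invented stand-ins for traffic, coke oven, coal and wood profiles
      F = np.array([
          [0.40, 0.10, 0.20, 0.15],
          [0.30, 0.50, 0.25, 0.20],
          [0.20, 0.30, 0.35, 0.25],
          [0.10, 0.10, 0.20, 0.40],
      ])
      measured = np.array([0.25, 0.32, 0.27, 0.16])   # one sediment sample

      contributions, residual = nnls(F, measured)     # non-negative least squares
      print(np.round(contributions / contributions.sum(), 2))  # source shares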

  7. Application of rotating disk electrode technique for the preparation of Np, Pu and Am α-sources

    International Nuclear Information System (INIS)

    Tsoupko-Sitnikov, V.; Dayras, F.; Sanoit, J. de; Filossofov, D.

    2000-01-01

    The method of electrodeposition on a rotating disk cathode (RDE) is applied to the preparation of Np, Pu and Am α-standards. The phenomenon of critical current density is observed experimentally, in perfect accord with Hansen's theory of electrodeposition. The influence of the deposit calcination regime on the quality of the α-sources is studied, and the uniformity of deposits obtained in various deposition systems is compared. Standards with an energy resolution better than 9 keV can be reproducibly obtained by the optimized RDE electrodeposition technique

  8. The application of two recently developed human reliability techniques to cognitive error analysis

    International Nuclear Information System (INIS)

    Gall, W.

    1990-01-01

    Cognitive error can lead to catastrophic consequences for manned systems, including those whose design renders them immune to the effects of physical slips made by operators. Four such events, pressurized water and boiling water reactor accidents which occurred recently, were analysed. The analysis identifies the factors which contributed to the errors and suggests practical strategies for error recovery or prevention. Two types of analysis were conducted: an unstructured analysis based on the analyst's knowledge of psychological theory, and a structured analysis using two recently-developed human reliability analysis techniques. In general, the structured techniques required less effort to produce results and these were comparable to those of the unstructured analysis. (author)

  9. Sentiment Analysis in Geo Social Streams by using Machine Learning Techniques

    OpenAIRE

    Twanabasu, Bikesh

    2018-01-01

    Final project of the Erasmus Mundus Master's in Geospatial Technologies (2013 curriculum). Code: SIW013. Academic year 2017-2018. Massive amounts of sentiment-rich data are generated on social media in the form of tweets, status updates, blog posts, reviews, etc. Different people and organizations are using this user-generated content for decision making. Symbolic techniques (knowledge-based approaches) and machine learning techniques are the two main techniques used for sentiment analysis...
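
    As a minimal example of the machine-learning route to sentiment classification mentioned above, the Python sketch below trains a bag-of-words classifier on a tiny invented corpus (a real geo-social study would use thousands of labelled posts):

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      texts = ["great view from the park", "traffic here is awful",
               "lovely sunny day by the river", "worst commute ever"]  # invented posts
      labels = [1, 0, 1, 0]                                            # 1 = positive

      model = make_pipeline(TfidfVectorizer(), LogisticRegression())
      model.fit(texts, labels)
      print(model.predict(["awful traffic but a great view"]))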

  10. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels

  11. Survey of tritiated oil sources and handling practices

    International Nuclear Information System (INIS)

    Miller, J.M.

    1994-08-01

    Tritium interactions with oil sources (primarily associated with pumps) in tritium-handling facilities can lead to the incorporation of tritium in the oil and the production of tritiated hydrocarbons. This results in a source of radiological hazard and the need for special handling considerations during maintenance, decontamination, decommissioning and waste packaging and storage. The results of a general survey of tritiated-oil sources and their associated characteristics, handling practices, analysis techniques and waste treatment/storage methods are summarized here. Information was obtained from various tritium-handling laboratories, fusion devices, and CANDU plants. 38 refs., 1 fig

  12. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
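
    The first technique listed, LOESS, can be applied to sampled input/output pairs roughly as in the Python sketch below (synthetic data; statsmodels' lowess stands in for the stepwise procedure described in the abstract):

      import numpy as np
      from statsmodels.nonparametric.smoothers_lowess import lowess

      rng = np.random.default_rng(0)
      x = rng.uniform(0, 1, 200)                            # sampled model input
      y = np.sin(2 * np.pi * x) + rng.normal(0, 0.2, 200)   # nonlinear response

      # Locally weighted regression of y on x
      fit = lowess(y, x, frac=0.3, return_sorted=True)

      # Crude sensitivity indicator: variance explained by the smooth
      y_hat = np.interp(x, fit[:, 0], fit[:, 1])
      print("R^2 of LOESS fit:", round(1 - np.var(y - y_hat) / np.var(y), 2))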

  13. Comparative analysis of methods and sources of financing of the transport organizations activity

    Science.gov (United States)

    Gorshkov, Roman

    2017-10-01

    The article considers methods of financing transport organizations under conditions of limited investment resources. A comparative analysis of these methods is carried out, and a classification of investments and of the methods and sources of financial support for projects currently being implemented is presented. In order to select the optimal sources of financing for the projects, various methods of financial management and financial support for the activities of the transport organization were analyzed and considered from the perspective of their advantages and limitations. The result of the study is a set of recommendations on the selection of optimal sources and methods of financing for transport organizations.

  14. The composite sequential clustering technique for analysis of multispectral scanner data

    Science.gov (United States)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
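
    The two-stage idea — derive initial clusters statistically, then refine them with a generalized K-means — can be sketched in Python as below (synthetic pixels; a coarse quantile split stands in for the sequential variance analysis of stage one):

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      pixels = rng.normal(0, 1, (500, 4))   # synthetic 4-band scanner data

      # Stage 1 stand-in: split on the first band's quantiles for initial means
      q = np.quantile(pixels[:, 0], [0.25, 0.5, 0.75])
      init_labels = np.digitize(pixels[:, 0], q)
      init_means = np.array([pixels[init_labels == k].mean(axis=0) for k in range(4)])

      # Stage 2: iterative K-means refinement starting from those means
      km = KMeans(n_clusters=4, init=init_means, n_init=1).fit(pixels)
      print(np.bincount(km.labels_))        # cluster sizes after refinement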

  15. All-Source Information Acquisition and Analysis in the IAEA Department of Safeguards

    International Nuclear Information System (INIS)

    Ferguson, Matthew; Norman, Claude

    2010-01-01

    All-source information analysis enables proactive implementation of in-field verification activities, supports the State evaluation process, and is essential to the IAEA's strengthened safeguards system. Information sources include State-declared nuclear material accounting and facility design information; voluntarily supplied information such as nuclear procurement data; commercial satellite imagery; open source information; and information/results from design information verifications (DIVs), inspections and complementary accesses (CAs). The analysis of disparate information sources directly supports inspections, design information verifications and complementary access, and enables both more reliable cross-examination for consistency and completeness as well as in-depth investigation of possible safeguards compliance issues. Comparison of State-declared information against information on illicit nuclear procurement networks, possible trafficking in nuclear materials, and scientific and technical information on nuclear-related research and development programmes provides complementary measures for monitoring nuclear developments and increases Agency capabilities to detect possible undeclared nuclear activities. Likewise, expert analysis of commercial satellite imagery plays a critical role in monitoring un-safeguarded sites and facilities. In sum, the combination of these measures provides early identification of possible undeclared nuclear material or activities, thus enhancing the deterrence of a safeguards system that is fully information-driven and increasing confidence in safeguards conclusions. By increasing confidence that nuclear materials and technologies in States under safeguards are used solely for peaceful purposes, information-driven safeguards will strengthen the nuclear non-proliferation system. Key assets for Agency collection, processing, expert analysis, and integration of these information sources are the Information Collection and Analysis

  16. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis

  17. Applying inversion techniques to derive source currents and geoelectric fields for geomagnetically induced current calculations

    Directory of Open Access Journals (Sweden)

    J. S. de Villiers

    2014-10-01

    Full Text Available This research focuses on the inversion of geomagnetic variation field measurements to obtain source currents in the ionosphere. During a geomagnetic disturbance, the ionospheric currents create magnetic field variations that induce geoelectric fields, which drive geomagnetically induced currents (GIC) in power systems. These GIC may disturb the operation of power systems and cause damage to grounded power transformers. The geoelectric fields at any location of interest can be determined from the source currents in the ionosphere through a solution of the forward problem. Line currents running east–west at a given surface position are postulated to exist at a certain height above the Earth's surface. This physical arrangement results in the fields on the ground having magnetic north and down components, and an electric east component. Ionospheric currents are modelled by inverting Fourier integrals (over the wavenumber) of elementary geomagnetic fields using the Levenberg–Marquardt technique. The output parameters of the inversion model are the current strength, height and surface position of the ionospheric current system. A ground conductivity structure with five layers from Quebec, Canada, based on the Layered-Earth model, is used to obtain the complex skin depth at a given angular frequency. This paper presents preliminary inversion results based on these structures and simulated geomagnetic fields. The results show some interesting features in the frequency domain. Model parameters obtained through inversion are within 2% of simulated values. This technique has applications for modelling the currents of electrojets at the equator and auroral regions, as well as currents in the magnetosphere.
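
    A toy version of this inversion — fitting the strength and height of an overhead east-west line current to ground magnetic data with Levenberg-Marquardt — might look like the Python sketch below (flat-Earth, single infinite line current; the layered-conductivity and skin-depth physics of the paper are omitted):

      import numpy as np
      from scipy.optimize import least_squares

      MU0 = 4e-7 * np.pi

      def ground_field(params, d):
          """B north/down components under an infinite east-west line current."""
          current, height = params
          r2 = d**2 + height**2
          b_north = MU0 * current * height / (2 * np.pi * r2)
          b_down = MU0 * current * d / (2 * np.pi * r2)
          return np.concatenate([b_north, b_down])

      # Synthetic "measurements" from a 1 MA electrojet at 110 km, plus noise
      d = np.linspace(-300e3, 300e3, 21)
      rng = np.random.default_rng(2)
      data = ground_field([1e6, 110e3], d) + rng.normal(0, 1e-9, 2 * d.size)

      fit = least_squares(lambda p: ground_field(p, d) - data,
                          x0=[5e5, 80e3], method="lm")   # Levenberg-Marquardt
      print("I = %.3g A, h = %.3g km" % (fit.x[0], fit.x[1] / 1e3))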

  18. Modular Open-Source Software for Item Factor Analysis

    Science.gov (United States)

    Pritikin, Joshua N.; Hunter, Micheal D.; Boker, Steven M.

    2015-01-01

    This article introduces an item factor analysis (IFA) module for "OpenMx," a free, open-source, and modular statistical modeling package that runs within the R programming environment on GNU/Linux, Mac OS X, and Microsoft Windows. The IFA module offers a novel model specification language that is well suited to programmatic generation…

  19. Data Analysis with Open Source Tools

    CERN Document Server

    Janert, Philipp

    2010-01-01

    Collecting data is relatively easy, but turning raw information into something useful requires that you know how to extract precisely what you need. With this insightful book, intermediate to experienced programmers interested in data analysis will learn techniques for working with data in a business environment. You'll learn how to look at data to discover what it contains, how to capture those ideas in conceptual models, and then feed your understanding back into the organization through business plans, metrics dashboards, and other applications. Along the way, you'll experiment with conce

  20. Analysis of pulse-shape discrimination techniques for BC501A using GHz digital signal processing

    International Nuclear Information System (INIS)

    Rooney, B.D.; Dinwiddie, D.R.; Nelson, M.A.; Rawool-Sullivan, Mohini W.

    2001-01-01

    A comparison study of pulse-shape analysis techniques was conducted for a BC501A scintillator using digital signal processing (DSP). In this study, output signals from a preamplifier were input directly into a 1 GHz analog-to-digital converter. The digitized data obtained with this method was post-processed for both pulse-height and pulse-shape information. Several different analysis techniques were evaluated for neutron and gamma-ray pulse-shape discrimination. It was surprising that one of the simplest and fastest techniques resulted in some of the best pulse-shape discrimination results. This technique, referred to here as the Integral Ratio technique, was able to effectively process several thousand detector pulses per second. This paper presents the results and findings of this study for various pulse-shape analysis techniques with digitized detector signals.
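
    The Integral Ratio technique singled out above reduces to comparing the tail integral of each digitized pulse with its total integral; a compact Python sketch on synthetic pulses (the decay constants are illustrative, not BC501A calibration values):

      import numpy as np

      def integral_ratio(pulse, gate_ns, dt_ns=1.0):
          """Tail-to-total charge ratio; neutron pulses have a larger slow tail."""
          total = pulse.sum() * dt_ns
          tail = pulse[int(gate_ns / dt_ns):].sum() * dt_ns
          return tail / total

      t = np.arange(200.0)                      # 1 ns samples (1 GHz digitizer)
      gamma_like = np.exp(-t / 5.0)             # fast-decay pulse
      neutron_like = 0.8 * np.exp(-t / 5.0) + 0.2 * np.exp(-t / 80.0)

      print("gamma  :", round(integral_ratio(gamma_like, gate_ns=20), 3))
      print("neutron:", round(integral_ratio(neutron_like, gate_ns=20), 3))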

  1. Techniques for long term conditioning and storage of radium sources

    International Nuclear Information System (INIS)

    Dogaru, Gheorghe; Dragolici, Felicia; Nicu, Mihaela

    2008-01-01

    The Horia Hulubei National Institute of Research and Development for Physics and Nuclear Engineering developed its own technology for conditioning the radium spent sealed radioactive sources. The laboratory dedicated to radiological characterization, identification of radium sources as well as the encapsulation of spent sealed radioactive sources was equipped with a local ventilation system, welding devices, tightness test devices as well as radiometric portable devices. Two types of capsules have been designed for conditioning of radium spent sealed radioactive sources. For these kinds of capsules different types of storage packaging were developed. Data on the radium inventory will be presented in the paper. The paper contains the description of the process of conditioning of spent sealed radioactive sources as well as the description of the capsules and packaging. The paper describes the equipment used for the conditioning of the radium spent sealed sources. (authors)

  2. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.

  3. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.
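
    For reference, the linear TSA relation alluded to above links the temperature change to the change in the first stress invariant, dT = -K * T0 * d(sigma1 + sigma2); a small numeric Python sketch (steel-like constant and an illustrative load, not values from the paper):

      # Linear TSA relation: dT = -K * T0 * d(sigma1 + sigma2)
      K = 3.5e-12       # thermoelastic constant, 1/Pa (steel-like, illustrative)
      T0 = 293.0        # mean absolute temperature, K
      d_sigma = 100e6   # change in first stress invariant, Pa

      dT = -K * T0 * d_sigma
      print("Expected temperature change: %.4f K" % dT)   # about -0.1 K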

  5. Three-Dimensional X-Ray Diffraction Technique for Metals Science

    DEFF Research Database (Denmark)

    Zhang, Yubin; Fan, Guohua

    2017-01-01

    resolution can be micrometer scale and the measurement can be conducted within a reasonable time frame (a few hours). The 3DXRD microscope was originally developed in cooperation between the former Risø National Laboratory and the European Synchrotron Radiation Facility. Currently, this technique has been implemented in several large synchrotron facilities, e.g. the Advanced Photon Source (APS) in the USA and Spring-8 in Japan. Another family of 3DXRD techniques that utilizes white-beam synchrotron X-rays has also been developed in parallel in cooperation between Oak Ridge National Laboratory and APS… analysis during tensile deformation, recrystallization growth kinetics, recrystallization nucleation, growth of individual recrystallized grains, grain growth after recrystallization, and local residual strain/stress analysis. The recent development of the 3DXRD technique and its potential use for materials

  6. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant: composition of macro components and amounts of organic and inorganic impurities; coolant during and after operation: determination of gases and organic compounds produced by pyrolysis and radiolysis (degradation and polymerization products); control of systems for purifying and regenerating the coolant after use: dissolved pressurization gases; detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation: tests to determine the potential formation of films; corrosion of structural elements and canning materials; health and safety: toxicity, inflammability and impurities that can be activated. Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity

  7. Cost-Effective Brillouin Optical Time-Domain Analysis Sensor Using a Single Optical Source and Passive Optical Filtering

    Directory of Open Access Journals (Sweden)

    H. Iribas

    2016-01-01

    Full Text Available We present a simplified configuration for distributed Brillouin optical time-domain analysis sensors that aims to reduce the cost of the sensor by reducing the number of components required for the generation of the two optical waves involved in the sensing process. The technique is based on obtaining the pump and probe waves by passive optical filtering of the spectral components generated in a single optical source that is driven by a pulsed RF signal. The optical source is a compact laser with integrated electroabsorption modulator and the optical filters are based on fiber Bragg gratings. Proof-of-concept experiments demonstrate 1 m spatial resolution over a 20 km sensing fiber with a 0.9 MHz precision in the measurement of the Brillouin frequency shift, a performance similar to that of much more complex setups. Furthermore, we discuss the factors limiting the sensor performance, which are basically related to residual spectral components in the filtering process.

  8. A microhistological technique for analysis of food habits of mycophagous rodents.

    Science.gov (United States)

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  9. Performance of a GaAs electron source

    International Nuclear Information System (INIS)

    Calabrese, R.; Ciullo, G.; Della Mea, G.; Egeni, G.P.; Guidi, V.; Lamanna, G.; Lenisa, P.; Maciga, B.; Rigato, V.; Rudello, V.; Tecchio, L.; Yang, B.; Zandolin, S.

    1994-01-01

    We discuss the performance improvement of a GaAs electron source. High quantum yield (14%) and constant current extraction (1 mA for more than four weeks) are achieved after a little initial decay. These parameters meet the requirements for application of the GaAs photocathode as a source for electron cooling devices. We also present the preliminary results of a surface analysis experiment, carried out by means of the RBS technique to check the hypothesis of cesium evaporation from the surface when the photocathode is in operation. (orig.)

  10. Critical Analysis on Open Source LMSs Using FCA

    Science.gov (United States)

    Sumangali, K.; Kumar, Ch. Aswani

    2013-01-01

    The objective of this paper is to apply Formal Concept Analysis (FCA) to identify the best open source Learning Management System (LMS) for an E-learning environment. FCA is a mathematical framework that represents knowledge derived from a formal context. In constructing the formal context, LMSs are treated as objects and their features as…
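
    A formal context is simply an object-attribute incidence table, and formal concepts are its maximal closed rectangles; the brute-force Python sketch below derives them for a toy LMS-feature context (the systems and features listed are invented for illustration):

      from itertools import combinations

      # Toy formal context: LMSs (objects) x features (attributes), invented
      context = {
          "Moodle": {"quizzes", "forums", "scorm"},
          "Sakai":  {"forums", "scorm"},
          "Canvas": {"quizzes", "forums"},
      }
      attributes = set().union(*context.values())

      def common_attrs(objs):
          """Attributes shared by all objects in the set (the derivation)."""
          return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

      def objects_with(attrs):
          """Objects possessing every attribute in the set."""
          return {o for o, feats in context.items() if attrs <= feats}

      # A formal concept is a pair (A, B) with B = common_attrs(A), A = objects_with(B)
      concepts = set()
      for r in range(len(context) + 1):
          for combo in combinations(context, r):
              B = frozenset(common_attrs(set(combo)))
              A = frozenset(objects_with(B))
              concepts.add((A, B))
      for A, B in sorted(concepts, key=lambda c: len(c[0])):
          print(sorted(A), "<->", sorted(B))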

  11. Analysis on Dangerous Source of Large Safety Accident in Storage Tank Area

    Science.gov (United States)

    Wang, Tong; Li, Ying; Xie, Tiansheng; Liu, Yu; Zhu, Xueyuan

    2018-01-01

    The difference between a large safety accident and a general accident is that the consequences of a large safety accident are particularly serious. This paper studies which factors in a tank area directly or indirectly lead to the occurrence of large safety accidents. Based on the three-kinds-of-hazard-source theory and consequence-cause analysis of major safety accidents, the paper analyzes the hazard sources of major safety accidents in tank areas from four aspects: energy sources, causes of large safety accidents, missing management, and environmental impact. Building on the analysis of the three kinds of hazard sources and of the environment, the main risk factors are derived and an AHP evaluation model is established; after rigorous and scientific calculation, the weights of the four kinds of risk factors and of the factors within each kind are obtained. The result of the analytic hierarchy process shows that management causes are the most important, followed by environmental factors, direct causes, and energy sources. It should be noted that although the direct causes have relatively low overall importance, within them the failure of emergency measures and the failure of prevention and control facilities carry greater weight.
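
    The AHP weighting step described above boils down to the principal eigenvector of a pairwise-comparison matrix; a minimal Python sketch follows (the Saaty-scale comparison values are invented, not taken from the paper):

      import numpy as np

      # Pairwise comparisons (Saaty 1-9 scale) among four factor groups:
      # management, environment, direct causes, energy sources (values invented)
      A = np.array([
          [1.0,   3.0,   5.0,   7.0],
          [1/3.0, 1.0,   3.0,   5.0],
          [1/5.0, 1/3.0, 1.0,   3.0],
          [1/7.0, 1/5.0, 1/3.0, 1.0],
      ])

      vals, vecs = np.linalg.eig(A)
      w = np.real(vecs[:, np.argmax(np.real(vals))])
      w = w / w.sum()              # normalized priority weights
      print(np.round(w, 3))        # management should dominate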

  12. Obsidian sources characterized by neutron-activation analysis.

    Science.gov (United States)

    Gordus, A A; Wright, G A; Griffin, J B

    1968-07-26

    Concentrations of elements such as manganese, scandium, lanthanum, rubidium, samarium, barium, and zirconium in obsidian samples from different flows show ranges of 1000 percent or more, whereas the variation in element content in obsidian samples from a single flow appears to be less than 40 percent. Neutron-activation analysis of these elements, as well as of sodium and iron, provides a means of identifying the geologic source of an archeological artifact of obsidian.

  13. Application of Thin Layer Activation Technique for Wear and Corrosion Studies in Stainless Steel Using Neutron Sources

    International Nuclear Information System (INIS)

    Mohamed, R.F.I.

    2013-01-01

    In this work, elemental analysis of three types of stainless steel samples was performed to compare their compositions. First, the samples were analyzed using an Energy Dispersive X-ray (EDX) spectrometer and Inductively Coupled Plasma Atomic Emission Spectrometry (ICP-AES) as conventional tools for elemental analysis. Second, the samples were subjected to detailed neutron activation analysis (NAA) using a Pu-Be neutron source, with γ-ray spectroscopic measurements applied to the irradiated samples. The first sample was in the form of thin foils. Eight radioactive isotopes were detected in the measured spectra, namely 56Mn, 59Fe, 58Co, 60Co, 24Na, 187W, 99Mo and 51Cr, which resulted from different neutron reactions with this sample. The other two samples were commercial; the NAA results for one of them show the same elements as in the foil sample, except for the absence of Mo and the presence of Cr. On the other hand, the third sample showed a different composition, with only Mn, Fe, and Ni identified from the measured γ-ray spectra. Stacks of irradiated stainless steel foils and pellets were measured to obtain the activity as a function of thickness using the most intense gamma-ray lines of the produced radionuclides. The obtained linear activity-thickness relations for the measured radionuclides were fitted to determine the slope and the maximum thickness that can be measured by this technique. The comparison between these curves showed that the most sensitive radioisotope for detecting slight changes in thickness is 51Cr, which is formed through the 50Cr(n,γ)51Cr reaction.

  14. Remote defect imaging for plate-like structures based on the scanning laser source technique

    Science.gov (United States)

    Hayashi, Takahiro; Maeda, Atsuya; Nakao, Shogo

    2018-04-01

    In defect imaging with a scanning laser source technique, the use of a fixed receiver enables stable measurements of the flexural waves generated by the laser at multiple rastering points. This study discusses defect imaging by remote measurement using a laser Doppler vibrometer as the receiver. Narrow-band burst waves were generated by modulating the pulse trains of a fiber laser to enhance the signal-to-noise ratio in the frequency domain. Averaging three images obtained at three different frequencies suppressed spurious distributions due to resonance. The experimental system equipped with these newly devised means enabled us to visualize defects and adhesive objects in plate-like structures such as a plate with complex geometry and a branch pipe.

  15. Experimental analysis of crack evolution in concrete by the acoustic emission technique

    Directory of Open Access Journals (Sweden)

    J. Saliba

    2015-10-01

    Full Text Available The fracture process zone (FPZ was investigated on unnotched and notched beams with different notch depths. Three point bending tests were realized on plain concrete under crack mouth opening displacement (CMOD control. Crack growth was monitored by applying the acoustic emission (AE technique. In order to improve our understanding of the FPZ, the width and length of the FPZ were followed based on the AE source locations maps and several AE parameters were studied during the entire loading process. The bvalue analysis, defined as the log-linear slope of the frequency-magnitude distribution of acoustic emissions, was also carried out to describe quantitatively the influence of the relative notch depth on the fracture process. The results show that the number of AE hits increased with the decrease of the relative notch depth and an important AE energy dissipation was observed at the crack initiation in unnotched beams. In addition, the relative notch depth influenced the AE characteristics, the process of crack propagation, and the brittleness of concrete.

  16. Analysis of 3-panel and 4-panel microscale ionization sources

    International Nuclear Information System (INIS)

    Natarajan, Srividya; Parker, Charles B.; Glass, Jeffrey T.; Piascik, Jeffrey R.; Gilchrist, Kristin H.; Stoner, Brian R.

    2010-01-01

    Two designs of a microscale electron ionization (EI) source are analyzed herein: a 3-panel design and a 4-panel design. Devices were fabricated using microelectromechanical systems technology. Field emission from carbon nanotubes provided the electrons for the EI source. Ion currents were measured for helium, nitrogen, and xenon at pressures ranging from 10^-4 to 0.1 Torr. A comparison of the performance of both designs is presented. The 4-panel microion source showed a 10x improvement in performance compared to the 3-panel device. An analysis of the various factors affecting the performance of the microion sources is also presented. SIMION, an electron and ion optics software package, was coupled with experimental measurements to analyze the ion current results. The electron current contributing to ionization and the ion collection efficiency are believed to be the primary factors responsible for the higher efficiency of the 4-panel microion source. Other improvements in device design that could lead to higher ion source efficiency in the future are also discussed. These microscale ion sources are expected to find application as stand-alone ion sources as well as in miniature mass spectrometers.

  17. Reconstruction of reflectance data using an interpolation technique.

    Science.gov (United States)

    Abed, Farhad Moghareh; Amirshahi, Seyed Hossein; Abed, Mohammad Reza Moghareh

    2009-03-01

    A linear interpolation method is applied for the reconstruction of reflectance spectra of Munsell as well as ColorChecker SG color chips from the corresponding colorimetric values under a given set of viewing conditions. Hence, different types of lookup tables (LUTs) have been created to connect the colorimetric and spectrophotometric data as the source and destination spaces in this approach. To optimize the algorithm, different color spaces and light sources have been used to build the different types of LUTs. The effects of the applied color datasets as well as the employed color spaces are investigated. Results of recovery are evaluated by the mean and maximum color difference values under other sets of standard light sources. The mean and maximum values of the root mean square (RMS) error between the reconstructed and actual spectra are also calculated. Since the speed of reflectance reconstruction is a key point in the LUT algorithm, the processing time spent on interpolation of the spectral data has also been measured for each model. Finally, the performance of the suggested interpolation technique is compared with that of the common principal component analysis method. According to the results, using the CIEXYZ tristimulus values as the source space shows an advantage over the CIELAB color space. Besides, the colorimetric position of a desired sample is a key factor in the success of the approach. In fact, because of the nature of the interpolation technique, the colorimetric position of the desired samples should be located inside the color gamut of the available samples in the dataset. The resultant spectra reconstructed by this technique show considerable improvement in terms of the RMS error between the actual and reconstructed reflectance spectra, as well as CIELAB color differences under the other light source, in comparison with those obtained from the standard PCA technique.
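
    The LUT idea — interpolating the reflectance at each wavelength from nearby colorimetric coordinates — can be sketched with scipy's scattered-data interpolator (a tiny invented dataset stands in for the full Munsell set):

      import numpy as np
      from scipy.interpolate import LinearNDInterpolator

      # Invented training set: XYZ coordinates -> 4-band "reflectance" curves
      xyz = np.array([[30.0, 32, 30], [50, 52, 48], [20, 21, 35], [60, 58, 30]])
      refl = np.array([[0.30, 0.35, 0.32, 0.30],
                       [0.50, 0.52, 0.50, 0.48],
                       [0.20, 0.25, 0.38, 0.45],
                       [0.60, 0.58, 0.40, 0.31]])

      lut = LinearNDInterpolator(xyz, refl)   # barycentric-linear inside the gamut
      target = xyz.mean(axis=0)               # a colorimetric point inside the hull
      print(lut(target[None, :]))             # NaN would signal an out-of-gamut query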

  18. Application of optimal estimation techniques to FFTF decay heat removal analysis

    International Nuclear Information System (INIS)

    Nutt, W.T.; Additon, S.L.; Parziale, E.A.

    1979-01-01

    The verification and adjustment of plant models for decay heat removal analysis using a mix of engineering judgment and formal techniques from control theory are discussed. The formal techniques facilitate dealing with typical test data which are noisy, redundant and do not measure all of the plant model state variables directly. Two pretest examples are presented. 5 refs

  19. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  20. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 ms residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  1. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has underway, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  2. Techniques to extract physical modes in model-independent analysis of rings

    International Nuclear Information System (INIS)

    Wang, C.-X.

    2004-01-01

    A basic goal of Model-Independent Analysis is to extract the physical modes underlying the beam histories collected at a large number of beam position monitors, so that beam dynamics and machine properties can be deduced independently of specific machine models. Here we discuss techniques to achieve this goal, especially Principal Component Analysis and Independent Component Analysis.
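
    A minimal sketch of the two techniques named in the abstract, applied to synthetic turn-by-turn beam position monitor (BPM) histories; the data shapes, mode frequencies and use of scikit-learn are illustrative assumptions, not details from the paper:

```python
# Extracting beam modes from BPM histories with PCA (via SVD) and ICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_turns, n_bpms = 2048, 40
turns = np.arange(n_turns)

# Two "physical modes": a betatron-like oscillation and a slow drift,
# mixed into every BPM with different spatial amplitudes, plus noise.
modes = np.stack([np.sin(2 * np.pi * 0.31 * turns),
                  np.sin(2 * np.pi * 0.002 * turns)])
mixing = rng.normal(size=(2, n_bpms))
X = modes.T @ mixing + 0.05 * rng.normal(size=(n_turns, n_bpms))

# PCA: singular vectors of the centered history matrix give
# orthogonal spatio-temporal modes ordered by variance.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
temporal_pca = U[:, :2] * s[:2]   # dominant temporal patterns
spatial_pca = Vt[:2]              # corresponding BPM (spatial) patterns

# ICA: relaxes orthogonality and seeks statistically independent
# source signals, often closer to the underlying physical modes.
ica = FastICA(n_components=2, random_state=0)
temporal_ica = ica.fit_transform(Xc)

print(spatial_pca.shape, temporal_ica.shape)  # (2, 40) (2048, 2)
```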

  3. Analysis of Dual Rotating Rake Data from the NASA Glenn Advanced Noise Control Fan Duct with Artificial Sources

    Science.gov (United States)

    Dahl, Milo D.; Sutliff, Daniel L.

    2014-01-01

    The Rotating Rake mode measurement system was designed to measure acoustic duct modes generated by a fan stage. Initially, the mode amplitudes and phases were quantified from a single rake measurement at one axial location. To directly measure the modes propagating in both directions within a duct, a second rake was mounted to the rotating system with an offset in both the axial and the azimuthal directions. The rotating rake data analysis technique was then extended to include the data measured by the second rake. The analysis resulted in a set of circumferential mode levels at each of the two rake microphone locations. Radial basis functions were then least-squares fit to this data to obtain the radial mode amplitudes for the modes propagating in both directions within the duct. Validation experiments have been conducted using artificial acoustic sources. Results are shown for the measurement of the standing waves in the duct from sound generated by one and two acoustic sources that are separated into the component modes propagating in both directions within the duct. Measured reflection coefficients from the open end of the duct are compared to analytical predictions.
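
    The final step the abstract describes is a least-squares fit of radial basis functions to measured circumferential mode levels. Below is a hedged sketch of such a fit on synthetic data; the Bessel-function basis, the hard-wall eigenvalues and the microphone positions are assumptions chosen for illustration:

```python
# Least-squares recovery of radial mode amplitudes from mode levels
# sampled at discrete radial microphone positions.
import numpy as np
from scipy.special import jv  # Bessel function of the first kind

m = 2                                  # circumferential mode order
radii = np.linspace(0.2, 1.0, 12)      # assumed microphone radii
mu = np.array([3.054, 6.706, 9.969])   # zeros of J2' (hard-wall duct)

# Design matrix: one column per radial mode evaluated at each radius.
A = np.stack([jv(m, mu_k * radii) for mu_k in mu], axis=1)

true_amps = np.array([1.0 + 0.5j, 0.3 - 0.2j, 0.05 + 0.0j])
p_measured = A @ true_amps + 0.01 * (np.random.randn(12)
                                     + 1j * np.random.randn(12))

# Complex least squares recovers the radial mode amplitudes.
amps, *_ = np.linalg.lstsq(A, p_measured, rcond=None)
print(np.round(amps, 3))
```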

  4. A quasi-monochromatic X-rays source for art painting pigments investigation

    Energy Technology Data Exchange (ETDEWEB)

    Albertin, F.; Franconieri, A.; Gambaccini, M.; Petrucci, F.; Chiozzi, S. [University of Ferrara, Department of Physics and INFN, Ferrara (Italy); Moro, D. [University of Padova, Department of Physics, Padova (Italy); LNL - INFN, Legnaro, Padova (Italy)

    2009-08-15

    Monochromatic X-ray sources can be used for several applications, for example in medicine or in the study of our cultural heritage. We are investigating imaging systems based on a tuneable, narrow energy band X-ray source, with the aim of obtaining an element mapping of painting layers using the K-edge technique. The narrow energy band beams are obtained from a conventional X-ray source via Bragg diffraction on a mosaic crystal; the analysis has been performed at different diffraction angles, tuning the energy to investigate elements of interest from the artistic point of view, such as zinc and copper. In this paper the characteristics of the system in terms of fluence rate are reported, and first results of this technique on canvas samples and a painting are presented. (orig.)
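
    The energy selected by Bragg diffraction follows E = hc/(2d sin θ). A small illustrative calculation follows; the mosaic crystal (HOPG, d = 3.354 Å) and the angles are assumptions, chosen so the tuning range roughly brackets the Cu (8.98 keV) and Zn (9.66 keV) K-edges mentioned in the abstract:

```python
# Photon energy selected by Bragg diffraction on a mosaic crystal.
import numpy as np

HC_KEV_ANGSTROM = 12.398   # h*c in keV·Å
D_HOPG = 3.354             # assumed HOPG (002) lattice spacing in Å

def bragg_energy_kev(theta_deg, d_angstrom=D_HOPG, order=1):
    """First-order (default) Bragg-selected photon energy in keV."""
    wavelength = 2.0 * d_angstrom * np.sin(np.radians(theta_deg)) / order
    return HC_KEV_ANGSTROM / wavelength

for theta in (10.0, 11.0, 12.0):
    print(f"theta = {theta:4.1f} deg -> E = {bragg_energy_kev(theta):5.2f} keV")
```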

  5. Dosimetric analysis of radiation sources for use in dermatological lesions

    International Nuclear Information System (INIS)

    Tada, Ariane

    2010-01-01

    Skin lesions undergoing therapy with radiation sources may have different patterns of malignancy. The malignant lesions most commonly found in radiotherapy services are carcinomas. Radiation therapy of skin lesions is performed with low-penetration beams: orthovoltage X-rays, electron beams and radioactive sources (¹⁹²Ir, ¹⁹⁸Au and ⁹⁰Sr) arranged in a surface mould or in a metal applicator. This study analyses the therapeutic radiation dose profiles produced by the radiation sources used in skin-lesion radiotherapy procedures. Experimental measurements were compared with calculations from a computer system based on the Monte Carlo method; the computational results showed good agreement with the experimental measurements, and both were physically consistent, as expected. The experimental measurements have been used to validate the calculations obtained with the MCNP-4C code and to provide a reliable basis for medical application to each clinical case. (author)

  6. Review of sample preparation techniques for the analysis of pesticide residues in soil.

    Science.gov (United States)

    Tadeo, José L; Pérez, Rosa Ana; Albero, Beatriz; García-Valcárcel, Ana I; Sánchez-Brunete, Consuelo

    2012-01-01

    This paper reviews the sample preparation techniques used for the analysis of pesticides in soil. The present status and recent advances made during the last 5 years in these methods are discussed. The analysis of pesticide residues in soil requires the extraction of analytes from this matrix, followed by a cleanup procedure, when necessary, prior to their instrumental determination. The optimization of sample preparation is a very important part of the method development that can reduce the analysis time, the amount of solvent, and the size of samples. This review considers all aspects of sample preparation, including extraction and cleanup. Classical extraction techniques, such as shaking, Soxhlet, and ultrasonic-assisted extraction, and modern techniques like pressurized liquid extraction, microwave-assisted extraction, solid-phase microextraction and QuEChERS (Quick, Easy, Cheap, Effective, Rugged, and Safe) are reviewed. The different cleanup strategies applied for the purification of soil extracts are also discussed. In addition, the application of these techniques to environmental studies is considered.

  7. Comparative study of macrotexture analysis using X-ray diffraction and electron backscattered diffraction techniques

    International Nuclear Information System (INIS)

    Serna, Marilene Morelli

    2002-01-01

    Macrotexture is one of the main characteristics of metallic materials whose physical properties depend on crystallographic direction. Until the mid-1980s, macrotexture analysis was accomplished only by the X-ray diffraction and neutron diffraction techniques. The possibility of analysing macrotexture by electron backscatter diffraction in the scanning electron microscope, which allows each orientation measurement to be correlated with its location in the microstructure, was a very welcome tool in the materials engineering field. In this work the theoretical aspects of the two techniques were studied, and both were used to analyse the macrotexture of 1050 and 3003 aluminium sheets with intensities, measured through the texture index 'J', from 2.00 to 5.00. The results obtained by the two techniques were reasonably similar, bearing in mind that the statistics of the data obtained by electron backscatter diffraction are much poorer than those obtained by X-ray diffraction. (author)

  8. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
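
    A hedged sketch of the two kinds of models the abstract applies to LIBS spectra: a PLS calibration mapping spectra to composition, and a PCA projection of the kind SIMCA builds per class. The synthetic spectra, channel count and scikit-learn usage are illustrative assumptions, not the authors' pipeline:

```python
# PLS regression and PCA on synthetic "LIBS-like" spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_samples, n_channels = 18, 500        # 18 rock spectra, assumed grid

# Synthetic spectra: composition drives a few broad emission features.
composition = rng.uniform(0, 1, size=(n_samples, 3))  # e.g. 3 oxides
features = rng.normal(size=(3, n_channels))
spectra = (composition @ features
           + 0.05 * rng.normal(size=(n_samples, n_channels)))

# PLS: calibration model mapping spectra -> composition.
pls = PLSRegression(n_components=3).fit(spectra, composition)
print("PLS R^2:", round(pls.score(spectra, composition), 3))

# PCA: unsupervised projection in which sample classes can be inspected
# (SIMCA builds one such PCA model per rock class).
scores = PCA(n_components=2).fit_transform(spectra)
print("PCA scores shape:", scores.shape)
```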

  9. Large scale applicability of a Fully Adaptive Non-Intrusive Spectral Projection technique: Sensitivity and uncertainty analysis of a transient

    International Nuclear Information System (INIS)

    Perkó, Zoltán; Lathouwers, Danny; Kloosterman, Jan Leen; Hagen, Tim van der

    2014-01-01

    Highlights: • Grid and basis adaptive Polynomial Chaos techniques are presented for S and U analysis. • Dimensionality reduction and incremental polynomial order reduce computational costs. • An unprotected loss of flow transient is investigated in a Gas Cooled Fast Reactor. • S and U analysis is performed with MC and adaptive PC methods, for 42 input parameters. • PC accurately estimates means, variances, PDFs, sensitivities and uncertainties. - Abstract: Since the early years of reactor physics, the most prominent sensitivity and uncertainty (S and U) analysis methods in the nuclear community have been adjoint based techniques. While these are very effective for pure neutronics problems due to the linearity of the transport equation, they become complicated when coupled non-linear systems are involved. With the continuous increase in computational power, such complicated multi-physics problems are becoming progressively tractable; hence affordable and easily applicable S and U analysis tools have to be developed in parallel. For reactor physics problems for which adjoint methods are prohibitive, Polynomial Chaos (PC) techniques offer an attractive alternative to traditional random sampling based approaches. At TU Delft such PC methods have been studied for a number of years, and this paper presents a large scale application of our Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm for performing the sensitivity and uncertainty analysis of a Gas Cooled Fast Reactor (GFR) Unprotected Loss Of Flow (ULOF) transient. The transient was simulated using the Cathare 2 code system and a fully detailed model of the GFR2400 reactor design that was investigated in the European FP7 GoFastR project. Several sources of uncertainty were taken into account, amounting to an unusually high number of stochastic input parameters (42), and numerous output quantities were investigated. The results show consistently good performance of the applied adaptive PC
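
    A one-dimensional toy version of non-intrusive spectral projection (not the FANISP algorithm itself, which is grid- and basis-adaptive over many dimensions): Polynomial Chaos coefficients of a model response are computed by Gauss-Hermite quadrature, and the output mean and variance follow directly from the coefficients. The model function is an assumption standing in for an expensive code run:

```python
# 1D Polynomial Chaos expansion of f(xi), xi ~ N(0,1), by non-intrusive
# spectral projection with Gauss-Hermite quadrature.
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

def model(xi):
    # Stand-in for an expensive simulation (e.g. a transient run).
    return np.exp(0.3 * xi) + 0.1 * xi**2

order, n_quad = 6, 20
nodes, weights = hermegauss(n_quad)    # weight function exp(-x^2/2)
weights = weights / sqrt(2.0 * pi)     # normalize to the N(0,1) density

# c_k = E[f(xi) He_k(xi)] / k!, He_k: probabilists' Hermite polynomials.
coeffs = []
for k in range(order + 1):
    basis_k = hermeval(nodes, [0] * k + [1])
    coeffs.append(np.sum(weights * model(nodes) * basis_k) / factorial(k))
coeffs = np.array(coeffs)

# Mean and variance follow directly from the PC coefficients.
mean = coeffs[0]
var = sum(factorial(k) * coeffs[k]**2 for k in range(1, order + 1))
print(f"PC mean = {mean:.4f}, PC std = {np.sqrt(var):.4f}")
```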

  10. Studying The Contamination Status And The Sources Of Nitrogen Compounds In Groundwater In Ho Chi Minh City Area Using The Isotope Hydrology Techniques

    International Nuclear Information System (INIS)

    Nguyen Kien Chinh; Le Danh Chuan; Nguyen Van Nhien; Huynh Long; Tran Bich Lien; Luong Thu Tra

    2013-01-01

    Data on the nitrate, ammonium and total nitrogen concentrations of 100 groundwater samples collected from the three main aquifers show that, although nitrate concentrations are still below the authorized limit for groundwater, the concentration, and especially the distribution, of nitrate in the shallow (Pleistocene) aquifer shows an increasing pollution trend, while the ammonium and total nitrogen contents already exceed their authorized limits. Groundwater in the deeper aquifers (Upper and Lower Pliocene) is less polluted by nitrogen compounds. The isotopic compositions δ¹⁵N and δ¹⁸O of nitrate in the collected samples, combined with other environmental isotope data such as δ²H and δ¹⁸O of water and the natural radioactive isotopes ³H and ¹⁴C, show that nitrate in Pleistocene groundwater derives both from geogenic sources, such as organic matter buried in the aquifer soil layers, and from anthropogenic sources, such as fertilizers, manure and septic wastes, with the anthropogenic sources dominant. The isotopic data also establish a geogenic origin for the nitrate in the deeper aquifers. Tracer studies of the infiltration rate and infiltration depth of fertilizers and water in the legume-growing zone of the study area show that water, and with it fertilizers, can infiltrate into shallow groundwater. These results demonstrate the need for better management of fertilizer use for cultivation in the study area and for advanced cultivation practices that minimize the amount of fertilizer applied, as well as for stronger waste management and treatment throughout the study area, especially in zones where rainwater recharges shallow groundwater, such as Cu Chi, Hoc Mon and the inner city districts. (author)

  11. Open Source Analysis in Support to Nonproliferation Monitoring and Verification Activities: Using the New Media to Derive Unknown New Information

    International Nuclear Information System (INIS)

    Pabian, F.; Renda, G.; Jungwirth, R.; Kim, L.; Wolfart, E.; Cojazzi, G.G.M.; )

    2015-01-01

    This paper will describe evolving techniques that leverage freely available open source social media venues, sometimes referred to as the "New Media", together with geospatial tools and commercial satellite imagery (with its ever improving spatial, spectral, and temporal resolutions), to expand the existing nuclear non-proliferation knowledge base, by way of a review of some recent exemplar cases. The application of such techniques can enhance more general data mining, as those techniques can be tailored more directly to IAEA Safeguards monitoring and other non-proliferation verification activities, improving the possibility of remotely detecting undeclared nuclear related facilities and/or activities. As part of what might be called the new "Societal Verification" regime, these techniques have enlisted either the passive or active involvement of interested parties (NGOs, academics, and even hobbyists) using open sources and collaboration networks together with the previously highlighted geospatial visualization tools and techniques. This paper will show how significant, and unprecedented, new information discoveries have already been made (and published in open source) in the last four years, i.e., since the last IAEA Safeguards Symposium. With respect to the possibility of soliciting active participation (e.g., "crowd-sourcing") via social media, one can envision scenarios (one example from open source will be provided) whereby a previously unknown nuclear related facility could be identified or located through the online posting of reports, line drawings, and/or ground photographs. Nonetheless, these techniques should not be viewed as a panacea, as examples of both deception and human error will also be provided. This paper will highlight the use of these remote-means-of-discovery techniques, and how they have shed entirely new light on important nuclear non-proliferation relevant issues in

  12. Microlens Array Laser Transverse Shaping Technique for Photoemission Electron Source

    Energy Technology Data Exchange (ETDEWEB)

    Halavanau, A. [Northern Illinois Univ., DeKalb, IL (United States); Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Ha, G. [Argonne National Lab. (ANL), Argonne, IL (United States); Pohang Univ. of Science and Technology (POSTECH) (Korea, Republic of); Qiang, G. [Argonne National Lab. (ANL), Argonne, IL (United States); Tsinghua Univ., Beijing (China); Gai, W. [Argonne National Lab. (ANL), Argonne, IL (United States); Power, J. [Argonne National Lab. (ANL), Argonne, IL (United States); Piot, P. [Northern Illinois Univ., DeKalb, IL (United States); Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Wisniewski, E. [Argonne National Lab. (ANL), Argonne, IL (United States); Edstrom, D. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Ruan, J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Santucci, J. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-09-06

    A common issue encountered in photoemission electron sources used in electron accelerators is distortion of the laser spot due to non-ideal conditions at all stages of the amplification. Such a laser spot at the cathode may produce asymmetric charged beams that will result in degradation of the beam quality due to space charge at early stages of acceleration, and may fail to optimally utilize the cathode surface. In this note we study the possibility of using microlens arrays to dramatically improve the transverse uniformity of the drive laser pulse on UV photocathodes at both the Fermilab Accelerator Science & Technology (FAST) facility and the Argonne Wakefield Accelerator (AWA). In particular, we discuss the experimental characterization of the homogeneity and of the periodic pattern formation at the photocathode. Finally, we compare the experimental results with paraxial analysis, ray tracing and wavefront propagation software.

  13. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs
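
    An illustrative sketch of the edge-profiling measurement described above: given a binary image of a metallographic cross section in which corroded pixels are flagged, the penetration-depth distribution can be read off column by column. The synthetic mask and pixel calibration are assumptions, not the authors' software:

```python
# Penetration-depth distribution from a binary cross-section image.
import numpy as np

rng = np.random.default_rng(2)
height, width = 200, 400          # pixels; rows run into the metal
um_per_pixel = 1.5                # assumed image calibration

# Synthetic corroded layer of varying depth along the surface.
depth_profile = (30 + 20 * np.sin(np.linspace(0, 6, width))
                 + rng.normal(0, 3, width)).astype(int)
mask = np.arange(height)[:, None] < depth_profile[None, :]

# Depth of attack under each surface point = corroded pixels per column.
depths_um = mask.sum(axis=0) * um_per_pixel

print(f"mean depth   : {depths_um.mean():6.1f} um")
print(f"max depth    : {depths_um.max():6.1f} um")
print(f"area corroded: {100 * mask.mean():5.1f} % of section")
```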

  14. Ion beam analysis techniques for the elemental fingerprinting of fine particle smoke from vegetation burning in NSW

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator based ion beam analysis (IBA) techniques, including PIXE, PIGME, RBS and PESA, have been used to analyse the elemental composition of airborne particles over a 60,000 square kilometre area covering Wollongong, Sydney and Newcastle. These IBA techniques provide concentrations for over 20 different elements from hydrogen to lead, including H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Zn, Br and Pb. The four ion beam techniques are performed simultaneously on the 3MV Van de Graaff accelerator at ANSTO and have been described in detail elsewhere. They are sufficiently sensitive to analyse many of these elements to levels around 10 ng/m³ or less in about five minutes of accelerator running time per filter. This is more than adequate for aerosol analyses, as most filters contain around 150 μg/cm² of material, corresponding to about 10 μg/m³ of fine particles in the atmosphere. For this work fine particles are those with diameters less than 2.5 μm. Fine particle data have been collected twice a week and analysed for each of the above elements by ANSTO since 1991 at more than 25 different sites throughout NSW. This large dataset allows us not only to determine the composition of fine particles and look for signature elements for particular sources, but also to use multivariate statistics to define elemental source fingerprints and then determine the percentage contributions of these fingerprints to the total fine particle mass in the atmosphere. This paper describes the application of these techniques to the study of domestic wood fires and vegetation burning in NSW over a two-year period from 1992-93. It also presents, for the first time, fine particle data related to the January 1994 bushfires in NSW. 6 refs., 1 tab., 5 figs.
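
    A hedged sketch of the multivariate step the abstract describes: factoring a samples-by-elements concentration matrix into elemental source fingerprints and per-sample source contributions. Non-negative matrix factorization stands in here for the (unspecified) multivariate statistics; the element list and synthetic sources are assumptions:

```python
# Deriving elemental source fingerprints and contributions with NMF.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(3)
elements = ["H", "Na", "Si", "S", "K", "Fe", "Zn", "Pb"]
n_samples, n_sources = 300, 2

# Synthetic data: a smoke-like source (K, S rich) and a soil-like
# source (Si, Fe rich), mixed in varying proportions plus noise.
fingerprints = np.array([[8, 1, 1, 6, 9, 1, 2, 1],    # "smoke"
                         [1, 2, 9, 1, 2, 8, 1, 2]], float)
contributions = rng.uniform(0, 5, size=(n_samples, n_sources))
X = (contributions @ fingerprints
     + rng.uniform(0, 0.2, (n_samples, len(elements))))

model = NMF(n_components=n_sources, init="nndsvda", random_state=0)
G = model.fit_transform(X)    # source contribution per sample
F = model.components_         # elemental fingerprint per source

for i, row in enumerate(F):
    top = [elements[j] for j in np.argsort(row)[::-1][:3]]
    print(f"source {i}: strongest elements {top}")
```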

  15. Ion beam analysis techniques for the elemental fingerprinting of fine particle smoke from vegetation burning in NSW

    International Nuclear Information System (INIS)

    Cohen, D.

    1996-01-01

    Accelerator based ion beam analysis (IBA) techniques, including PIXE, PIGME, RBS and PESA, have been used to analyse the elemental composition of airborne particles over a 60,000 square kilometre area covering Wollongong, Sydney and Newcastle. These IBA techniques provide concentrations for over 20 different elements from hydrogen to lead, including H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Zn, Br and Pb. The four ion beam techniques are performed simultaneously on the 3MV Van de Graaff accelerator at ANSTO and have been described in detail elsewhere. They are sufficiently sensitive to analyse many of these elements to levels around 10 ng/m³ or less in about five minutes of accelerator running time per filter. This is more than adequate for aerosol analyses, as most filters contain around 150 μg/cm² of material, corresponding to about 10 μg/m³ of fine particles in the atmosphere. For this work fine particles are those with diameters less than 2.5 μm. Fine particle data have been collected twice a week and analysed for each of the above elements by ANSTO since 1991 at more than 25 different sites throughout NSW. This large dataset allows us not only to determine the composition of fine particles and look for signature elements for particular sources, but also to use multivariate statistics to define elemental source fingerprints and then determine the percentage contributions of these fingerprints to the total fine particle mass in the atmosphere. This paper describes the application of these techniques to the study of domestic wood fires and vegetation burning in NSW over a two-year period from 1992-93. It also presents, for the first time, fine particle data related to the January 1994 bushfires in NSW. 6 refs., 1 tab., 5 figs.

  16. Ion beam analysis techniques for the elemental fingerprinting of fine particle smoke from vegetation burning in NSW

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    Accelerator based ion beam analysis (IBA) techniques, including PIXE, PIGME, RBS and PESA, have been used to analyse the elemental composition of airborne particles over a 60,000 square kilometre area covering Wollongong, Sydney and Newcastle. These IBA techniques provide concentrations for over 20 different elements from hydrogen to lead, including H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Zn, Br and Pb. The four ion beam techniques are performed simultaneously on the 3MV Van de Graaff accelerator at ANSTO and have been described in detail elsewhere. They are sufficiently sensitive to analyse many of these elements to levels around 10 ng/m³ or less in about five minutes of accelerator running time per filter. This is more than adequate for aerosol analyses, as most filters contain around 150 μg/cm² of material, corresponding to about 10 μg/m³ of fine particles in the atmosphere. For this work fine particles are those with diameters less than 2.5 μm. Fine particle data have been collected twice a week and analysed for each of the above elements by ANSTO since 1991 at more than 25 different sites throughout NSW. This large dataset allows us not only to determine the composition of fine particles and look for signature elements for particular sources, but also to use multivariate statistics to define elemental source fingerprints and then determine the percentage contributions of these fingerprints to the total fine particle mass in the atmosphere. This paper describes the application of these techniques to the study of domestic wood fires and vegetation burning in NSW over a two-year period from 1992-93. It also presents, for the first time, fine particle data related to the January 1994 bushfires in NSW. 6 refs., 1 tab., 5 figs.

  17. Blind source separation of ex-vivo aorta tissue multispectral images.

    Science.gov (United States)

    Galeano, July; Perez, Sandra; Montoya, Yonatan; Botina, Deivid; Garzón, Johnson

    2015-05-01

    Blind Source Separation (BSS) methods aim to decompose a given signal into its main components, or source signals. These techniques have been widely used in the literature for the analysis of biomedical images, in order to extract the main components of an organ or tissue under study; the analysis of skin images to extract melanin and hemoglobin is one example of the use of BSS. This paper presents a proof of concept of source separation applied to ex-vivo aorta tissue multispectral images. The images are acquired with an interference filter-based imaging system and processed by means of two algorithms: Independent Component Analysis and Non-negative Matrix Factorization. In both cases it is possible to obtain maps that quantify the concentration of the main chromophores present in aortic tissue, and the algorithms also recover the spectral absorbance of the main tissue components. These spectral signatures were compared against theoretical ones using correlation coefficients, which gave values close to 0.9, a good indicator of the method's performance; the correlation coefficients also allow each concentration map to be assigned to the corresponding chromophore. The results suggest that multi/hyperspectral systems, together with image processing techniques, are a potential tool for the analysis of cardiovascular tissue.
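
    A minimal sketch of the NMF unmixing step on a synthetic multispectral cube; the band count, image size and chromophore spectra are assumptions. Pixels become rows, bands become columns, and the two factors give chromophore spectra and concentration maps:

```python
# Unmixing a multispectral image cube with non-negative matrix
# factorization: cube = concentration maps x chromophore spectra.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)
h, w, bands, n_chromophores = 64, 64, 10, 2

# Synthetic absorbance cube built from two chromophore spectra + noise.
spectra = np.abs(rng.normal(size=(n_chromophores, bands)))
maps = np.abs(rng.normal(size=(h * w, n_chromophores)))
cube = maps @ spectra + 0.01 * rng.uniform(size=(h * w, bands))

model = NMF(n_components=n_chromophores, random_state=0, max_iter=500)
concentration = model.fit_transform(cube)   # (pixels, chromophores)
recovered_spectra = model.components_       # (chromophores, bands)

# Reshape pixel loadings back into per-chromophore concentration maps.
concentration_maps = concentration.reshape(h, w, n_chromophores)
print(concentration_maps.shape, recovered_spectra.shape)
```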

  18. Performance evaluation using bootstrapping DEA techniques: Evidence from industry ratio analysis

    OpenAIRE

    Halkos, George; Tzeremes, Nickolaos

    2010-01-01

    In the Data Envelopment Analysis (DEA) context, financial data/ratios have been used in order to produce a unified measure of performance. However, several scholars have indicated that the inclusion of financial ratios creates biased efficiency estimates, with implications for the performance evaluation of firms and industries. There have been several DEA formulations and techniques dealing with this problem, including sensitivity analysis, Prior-Ratio-Analysis and DEA/output–input ratio analysis ...

  19. Underdetermined Blind Audio Source Separation Using Modal Decomposition

    Directory of Open Access Journals (Sweden)

    Abdeldjalil Aïssa-El-Bey

    2007-03-01

    This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Indeed, audio signals and, in particular, musical signals can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source using vector clustering). For the signal analysis, two existing algorithms are considered and compared: namely, the EMD (empirical mode decomposition) algorithm and a parametric estimation algorithm using the ESPRIT technique. A major advantage of the proposed method resides in its validity for both instantaneous and convolutive mixtures and in its ability to separate more sources than sensors. Simulation results are given to compare and assess the performance of the proposed algorithms.
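
    A toy sketch of the parametric analysis step mentioned in the abstract: ESPRIT estimation of damped-sinusoid (modal) components from a mixture. Only the estimation step is shown; grouping the extracted components by source (the vector-clustering synthesis step) would follow. The signal, sampling rate and model order are assumptions:

```python
# ESPRIT estimation of damped sinusoids via the shift-invariance of the
# signal subspace of a Hankel data matrix.
import numpy as np

def esprit_poles(x, n_modes, L=60):
    """Estimate the poles z_k of x[n] = sum_k a_k z_k**n via ESPRIT."""
    N = len(x)
    # Hankel data matrix whose column space spans the signal subspace.
    H = np.stack([x[i:i + L] for i in range(N - L)], axis=1)
    U, _, _ = np.linalg.svd(H, full_matrices=False)
    Us = U[:, :n_modes]
    # Shift invariance: Us[1:] ~= Us[:-1] @ Phi; poles = eig(Phi).
    Phi, *_ = np.linalg.lstsq(Us[:-1], Us[1:], rcond=None)
    return np.linalg.eigvals(Phi)

fs = 8000
t = np.arange(2000) / fs
x = (np.exp(-3 * t) * np.sin(2 * np.pi * 440 * t)
     + 0.7 * np.exp(-5 * t) * np.sin(2 * np.pi * 660 * t))
x += 0.01 * np.random.randn(len(x))

poles = esprit_poles(x, n_modes=4)          # 2 real sinusoids = 4 poles
freqs = np.angle(poles) * fs / (2 * np.pi)
print(np.sort(np.round(freqs[freqs > 0])))  # approximately [440, 660]
```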

  20. Underdetermined Blind Audio Source Separation Using Modal Decomposition

    Directory of Open Access Journals (Sweden)

    Aïssa-El-Bey Abdeldjalil

    2007-01-01

    This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Indeed, audio signals and, in particular, musical signals can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source using vector clustering). For the signal analysis, two existing algorithms are considered and compared: namely, the EMD (empirical mode decomposition) algorithm and a parametric estimation algorithm using the ESPRIT technique. A major advantage of the proposed method resides in its validity for both instantaneous and convolutive mixtures and in its ability to separate more sources than sensors. Simulation results are given to compare and assess the performance of the proposed algorithms.