WorldWideScience

Sample records for analysis techniques progress

  1. Progress of neutron induced prompt gamma analysis technique in 1988~2003

    Institute of Scientific and Technical Information of China (English)

    JING Shi-Wei; LIU Yu-Ren; CHI Yan-Tao; TIAN Yu-Bing; CAO Xi-Zheng; ZHAO Xin-Hui; REN Wan-Bin; LIU Lin-Mao

    2004-01-01

This paper describes new developments in neutron induced prompt gamma-ray analysis (NIPGA) technology in 1988~2003. The pulsed fast-thermal neutron activation analysis method, which utilizes the inelastic reaction and capture reaction jointly, was employed to measure elemental contents more efficiently. The lifetime of the neutron generator exceeded 10,000 h, and the performance of the detector and MCA reached a high level. At the same time, the Monte Carlo library least-squares method was used to solve the nonlinearity problem in NIPGA.
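The library least-squares idea mentioned above can be sketched in a few lines: a measured gamma-ray spectrum is modeled as a weighted sum of per-element library spectra, and the weights (proportional to elemental contents) are found by least squares. This is only an illustration of the linear version of the idea, not the authors' MCLLS code; the library spectra and weights below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
channels = np.arange(512)

def peak(center, width, height):
    # Gaussian photopeak on a 512-channel spectrum (illustrative shape)
    return height * np.exp(-0.5 * ((channels - center) / width) ** 2)

# Hypothetical library spectra for two elements
lib_fe = peak(120, 4, 1.0) + peak(350, 5, 0.6)
lib_si = peak(200, 4, 1.0) + peak(410, 5, 0.4)
library = np.column_stack([lib_fe, lib_si])

# Synthetic "measured" spectrum: known weights plus counting noise
true_weights = np.array([3.0, 1.5])
measured = library @ true_weights + rng.normal(0, 0.02, channels.size)

# Solve for the elemental weights by linear least squares
weights, *_ = np.linalg.lstsq(library, measured, rcond=None)
print(weights)
```

The recovered weights are close to the true values; the Monte Carlo variant referenced in the abstract replaces the fixed library spectra with simulated ones and iterates to handle nonlinearity.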

  2. Progress in automation, robotics and measuring techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2015-01-01

This book presents recent progress in control, automation, robotics, and measuring techniques. It includes contributions from top experts in these fields, focused on both theory and industrial practice. The chapters present a deep analysis of a specific technical problem, in general followed by a numerical analysis, simulation, and the results of an implementation for the solution of a real-world problem. The theoretical results, practical solutions and guidelines presented will be useful both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  3. Progress in application of CFD techniques

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

Computational Fluid Dynamics (CFD) is an important branch of fluid mechanics and will continue to play a major role in the design of aerospace vehicles, the exploration of new-concept vehicles, and new aerodynamic technology. This paper presents the recent progress of CFD at CARDC from the point of view of engineering applications, including software integration, grid techniques, convergence acceleration, unsteady flow computation, etc., and also gives some examples of the engineering application of CFD at CARDC.

  4. Progress in diagnostic techniques for sc cavities

    International Nuclear Information System (INIS)

While routinely achieved performance characteristics of superconducting cavities have now reached a level which makes them useful in large-scale applications, achieving this level has come only through the knowledge gained by systematic studies of performance-limiting phenomena. Despite the very real progress that has been made, the routine performance of superconducting cavities still falls far short of both the theoretical expectations and the performance of a few exceptional examples. It is the task of systematically applied diagnostic techniques to reveal additional information concerning the response of superconducting surfaces to applied RF fields. Here recent developments in diagnostic techniques are discussed. 18 references, 12 figures

  5. Recent progress in analysis for fission products

    International Nuclear Information System (INIS)

A great deal of progress was achieved in the analysis of fission products during the 1980s. In situ analysis of fission products and direct assay of radiowaste packages have been developed to meet the needs of radiowaste treatment and disposal. Activation analysis and non-radiometric methods have been used to measure long-lived fission product nuclides; their sensitivity is superior to that of traditional radiochemical analysis. Some new work on the Cherenkov counting technique and rapid radiochemical analysis has been published. The progress is reviewed from the point of view of methodology

  6. Granulation techniques and technologies: recent progresses.

    Science.gov (United States)

    Shanmugam, Srinivasan

    2015-01-01

Granulation, the process of particle enlargement by agglomeration, is one of the most significant unit operations in the production of pharmaceutical dosage forms, mostly tablets and capsules. The granulation process transforms fine powders into free-flowing, dust-free granules that are easy to compress. Nevertheless, granulation poses numerous challenges due to the high quality requirements for the formed granules in terms of content uniformity and physicochemical properties such as granule size, bulk density, porosity, hardness, moisture, and compressibility, together with the physical and chemical stability of the drug. Granulation processes can be divided into two types: wet granulation, which utilizes a liquid in the process, and dry granulation, which requires no liquid. Selecting the type of process requires thorough knowledge of the physicochemical properties of the drug and excipients and of the required flow and release properties, to name a few. Among currently available technologies, spray drying, roller compaction, high-shear mixing, and fluid-bed granulation are worthy of note. Like any other scientific field, pharmaceutical granulation technology also continues to change, and the arrival of novel and innovative technologies is inevitable. This review focuses on recent progress in granulation techniques and technologies such as pneumatic dry granulation, reverse wet granulation, steam granulation, moisture-activated dry granulation, thermal adhesion granulation, freeze granulation, and foamed binder or foam granulation. It gives an overview of these developments, with a short description of each along with its significance and limitations. PMID:25901297

  7. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource-management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper is a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, performance analysis, value chain analysis and functional analysis. Implications: Basically such

  8. New Techniques and Progress in Epilepsy Surgery.

    Science.gov (United States)

    McGovern, Robert A; Banks, Garrett P; McKhann, Guy M

    2016-07-01

While open surgical resection for medically refractory epilepsy remains the gold standard in current neurosurgical practice, modern techniques have targeted areas for improvement over open surgical resection. This review focuses on how a variety of these new techniques attempt to address those limitations. Stereotactic electroencephalography offers the possibility of localizing deep epileptic foci, improving upon subdural grid placement, which limits localization to neocortical regions. Laser interstitial thermal therapy (LITT) and stereotactic radiosurgery can minimally or non-invasively ablate specific regions of interest, with near real-time feedback in the case of LITT. Finally, neurostimulation offers the possibility of seizure reduction without the need to ablate or resect any tissue. However, because these techniques are still being evaluated in current practice, there are no evidence-based guidelines for their use, and more research is required to fully evaluate their proper role in the current management of medically refractory epilepsy. PMID:27181271

  9. The Progress on Laser Surface Modification Techniques of Titanium Alloy

    Institute of Scientific and Technical Information of China (English)

LIANG Cheng; PAN Lin; AI Ding-fei; TAO Xi-qi; XIA Chun-huai; SONG Yan

    2004-01-01

Titanium alloys are widely used in aviation, national defence, automobiles, medicine and other fields because of their advantages of low density, corrosion resistance, fatigue resistance, etc. However, because titanium alloys have a high friction coefficient, weak wear resistance, poor high-temperature oxidation resistance and low biocompatibility, their applications are restricted. Laser surface modification techniques can significantly improve the surface properties of titanium alloys. This paper reviews the progress of laser surface modification techniques for titanium alloys.

  10. Progress of Novel Inkjet Technique for Inorganic Materials Preparation

    OpenAIRE

WANG Zhuo; LI Yong-Xiang; YANG Qun-Bao

    2009-01-01

In the last few years, the inkjet technique has attracted great attention for its use as a form-free fabrication method for building 3-D structures of inorganic materials and combinatorial material chips. This article reviews the progress of inorganic materials prepared by the inkjet technique, including the development of inkjet printing systems, the properties of different inks, applications, challenges, and hot topics of recent research. The fabrication of conducting circuits and setup of an electromagne...

  11. DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Food scientists use standards and calibrations to relate the concentration of a compound of interest to the instrumental response. The techniques used include classical, single point, and inverse calibrations, as well as standard addition and internal standards. Several fundamental criteria -- sel...
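Two of the calibration approaches named above, classical calibration and standard addition, can be sketched numerically. All concentrations and responses below are invented for illustration; this is not data from the record.

```python
import numpy as np

# Classical calibration: fit response = a*conc + b from standards, then
# invert the line to estimate an unknown concentration from its response.
conc_std = np.array([0.0, 1.0, 2.0, 4.0])      # standard concentrations
resp_std = np.array([0.05, 1.02, 2.01, 3.98])  # instrument responses
a, b = np.polyfit(conc_std, resp_std, 1)
unknown_resp = 1.55
classical_est = (unknown_resp - b) / a

# Standard addition: spike the sample with known amounts of analyte and
# extrapolate the fitted line back to zero response; the magnitude of the
# x-intercept is the original sample concentration.
added = np.array([0.0, 1.0, 2.0, 3.0])         # spiked concentrations
resp_add = np.array([1.50, 2.49, 3.52, 4.51])  # responses of spiked sample
m, c = np.polyfit(added, resp_add, 1)
addition_est = c / m                           # x-intercept magnitude

print(round(classical_est, 2), round(addition_est, 2))
```

Standard addition trades extra measurements for immunity to matrix effects, which is why it appears alongside classical and internal-standard calibration in the record above.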

12. Research Progress of Immunological Techniques in the Analysis of Fingerprints' Contents

    Institute of Scientific and Technical Information of China (English)

ZHANG Ting; YANG Rui-qin

    2015-01-01

ABSTRACT: Pores abound in the ridges of finger skin, through which sweat is excreted and deposited on the surface of the skin. Sometimes sweat contains special substances, including drugs and their metabolites; these substances in fingerprints can indicate the possibility that an individual has ingested drugs. In recent years, the importance of analyzing excreted and deposited compounds in fingerprints has drawn increasing attention because these compounds may provide significant information about an individual and his or her past behavior. Determination of fingerprint residues is helpful for criminal investigation and evidence identification. Many techniques, such as FT-IR spectroscopy, infrared spectral imaging, Raman spectroscopy, Raman spectral imaging, mass spectrometry, gas chromatography-mass spectrometry, high performance liquid chromatography, liquid chromatography-mass spectrometry, and immunological approaches, have been widely used. Among these, the immunological approach can not only visualize latent fingerprints but also deliver more accurate and sensitive biochemical information from them. In this article, the recent progress and application of immunological methods for developing fingerprints are presented, focusing on the determination of amino acids and ingested drugs in fingerprints, as well as on aged latent fingerprints.%Fingerprints are covered with rows of sweat pores that secrete special chemical components, such as drugs and their metabolites, which reflect the habits and metabolism of the individual. In case investigation and forensic identification, the chemical components in fingerprints can be analyzed to profile the past behavior of the person who left them. These components can be detected by a variety of analytical techniques, mainly including infrared spectroscopy and infrared spectral imaging, Raman spectroscopy and Raman imaging, mass spectrometry, gas chromatography, liquid chromatography, hyphenated chromatography-mass spectrometry, and immunoassay. Among these, the immunoassay technique prepares antibodies and combines them with nanomaterials; this functionalized immuno

  13. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  14. Progress in diagnostic techniques for SC [superconducting] cavities

    International Nuclear Information System (INIS)

Despite the very real progress that has been made, the routine performance of superconducting cavities still falls far short of both the theoretical expectations and the performance of a few exceptional examples. It is the task of systematically applied diagnostic techniques to reveal additional information concerning the response of superconducting surfaces to applied RF fields. In this paper we direct our attention to recent developments in diagnostic techniques, such as thermometry in superfluid helium and scanning laser acoustic microscopy. 18 refs., 12 figs

  15. Digital Fourier analysis advanced techniques

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to advanced digital Fourier analysis for advanced undergraduate and graduate students. Assuming knowledge of the Fast Fourier Transform, this book covers advanced topics including the Hilbert transform, cepstrum analysis, and the two-dimensional Fourier transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Advanced Techniques" includes practice problems and thorough Appendices. As a central feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. The applet source code in Visual Basic is provided online, enabling advanced students to tweak and change the programs for more sophisticated results. A complete, intuitive guide, "Digital Fourier Analysis - Advanced Techniques" is an essential reference for students in science and engineering.
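Cepstrum analysis, one of the advanced topics the book covers, is easy to sketch: taking the inverse transform of the log-magnitude spectrum turns a harmonic comb into a peak at the quefrency of the signal's fundamental period. The signal below is synthetic and chosen only to illustrate the idea.

```python
import numpy as np

fs = 1000                       # sample rate, Hz (illustrative)
t = np.arange(0, 1.0, 1 / fs)
period = 0.01                   # 100 Hz fundamental
# Harmonic-rich signal: fundamental plus three harmonics
signal = sum(np.sin(2 * np.pi * (k / period) * t) / k for k in range(1, 5))

# Real cepstrum: inverse FFT of the log-magnitude spectrum
spectrum = np.fft.rfft(signal)
log_mag = np.log(np.abs(spectrum) + 1e-12)
cepstrum = np.fft.irfft(log_mag)

quefrency = np.arange(cepstrum.size) / fs
# Search away from quefrency zero, where trivial energy concentrates
idx = np.argmax(cepstrum[5:200]) + 5
print(quefrency[idx])  # a multiple of the 0.01 s fundamental period
```

The peak lands at (a rahmonic of) the 0.01 s period, which is how cepstral pitch detection separates the excitation period from the spectral envelope.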

  16. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiation, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content in shredded sugar cane. (U.K.)
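Neutron activation analysis rests on the activation equation: the induced activity is proportional to the number of target atoms, so measuring the activity yields the element mass. A minimal sketch, with all numerical values (flux, irradiation time, iron as the example element) invented for illustration:

```python
import math

# A = N * sigma * phi * (1 - exp(-lambda * t_irr))
N_A = 6.022e23                   # Avogadro's number, 1/mol
mass_g = 1.0                     # grams of the target element
molar_mass = 55.85               # g/mol (iron, as an example)
abundance = 0.058                # abundance of the activated isotope (Fe-58)
sigma_cm2 = 1.3e-24              # thermal capture cross-section, ~1.3 barn
phi = 1.0e12                     # thermal neutron flux, n/cm^2/s
half_life_s = 44.5 * 24 * 3600   # Fe-59 half-life, ~44.5 days
lam = math.log(2) / half_life_s
t_irr = 3600.0                   # 1 h irradiation

n_atoms = mass_g / molar_mass * N_A * abundance
activity_bq = n_atoms * sigma_cm2 * phi * (1.0 - math.exp(-lam * t_irr))
print(f"{activity_bq:.3e} Bq")
```

In practice the measured γ-line count rate is compared against a standard irradiated under the same conditions, which cancels the flux and efficiency terms.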

  17. [Research progress on techniques for artificial propagation of corals].

    Science.gov (United States)

    Wang, Shu-hong; Hong, Wen-ting; Chen, Ji-xin; Chen, Yun; Wang, Yi-lei; Zhang, Zi-ping; Weng, Zhao-hong; Xie, Yang-jie

    2015-09-01

Natural coral reef resources are degrading rapidly because of climate change, environmental pollution and the exploitation of aquarium species. Artificial propagation is an effective way to reduce wild harvesting, facilitate reef restoration, and preserve biodiversity. This paper reviews the techniques and research progress in coral artificial propagation. We compare the advantages and disadvantages of sexual and asexual reproduction, as well as of in situ and ex situ propagation. Moreover, we summarize the important roles of irradiation, flow rate, nutrients, feed and other factors in coral propagation within recirculating aquaculture systems (RAS). Lighting is the key to successful ex situ coral culture, and different species require different light intensities and spectra; artificial lighting in RAS, as well as power and maintenance costs, is therefore very important for ex situ coral aquaculture. In addition, corals are very sensitive to NH4+, NO3-, NO2- and phosphate in RAS, and many physical, chemical and biological methods are required to maintain low-nutrient conditions. Although RAS has progressed greatly in terms of lighting, flow rate and nutrient control, future studies should also focus on sexual reproduction, genetic modification and disease control. PMID:26785577

  18. Progress on acoustic techniques for LMFBR structural surveillance

    International Nuclear Information System (INIS)

Acoustic techniques are being developed to remotely monitor the incipient events of various failure modes. Topics have been selected from the development programme which are either of special importance or in which significant advances have been made recently. Ultrasonic inspection of stainless steel welds is difficult, and one alternative approach being explored is to identify manufacturing defects during fabrication by monitoring the welding processes. Preliminary measurements are described of the acoustic events recorded during deliberately defective welding tests in the laboratory, and some initial analysis using pattern recognition techniques is described. The assessment of structural failures using probability analysis has emphasised the potential value of continuous monitoring during operation, and this has led to the investigation of vibrational analysis and acoustic emission as monitoring techniques. Mechanical failure from fatigue may be anticipated from measurement of vibrational modes, and experience from PFR and from models has indicated the depth of detailed understanding required to achieve this. In the laboratory, a vessel with an artificial defect has been pressurised to failure. Detection of the weak stress-wave emissions was possible but difficult, and the prospects for on-line monitoring are discussed. Ultrasonic technology for providing images of components immersed in the opaque sodium of LMFBRs is being developed. Images are formed by physically scanning a target using transducers in a pulse-echo mode. Lead zirconate transducers have been developed which can be deployed during reactor shut-down. The first application will be to examine a limited area of the core of PFR. Handling the data from such an experiment involves developing methods for reading and storing the information from each ultrasonic echo. Such techniques have been tested in real time by simulation in a water model.
Methods of enhancing the images to be

  19. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
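The reduction of performance analysis to "well-known mathematical techniques" can be illustrated by the simplest case: fitting a runtime model t = a + b·(work/processes) to timing samples by ordinary least squares and extrapolating to an untested configuration. This is a sketch of the general approach, not the formulation of the cited report; the model form and timings are invented.

```python
import numpy as np

# Synthetic timing samples: (problem size, process count, runtime in seconds)
work = np.array([1e6, 1e6, 4e6, 4e6, 8e6, 8e6])
procs = np.array([4, 8, 4, 8, 8, 16])
runtime = np.array([0.26, 0.14, 1.02, 0.52, 1.01, 0.52])

# Design matrix for t = a + b * (work / procs); solve by least squares
X = np.column_stack([np.ones_like(work), work / procs])
(a, b), *_ = np.linalg.lstsq(X, runtime, rcond=None)

# Extrapolate: predict the runtime of the largest problem on 32 processes
predicted = a + b * (8e6 / 32)
print(a, b, predicted)
```

Real predictive models add terms for communication, memory pressure, and file-system load, but the fitting machinery stays the same.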

  20. Progress in realistic LOCA analysis

    International Nuclear Information System (INIS)

    In 1988 the USNRC revised the ECCS rule contained in Appendix K and Section 50.46 of 10 CFR Part 50, which governs the analysis of the Loss Of Coolant Accident (LOCA). The revised regulation allows the use of realistic computer models to calculate the loss of coolant accident. In addition, the new regulation allows the use of high probability estimates of peak cladding temperature (PCT), rather than upper bound estimates. Prior to this modification, the regulations were a prescriptive set of rules which defined what assumptions must be made about the plant initial conditions and how various physical processes should be modeled. The resulting analyses were highly conservative in their prediction of the performance of the ECCS, and placed tight constraints on core power distributions, ECCS set points and functional requirements, and surveillance and testing. These restrictions, if relaxed, will allow for additional economy, flexibility, and in some cases, improved reliability and safety as well. For example, additional economy and operating flexibility can be achieved by implementing several available core and fuel rod designs to increase fuel discharge burnup and reduce neutron flux on the reactor vessel. The benefits of application of best estimate methods to LOCA analyses have typically been associated with reductions in fuel costs, resulting from optimized fuel designs, or increased revenue from power upratings. Fuel cost savings are relatively easy to quantify, and have been estimated at several millions of dollars per cycle for an individual plant. Best estimate methods are also likely to contribute significantly to reductions in O and M costs, although these reductions are more difficult to quantify. Examples of O and M cost reductions are: 1) Delaying equipment replacement. With best estimate methods, LOCA is no longer a factor in limiting power levels for plants with high tube plugging levels or degraded safety injection systems. If other requirements for
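The "high probability estimate" idea behind best-estimate LOCA methods can be sketched as a Monte Carlo exercise: uncertain inputs are sampled, the peak cladding temperature (PCT) is computed for each sample, and a high percentile of the resulting distribution is reported instead of a stack of bounding assumptions. The response model and all numbers below are invented for illustration; this is not a licensed evaluation model.

```python
import random
import statistics

random.seed(1)

def pct_model(power_factor, gap_conductance, blowdown_heat_transfer):
    # Hypothetical linear response surface for PCT in degrees C
    return (900 + 250 * power_factor
            - 80 * gap_conductance
            - 120 * blowdown_heat_transfer)

# Sample the uncertain inputs around their best-estimate values
samples = [
    pct_model(random.gauss(1.0, 0.05),
              random.gauss(1.0, 0.1),
              random.gauss(1.0, 0.1))
    for _ in range(10000)
]
samples.sort()
pct_95 = samples[int(0.95 * len(samples))]  # simple 95th-percentile estimate
print(round(statistics.mean(samples)), round(pct_95))
```

The margin between the 95th percentile and the best-estimate mean is what the revised rule allows licensees to reclaim relative to the old prescriptive bounding calculation.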

  1. Recent progress in protein structure analysis by NMR spectroscopy

    International Nuclear Information System (INIS)

In recent years, many NMR methodologies have been developed to extend the molecular-weight limit of protein structure analysis. Sophisticated selective stable-isotope labeling techniques such as the SAIL method solved the problems of signal reduction due to increased correlation time and of spectral overlap. Residual dipolar coupling and paramagnetic relaxation enhancement made it possible to obtain long-range distance restraints for structure determination. NMR analysis of intrinsically disordered proteins revealed a novel molecular recognition mode of proteins called coupled folding and binding. NMR methods have also revealed intermediates in macromolecular binding processes. New data acquisition techniques such as projection spectroscopy and non-linear sampling, which introduced signal-processing techniques, were developed to reduce the data acquisition time and/or increase sensitivity. In this chapter, recent progress in protein structure analysis by NMR spectroscopy is summarized. (author)

  2. Key Techniques and Application Progress of Molecular Pharmacognosy

    Institute of Scientific and Technical Information of China (English)

    XIAO Xue-feng; HU Jing; XU Hai-yu; GAO Wen-yuan; ZHANG Tie-jun; LIU Chang-xiao

    2011-01-01

At the boundary between pharmacognosy and molecular biology, molecular pharmacognosy has developed as a new interdisciplinary field. This paper reviews the methods, applications, and prospects of molecular pharmacognosy. DNA markers are a class of genetic markers, and several molecular marker methods have been successfully used for genetic diversity identification and the development of new medicinal resources. Recombinant DNA technology provides a powerful tool that enables scientists to engineer DNA sequences. Gene chip techniques can be used in the determination of gene expression profiles, analysis of polymorphisms, construction of genomic libraries, mapping analysis, and sequencing by hybridization. Using the methods and theory of molecular biology and pharmacognosy, molecular pharmacognosy represents an extremely promising branch of pharmacognosy and focuses on the study of the systemic growth of medicinal plants, identification and evaluation of germplasm resources, plant metabolomics, and production of active compounds. Furthermore, great breakthroughs in molecular pharmacognosy can be anticipated in DNA fingerprint analysis, cultivar improvement, DNA identification, and a global DNA barcoding system in the future.

  3. Progress in radiation protection techniques for workers in the nuclear industry

    International Nuclear Information System (INIS)

    The increasingly stringent safety requirements of workers and the general public in the face of occupational and in particular nuclear risks call for continual improvements in radiation protection techniques. The Institute of Protection and Nuclear Safety (IPSN), especially the Technical Protection Services belonging to the Protection Department, and also the various radiation protection services of the French Atomic Energy Commission's nuclear centres and Electricite de France (EDF) are carrying out substantial research and development programmes on the subject. For this reason, IPSN organized a specialists' meeting to take stock of the efforts being made and to try to identify what steps seem most promising or should have priority at the national level. The authors summarize the presentations and discussions on three topics: (1) Progress in the analysis of the mechanism of exposure of workers; (2) Progress achieved from the radiation protection standpoint in the field of facility design and instrumentation; and (3) Application of the optimization principle

  4. Progress of tritium measurement techniques and future prospects

    International Nuclear Information System (INIS)

Research and development of technologies for the safe handling of high-level tritium are indispensable for the realization of a thermonuclear fusion reactor, and tritium measurement techniques play an important role in this subject. More than 35 years have been spent on studies in this field at the Hydrogen Isotope Research Center (HRC), University of Toyama. Nuclear fusion systems need new measurement techniques that work in the limited range of conditions with high tritium levels, as well as at the environmental level, because nearly pure tritium is used as fuel in the fusion system. Therefore, new measurement techniques have been investigated at HRC, and some of them have already played a certain role in research on tritium-material interactions, but they are not enough yet. Further studies on measurement techniques will be required to establish the ability to precisely control the concentration, amount, and/or distribution of tritium under various conditions. (author)

  5. The latest progress of fission track analysis

    International Nuclear Information System (INIS)

Fission track analysis, a new nuclear track technique based on fission track annealing in minerals, has been used successfully for oil and gas exploration. Western China is the main area for oil and gas exploration, and the oil and gas basins there experienced a much more complicated thermal history and higher paleotemperatures. In order to apply fission track analysis to these basins, the following work was carried out: 1. decomposition of the grain-age distribution of zircon fission tracks; 2. study of the thermal history of the Ordos basin using zircon fission track analysis; 3. a fission track study of the Qiangtang basin in Tibet

  6. Progressive Damage Analysis of Bonded Composite Joints

    Science.gov (United States)

    Leone, Frank A., Jr.; Girolamo, Donato; Davila, Carlos G.

    2012-01-01

The present work is related to the development and application of progressive damage modeling techniques to bonded joint technology. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented durable redundant joint. Both designs involve honeycomb sandwich structures with carbon/epoxy facesheets joined using adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage within a structure. For structures that include multiple material systems, such as the joint designs under consideration, the number of potential failure mechanisms that must be accounted for drastically increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, intraply matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The bonded joints were modeled using highly parametric, explicitly solved finite element models, with damage modeling implemented via custom user-written subroutines. Each ply was discretely meshed using three-dimensional solid elements. Layers of cohesive elements were included between each ply to account for the possibility of delaminations and were used to model the adhesive layers forming the joint. Good correlation with experimental results was achieved both in terms of load-displacement history and the predicted failure mechanism(s).

  7. Recent progress in the transition radiation detector techniques

    Science.gov (United States)

    Yuan, L. C. L.

    1973-01-01

A list of some of the major experimental achievements involving charged particles in the relativistic region is presented. With the emphasis mainly directed to the X-ray region, certain modes of application of transition radiation for the identification and separation of relativistic charged particles are discussed. Some recent developments in detection techniques and improvements in detector performance are presented. Experiments were also carried out to detect the dynamic radiation, but no evidence of such an effect was observed.

  8. Investigation progress of imaging techniques monitoring stem cell therapy

    International Nuclear Information System (INIS)

Recently, stem cell therapy has shown potential for clinical application in diabetes mellitus, cardiovascular diseases, malignant tumors and trauma. Efficient techniques for non-invasively monitoring stem cell transplants will accelerate the development of stem cell therapies. This paper briefly reviews the clinical practice of stem cells and the monitoring methods, including magnetic resonance and radionuclide imaging, which have been used in stem cell therapy. (authors)

  9. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

In this work, nuclear techniques such as Neutron Activation Analysis, PIXE, X-ray fluorescence analysis, Metallography, Uranium series dating, and Rutherford Backscattering are described for use in the analysis of archaeological specimens and materials. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also referenced. (Author)

  10. The progress in radiotherapy techniques and its clinical implications

    International Nuclear Information System (INIS)

    Three modern radiotherapy techniques were introduced into clinical practice at the onset of the 21st century - stereotactic radiation therapy (SRT), proton therapy and carbon-ion radiotherapy. Our paper summarizes the basic principles of physics, as well as the technical requirements and clinical indications for those techniques. SRT is applied for intracranial diseases (arteriovenous malformations, acoustic nerve neuromas, brain metastases, skull base tumors) and in such cases it is referred to as stereotactic radiosurgery (SRS). Techniques used during SRS include GammaKnife, CyberKnife and dedicated linacs. SRT can also be applied for extracranial disease (non-small cell lung cancer, lung metastases, spinal and perispinal tumors, primary liver tumors, breast cancer, pancreatic tumors, prostate cancer, head and neck tumors) and in such cases it is referred to as stereotactic body radiation therapy (SBRT). Eye melanomas, skull base and cervical spine chordomas and chondrosarcomas, as well as childhood neoplasms, are considered to be the classic indications for proton therapy. Clinical trials are currently conducted to investigate the usefulness of the proton beam in therapy of non-small cell lung cancer, prostate cancer, head and neck tumors, and primary liver and oesophageal cancer. Carbon-ion radiotherapy is presumed to be more advantageous than proton therapy because of its higher relative biological effectiveness (RBE) and the possibility of real-time control of the irradiated volume under PET visualization. The basic indications for carbon-ion therapy are salivary gland neoplasms, selected types of soft tissue and bone sarcomas, skull base chordomas and chondrosarcomas, paranasal sinus neoplasms, primary liver cancers and inoperable rectal adenocarcinoma recurrences. (authors)

  11. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    This state art report consists of four parts, production of micro-particles, analysis of boron, alpha tracking method and development of neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-paticles such as mechanical method, electrolysis method, chemical method, spray method were described in the first part. The second part contains sample treatment, separation and concentration, analytical method, and application of boron analysis. The third part contains characteristics of alpha track, track dectectors, pretreatment of sample, neutron irradiation, etching conditions for various detectors, observation of track on the detector, etc. The last part contains basic theory, neutron source, collimator, neutron shields, calibration of NIPS, and application of NIPS system

  12. Modern techniques of trend analysis and interpolation

    Directory of Open Access Journals (Sweden)

    L. TORELLI

    1975-05-01

    Full Text Available This article contains a schematic exposition of the theoretical framework on which recent techniques of trend analysis and interpolation rest. It is shown that such techniques consist in the joint application of Analysis of Variance and of Multivariate Distribution Analysis. The theory of Universal Kriging by G. Matheron is also discussed and reduced to the above theories.

  13. Analysis of dynamic conflicts by techniques of artificial intelligence

    OpenAIRE

    Shinar, Josef

    1989-01-01

    Dynamic conflicts exhibit differential game characteristics, and their analysis by any method which disregards this feature may be, by definition, futile. Unfortunately, realistic conflicts may have an intricate information structure and a complex hierarchy which do not fit in the classical differential game formulation. Moreover, in many cases even well formulated differential games are not solvable. In recent years great progress has been made in artificial intelligence techniques, put in...

  14. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  15. Progress in computer vision and image analysis

    CERN Document Server

    Bunke, Horst; Sánchez, Gemma; Otazu, Xavier

    2009-01-01

    This book is a collection of scientific papers published during the last five years, showing a broad spectrum of actual research topics and techniques used to solve challenging problems in the areas of computer vision and image analysis. The book will appeal to researchers, technicians and graduate students. Contents: An Appearance-Based Method for Parametric Video Registration (X Orriols et al.); Relevance of Multifractal Textures in Static Images (A Turiel); Potential Fields as an External

  16. Recent Progress in Synthesis Techniques of Microstrip Bandpass Filter

    Directory of Open Access Journals (Sweden)

    Navita Singh

    2012-03-01

    Full Text Available End-coupled resonator bandpass filters built in microstrip are investigated. The admittance inverter parameters of coupling gaps between resonant sections are deduced from experiment, and bandpass filter design rules are developed. This allows easy filter synthesis from “prototype” low-pass designs. Design techniques which were formerly employed in the realization of waveguide and coaxial filters have been applied in the synthesis of strip-line filters having “maximally-flat” and Tchebycheff response characteristics. In this paper, Tchebycheff response characteristics are considered for realizing the required circuit parameters in strip line, and we describe the conception and design of bandpass filters for the X-band and C-band at 10.7 GHz and 6.2 GHz respectively, using three-pole end-coupled microstrip filters designed for radar and GSO satellite applications and realized with capacitive resonators and stepped-impedance resonators. By extension, the RF/microwave applications can be referred to as communications and other uses that explore the frequency spectrum, parts of which are further divided into many frequency bands. The design and simulation are performed using the 3D full-wave electromagnetic simulator IE3D.
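
The Tchebycheff (Chebyshev) prototype element values underlying such designs follow a standard recurrence found in classic filter-synthesis texts; below is a minimal sketch, with the three-pole, 0.5 dB ripple case used only as an illustrative input (not a value taken from the paper):

```python
import math

def chebyshev_g_values(n, ripple_db):
    """Lowpass prototype element values g1..g(n+1) for a Chebyshev
    (Tchebycheff) response, via the standard recurrence."""
    beta = math.log(1.0 / math.tanh(ripple_db / 17.37))
    gamma = math.sinh(beta / (2.0 * n))
    a = [math.sin((2 * k - 1) * math.pi / (2 * n)) for k in range(1, n + 1)]
    b = [gamma**2 + math.sin(k * math.pi / n) ** 2 for k in range(1, n + 1)]
    g = [2.0 * a[0] / gamma]
    for k in range(2, n + 1):
        g.append(4.0 * a[k - 2] * a[k - 1] / (b[k - 2] * g[-1]))
    # load element: 1 for odd n, coth^2(beta/4) for even n
    g.append(1.0 if n % 2 else 1.0 / math.tanh(beta / 4.0) ** 2)
    return g

# illustrative: three-pole, 0.5 dB ripple prototype
print([round(v, 4) for v in chebyshev_g_values(3, 0.5)])
```

The g-values are then converted to admittance-inverter parameters and coupling-gap dimensions for the end-coupled microstrip realization.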

  17. Progress on a Rayleigh Scattering Mass Flux Measurement Technique

    Science.gov (United States)

    Mielke-Fagan, Amy F.; Clem, Michelle M.; Elam, Kristie A.; Hirt, Stefanie M.

    2010-01-01

    A Rayleigh scattering diagnostic has been developed to provide mass flux measurements in wind tunnel flows. Spectroscopic molecular Rayleigh scattering is an established flow diagnostic tool that has the ability to provide simultaneous density and velocity measurements in gaseous flows. Rayleigh scattered light from a focused 10 Watt continuous-wave laser beam is collected and fiber-optically transmitted to a solid Fabry-Perot etalon for spectral analysis. The circular interference pattern that contains the spectral information that is needed to determine the flow properties is imaged onto a CCD detector. Baseline measurements of density and velocity in the test section of the 15 cm x 15 cm Supersonic Wind Tunnel at NASA Glenn Research Center are presented as well as velocity measurements within a supersonic combustion ramjet engine isolator model installed in the tunnel test section.

  18. Event tree analysis using artificial intelligence techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs.

  19. Proof Analysis: A Technique for Concept Formation

    OpenAIRE

    Bundy, Alan

    1985-01-01

    We report the discovery of an unexpected connection between the invention of the concept of uniform convergence and the occurs check in the unification algorithm. This discovery suggests the invention of further interesting concepts in analysis and a technique for automated concept formation. Part of this technique has been implemented. The discovery arose as part of an attempt to understand the role of proof analysis in mathematical reasoning, so as to incorporate it into a computer program. ...

  20. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  1. Progress of the technique of coal microwave desulfurization

    Institute of Scientific and Technical Information of China (English)

    Xiuxiang Tao; Ning Xu; Maohua Xie; Longfei Tang

    2014-01-01

    With the advantages of fast speed, effectiveness and mild, controllable conditions, desulfurization of coal by microwave has become a research focus in the field of clean coal technology. Coal is a heterogeneous mixture consisting of various components with different dielectric properties, so their abilities to absorb microwaves differ; the sulfur-containing components are better absorbers of microwave energy, which enables them to be selectively heated and reacted under microwave irradiation. Controversy remains over the principle of microwave desulfurization: thermal effects or non-thermal effects. The thermal-effects view rests mainly on the rapid and selective heating characteristics of microwaves, while the non-thermal view proposes direct interactions between the microwave electromagnetic field and the sulfur-containing components. Determining the dielectric properties of coal and of the sulfur-containing components is a fundamental problem in revealing the interaction of microwaves with sulfur-containing compounds. However, the measurement of the dielectric properties of coal is affected by many factors, which makes accurate measurement difficult. To achieve a better desulfurization effect, researchers employ methods such as adding chemical additives (acid, alkali, oxidant, reductant), changing the reaction atmosphere, or combining microwave treatment with other methods such as magnetic separation, ultrasonics and microorganisms. Researchers in this field have also put forward several processes and obtained a number of patents. Obscurity of the microwave desulfurization mechanism, uncertainties in the qualitative and quantitative analysis of sulfur-containing functional groups in coal, and the lack of dedicated microwave equipment have limited further development of microwave desulfurization technology.

  2. Nuclear analysis techniques and environmental science

    International Nuclear Information System (INIS)

    The features, developing trends and some frontier topics of nuclear analysis techniques and their applications in environmental science were reviewed, including the study of chemical speciation and environmental toxicology, microanalysis and identification of atmospheric particle, nuclear analysis methodology with high accuracy and quality assurance of environmental monitoring, super-sensitive nuclear analysis method and addiction of toxicant with DNA, environmental specimen banking at nuclear analysis centre and biological environmental monitor, etc

  3. Severe accident analysis using dynamic accident progression event trees

    Science.gov (United States)

    Hakobyan, Aram P.

    At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce and can also be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APET developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. Also, the work is focused on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a
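
The branch-probability tracking and truncation described above can be sketched with a toy tree expander; the branching events and conditional probabilities below are invented illustrations, not MELCOR output:

```python
# Toy dynamic event tree: expand branches at successive branch points,
# track cumulative branch probabilities, prune below a truncation threshold.

def expand(tree, branch_points, truncate=1e-3):
    """tree: list of (history, prob); branch_points: list of
    [(event_label, conditional_prob), ...] applied in order."""
    for choices in branch_points:
        new = []
        for history, p in tree:
            for label, cp in choices:
                q = p * cp
                if q >= truncate:          # pruning/truncation rule
                    new.append((history + [label], q))
        tree = new
    return tree

# illustrative branch points (probabilities are assumptions)
scenarios = expand(
    [([], 1.0)],
    [[("SG tube intact", 0.95), ("SG tube rupture", 0.05)],
     [("power recovered", 0.6), ("blackout continues", 0.4)]],
)
for hist, p in scenarios:
    print(hist, round(p, 4))
```

In ADAPT the branching times and conditions come from the severe accident code itself rather than a fixed list, but the bookkeeping of branch probabilities and truncation follows this pattern.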

  4. Factors Influencing Progressive Failure Analysis Predictions for Laminated Composite Structure

    Science.gov (United States)

    Knight, Norman F., Jr.

    2008-01-01

    Progressive failure material modeling methods used for structural analysis including failure initiation and material degradation are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model for use with a nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details are described in the present paper. Parametric studies for laminated composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented and to demonstrate their influence on progressive failure analysis predictions.
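
A minimal sketch of the ply-discounting idea described above: when a failure-initiation criterion (here, maximum stress) is exceeded, the corresponding local moduli are degraded. The allowables and the knock-down factor are illustrative assumptions, not values from the paper:

```python
# Ply-discounting sketch: degrade moduli when the maximum-stress
# criterion is exceeded in a given mode. All numbers are illustrative.

X_T, Y_T, S12 = 1500.0, 40.0, 70.0   # MPa: fiber, matrix, shear allowables
DEGRADE = 0.01                        # assumed knock-down factor

def update_ply(props, stress):
    """props: dict of moduli (MPa); stress: (s11, s22, t12) in ply axes."""
    s11, s22, t12 = stress
    out = dict(props)
    if abs(s11) > X_T:    # fiber failure -> degrade E1
        out["E1"] *= DEGRADE
    if abs(s22) > Y_T:    # matrix cracking -> degrade E2
        out["E2"] *= DEGRADE
    if abs(t12) > S12:    # in-plane shear failure -> degrade G12
        out["G12"] *= DEGRADE
    return out

ply = {"E1": 140e3, "E2": 10e3, "G12": 5e3}    # MPa
print(update_ply(ply, (900.0, 55.0, 30.0)))    # matrix mode only
```

In the user-defined material model this check is applied at every integration point per iteration; the other criteria named above (maximum strain, Tsai-Wu, Hashin) slot into the same degradation framework.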

  5. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
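
A sampling-based sensitivity screen of the kind reviewed above can be sketched with rank correlation between each sampled input and the output; the two-parameter toy model below is an assumption for illustration, not the SYVAC model:

```python
import numpy as np

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of ranks."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean(); ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

rng = np.random.default_rng(0)
n = 2000
leach_rate = rng.uniform(0.1, 1.0, n)       # assumed influential parameter
path_length = rng.uniform(10.0, 100.0, n)   # assumed weakly influential
noise = rng.normal(0.0, 0.05, n)
dose = leach_rate**2 + 0.01 * path_length / 100.0 + noise  # toy dose model

for name, x in [("leach_rate", leach_rate), ("path_length", path_length)]:
    print(name, round(spearman(x, dose), 3))
```

Ranking the inputs by |correlation| gives a first screen of which parameters drive the dose; rank-based measures cope with the monotone nonlinearities typical of such models.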

  6. Primary progressive aphasia : neuropsychological analysis and evolution

    OpenAIRE

    Maruta, Carolina Pires, 1985-

    2015-01-01

    Tese de doutoramento, Ciências Biomédicas (Neurociências), Universidade de Lisboa, Faculdade de Medicina, 2015 Frontotemporal lobar degeneration (FTLD) is the second leading cause of early-onset (< 65 years) dementia. Some of its forms may begin by isolated language deficits, which are known as Primary Progressive Aphasia (PPA). PPA is defined as the insidious onset and progressive loss of linguistic abilities in the absence of major deficits in other areas of cognition or in activities of...

  7. Analysis and comparation of animation techniques

    OpenAIRE

    Joštová, Barbora

    2015-01-01

    This thesis is focused on the analysis and comparison of animation techniques. In the theoretical part of the thesis I define key terms, the historical development and the basic principles of animation techniques. In the practical part I describe the comparison between classic and digital types of animation. Based on this research I chose the most suitable animations, which are further used to verify my hypothesis. The proposed hypothesis is an ordering of the techniques based on how demanding each is in terms of...

  8. Clustering Analysis within Text Classification Techniques

    OpenAIRE

    Madalina ZURINI; Catalin SBORA

    2011-01-01

    The paper presents a personal approach to the main applications of classification in the knowledge-based society, by means of methods and techniques widely spread in the literature. Text classification is covered in chapter two, where the main techniques used are described along with an integrated taxonomy. The transition is made through the concept of spatial representation. Having the elementary elements of geometry and the artificial intelligence analysis,...

  9. Gold analysis by the gamma absorption technique.

    Science.gov (United States)

    Kurtoglu, Arzu; Tugrul, A Beril

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement. PMID:12485656
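
The technique rests on Beer-Lambert attenuation, I = I0·exp(−μm·ρ·x), with the alloy's mass attenuation coefficient commonly taken as the mass-weighted sum of its components (the mixture rule). A sketch with illustrative, not measured, coefficient values:

```python
import math

# Mixture rule for the mass attenuation coefficient of an alloy,
# plus the Beer-Lambert transmission law. Numbers are illustrative.

def alloy_mu_m(fractions, mu_m_values):
    """mu_m = sum(w_i * mu_m_i), with w_i the mass fractions."""
    return sum(w * mu for w, mu in zip(fractions, mu_m_values))

def transmitted(I0, mu_m, rho, x_cm):
    """Transmitted intensity through thickness x_cm (Beer-Lambert)."""
    return I0 * math.exp(-mu_m * rho * x_cm)

# hypothetical Au-Cu alloy; mu_m in cm^2/g near the Au K-edge (assumed)
mu = alloy_mu_m([0.75, 0.25], [4.2, 0.45])
print(round(mu, 4), round(transmitted(1e5, mu, 15.0, 0.01), 1))
```

Measuring transmission for alloys of known gold content and inverting this relation is what produces the calibration curve used for unknown samples.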

  10. Progress Testing: Critical Analysis and Suggested Practices

    Science.gov (United States)

    Albanese, Mark; Case, Susan M.

    2016-01-01

    Educators have long lamented the tendency of students to engage in rote memorization in preparation for tests rather than engaging in deep learning where they attempt to gain meaning from their studies. Rote memorization driven by objective exams has been termed a steering effect. Progress testing (PT), in which a comprehensive examination…

  11. Systems analysis department annual progress report 1986

    International Nuclear Information System (INIS)

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1986. The activities may be classified as energy systems analysis and risk and reliability analysis. The report includes a list of staff members. (author)

  12. Progress in phototaxis mechanism research and micromanipulation techniques of algae cells

    Institute of Scientific and Technical Information of China (English)

    WEN Chenglu; LI Heng; WANG Pengbo; LI Wei; ZHAO Jingquan

    2007-01-01

    Phototactic movement is a characteristic response of some microorganisms to their light environment. Most algae show dramatic phototactic responses, which involve complicated biological, physical and photochemical mechanisms. With the development of micro/nano and sensor techniques, great progress has been made in research on algae phototaxis. This review article summarizes the progress made in research on the functional phototactic structures, the mechanisms of the photo-response process and the photodynamics of phototaxis in algae, and describes the recently developed micro-tracking and micromanipulation techniques. Moreover, based on our own research results, the potential correlation between phototaxis and photosynthesis is discussed, and directions for future research on the phototactic mechanism are proposed.

  13. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of the characteristics of road texture and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.

  14. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system.
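
The quantitative side of fault tree evaluation can be sketched with a minimal AND/OR gate evaluator; basic events are assumed independent, and the gate layout and probabilities below are illustrative:

```python
# Minimal fault tree evaluator: AND gates multiply basic-event
# probabilities (independence assumed); OR gates combine as
# 1 - prod(1 - p). Tree structure and numbers are illustrative.

def evaluate(node, p_basic):
    if isinstance(node, str):                 # basic event
        return p_basic[node]
    op, children = node[0], node[1:]
    ps = [evaluate(c, p_basic) for c in children]
    if op == "AND":
        out = 1.0
        for p in ps:
            out *= p
        return out
    if op == "OR":
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out
    raise ValueError("unknown gate: %s" % op)

tree = ("OR", ("AND", "pump_fails", "valve_stuck"), "power_loss")
probs = {"pump_fails": 0.01, "valve_stuck": 0.1, "power_loss": 0.001}
print(evaluate(tree, probs))
```

Automated construction, common-mode failures and human errors, as discussed above, require richer models than this independence-based calculation.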

  15. Risk Analysis Group annual progress report 1984

    International Nuclear Information System (INIS)

    The activities of the Risk Analysis Group at Risoe during 1984 are presented. These include descriptions in some detail of work on general development topics and risk analysis performed as contractor. (author)

  16. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.
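
The sensitivity of NAA follows from the activation equation, A = N·σ·φ·(1 − e^(−λ·t_irr))·e^(−λ·t_decay); a sketch with hypothetical sample values (the cross section, flux and half-life below are assumptions for illustration):

```python
import math

# Activation equation behind NAA: induced activity grows toward the
# saturation value N*sigma*phi during irradiation, then decays before
# counting. All numerical inputs below are illustrative.

def induced_activity(N, sigma_cm2, phi, half_life_s, t_irr_s, t_decay_s=0.0):
    lam = math.log(2.0) / half_life_s
    A_sat = N * sigma_cm2 * phi                 # saturation activity (Bq)
    return A_sat * (1.0 - math.exp(-lam * t_irr_s)) * math.exp(-lam * t_decay_s)

# hypothetical: 1e18 target atoms, 1 barn = 1e-24 cm^2 cross section,
# flux 1e13 n cm^-2 s^-1, 15 h half-life; irradiate one half-life,
# then let it decay one half-life before counting
print(induced_activity(1e18, 1e-24, 1e13, 15 * 3600, 15 * 3600, 15 * 3600))
```

The element-by-element variation of σ and half-life is what makes some elements measurable at trace levels while others are inaccessible to the technique.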

  17. Elemental Analysis of Shells by Nuclear Technique

    International Nuclear Information System (INIS)

    Quantitative analysis of strontium (Sr) and calcium (Ca) in fresh water shell and sea shell was studied by the X-ray fluorescence (XRF) technique with the Emission-Transmission (E-T) method, using isotope X-ray sources of plutonium-238 (Pu-238) and americium-241 (Am-241), and compared with the neutron activation analysis technique in the TRR-1/M1 reactor. The results show that the calcium content in both types of shells is almost the same, but strontium in sea shell is 3-4 times higher than that in fresh water shell. Moreover, the results can indicate whether a region used to be river or ocean. The high ratio of strontium to calcium in many types of shells from Wat Jaedeehoi, Patumthanee province, shows the specific character of sea shells. So it can be concluded that this region used to be ocean in the past

  18. CONSUMER BEHAVIOR ANALYSIS BY GRAPH MINING TECHNIQUE

    OpenAIRE

    KATSUTOSHI YADA; HIROSHI MOTODA; TAKASHI WASHIO; ASUKA MIYAWAKI

    2006-01-01

    In this paper, we discuss how graph mining system is applied to sales transaction data so as to understand consumer behavior. First, existing research of consumer behavior analysis for sequential purchase pattern is reviewed. Then we propose to represent the complicated customer purchase behavior by a directed graph retaining temporal information in a purchase sequence and apply a graph mining technique to analyze the frequent occurring patterns. In this paper, we demonstrate through the case...
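
The directed-graph representation of purchase sequences described above can be sketched simply: each consecutive pair of purchases forms an edge, and frequently occurring edges are candidate behavior patterns. The transactions below are made-up illustrations:

```python
from collections import Counter

# Build a directed graph from purchase sequences: consecutive purchases
# (a -> b) become edges; edges above a support threshold are candidate
# frequent patterns. Transactions are illustrative.

def frequent_edges(sequences, min_support=2):
    edges = Counter()
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            edges[(a, b)] += 1
    return {e: c for e, c in edges.items() if c >= min_support}

transactions = [
    ["milk", "bread", "beer"],
    ["milk", "bread", "diapers"],
    ["bread", "beer"],
]
print(frequent_edges(transactions))
```

Full graph mining systems generalize this from single edges to frequent subgraphs while retaining the temporal ordering of the purchase sequence.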

  19. Biomechanical Analysis of Contemporary Throwing Technique Theory

    OpenAIRE

    Chen Jian

    2015-01-01

    Based on the movement process of throwing and in order to further improve the throwing technique of our country, this paper will first illustrate the main influence factors which will affect the shot distance via the mutual combination of movement equation and geometrical analysis. And then, it will give the equation of the acting force that the throwing athletes have to bear during throwing movement; and will reach the speed relationship between each arthrosis during throwing and batting bas...

  20. The application of TXRF analysis technique

    International Nuclear Information System (INIS)

    The total reflection X-ray fluorescence (TXRF) analysis technique is introduced briefly. A small TXRF analyzer characterised by double path X-ray exciting sources and specially short optical path (15 cm) is described. Low minimum detection limit (MDL), e.g. 7 pg for Co element under Cu target tube operating at 20 kV and 6 mA, and 30 pg for Sr under Mo target tube at 46 kV and 10 mA, is achieved. Some analysis experiments for tap water, marine animal and human hair are performed and the results are given
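
Minimum detection limits such as those quoted above are conventionally estimated with the 3-sigma criterion, MDL = 3·c·√N_bg / N_net, relating the background and net peak counts of a reference of known concentration c. A sketch with illustrative counts:

```python
import math

# 3-sigma detection-limit estimate commonly used with TXRF spectra.
# Reference concentration and counts below are illustrative.

def mdl(concentration, net_counts, background_counts):
    """MDL = 3 * c * sqrt(N_background) / N_net."""
    return 3.0 * concentration * math.sqrt(background_counts) / net_counts

# e.g. a 1 ng reference giving 1e5 net counts over 900 background counts
print(mdl(1.0, 1e5, 900.0))   # result in ng
```

With the short optical path and low background of TXRF, such estimates land in the picogram range, consistent with the MDLs reported above.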

  1. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  2. Analysis of breast cancer progression using principal component analysis and clustering

    Indian Academy of Sciences (India)

    G Alexe; G S Dalgin; S Ganesan; C DeLisi; G Bhanot

    2007-08-01

    We develop a new technique for analysing microarray data which uses a combination of principal components analysis and consensus ensemble clustering to find robust clusters and gene markers in the data. We apply our method to a public microarray breast cancer dataset which has expression levels of genes in normal samples as well as in three pathological stages of disease; namely, atypical ductal hyperplasia or ADH, ductal carcinoma in situ or DCIS and invasive ductal carcinoma or IDC. Our method averages over clustering techniques and data perturbation to find stable, robust clusters and gene markers. We identify the clusters and their pathways with distinct subtypes of breast cancer (Luminal, Basal and Her2+). We confirm that the cancer phenotype develops early (in the early hyperplasia or ADH stage) and find from our analysis that each subtype progresses from ADH to DCIS to IDC along its own specific pathway, as if each were a distinct disease.
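
The PCA-plus-consensus-clustering idea can be sketched on synthetic data: project onto leading principal components, cluster repeatedly with random restarts, and accumulate a co-association (consensus) matrix. The data and parameters below are synthetic illustrations, not the breast cancer dataset:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, (20, 50))     # group A: samples x "genes"
b = rng.normal(3.0, 1.0, (20, 50))     # group B: shifted expression
X = np.vstack([a, b])

# PCA via SVD on centered data; keep top-2 principal component scores
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

def kmeans(Z, k, seed, iters=50):
    """Plain Lloyd's k-means with random point initialization."""
    r = np.random.default_rng(seed)
    centers = Z[r.choice(len(Z), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((Z[:, None, :] - centers) ** 2).sum(-1), axis=1)
        new = []
        for j in range(k):
            pts = Z[labels == j]
            new.append(pts.mean(axis=0) if len(pts) else centers[j])
        centers = np.array(new)
    return labels

# consensus (co-association) matrix over random restarts
runs = 10
consensus = np.zeros((len(X), len(X)))
for seed in range(runs):
    lab = kmeans(scores, 2, seed)
    consensus += (lab[:, None] == lab[None, :])
consensus /= runs
print(round(consensus[:20, :20].mean(), 2), round(consensus[:20, 20:].mean(), 2))
```

Pairs that cluster together across restarts (and, in the full method, across clustering techniques and data perturbations) get consensus values near 1; the stable blocks of the consensus matrix are the robust clusters.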

  3. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed, along with an appropriate equation of state. The first phase of the VENUS-II/PAD comparison study was completed with the issuing of a preliminary report describing the results. A new technique has been developed to calculate a P-V work curve as a function of the degree of core expansion following a disassembly excursion. The technique provides results that are consistent with the ANL oxide-fuel equation of state in VENUS-II. Evaluation and check-out of this new model are currently in progress.
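As a generic illustration of a P-V work curve (not the VENUS-II model), the work released up to a given degree of expansion is the integral of pressure over volume; the polytropic equation of state and all numbers below are assumptions for the sketch:

```python
import numpy as np

# Idealized expansion P(V) = P0 * (V0 / V)**gamma (illustrative equation
# of state only; the actual VENUS-II EOS is far more complex).
P0, V0, gamma = 5.0e6, 1.0, 1.3          # Pa, m^3, dimensionless

V = np.linspace(V0, 5.0 * V0, 400)        # degrees of core expansion
P = P0 * (V0 / V) ** gamma

# Cumulative P-dV work as a function of expansion, by the trapezoidal rule.
work = np.concatenate(([0.0], np.cumsum(0.5 * (P[1:] + P[:-1]) * np.diff(V))))

# Closed form for comparison: W = P0*V0/(gamma-1) * (1 - (V0/V)**(gamma-1))
W_exact = P0 * V0 / (gamma - 1) * (1 - (V0 / V) ** (gamma - 1))
```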

  4. Systems Analysis Department annual progress report 1998

    DEFF Research Database (Denmark)

    1999-01-01

The report describes the work of the Systems Analysis Department at Risø National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine Interaction and Technology Scenarios. The report includes lists of publications, lectures, committees and staff members.

  5. Important progress on the use of isotope techniques and methods in catchment hydrology

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

The use of isotope techniques and methods in catchment hydrology in the last 50 years has generated two major types of progress: (1) Assessment of the temporal variations of the major stocks and flows of water in catchments, from which the estimation of water residence times is introduced in this paper. (2) Assessment of catchment hydrologic processes, in which the interactions between different waters, hydrograph separation, and bio-geochemical processes are described by using isotope tracers. Future progress on isotope techniques and methods in hydrology lies in understanding the hydrological processes of large river basins. Much potential also awaits realization in how isotope information may be used to calibrate and test distributed rainfall-runoff models, and in aiding the quantification of sustainable water resources management.
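The hydrograph separation mentioned above is, in its simplest form, a two-component isotopic mixing calculation; the δ18O values below are illustrative, not from the paper:

```python
def hydrograph_separation(delta_stream, delta_pre, delta_event):
    """Two-component isotopic mixing: fraction of streamflow that is
    event (new) water, from e.g. delta-18O values in per mil."""
    return (delta_stream - delta_pre) / (delta_event - delta_pre)

# Illustrative values (per mil): pre-event groundwater -12.0,
# rainfall (event water) -6.0, storm streamflow -10.5.
f_new = hydrograph_separation(-10.5, -12.0, -6.0)   # -> 0.25
```

The same mass balance underlies most isotope-based separations; multi-component versions add a tracer per extra end-member.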

  6. Progress in nuclear measuring and experimental techniques by application of microelectronics. 1

    International Nuclear Information System (INIS)

In the past decade considerable progress has been made in nuclear measuring and experimental techniques through the development of position-sensitive detector systems and the wide use of integrated circuits and microcomputers for data acquisition and processing as well as for the automation of measuring processes. In this report, which will be published in three parts, these developments are reviewed and demonstrated with selected examples. After a brief characterization of microelectronics, the use of microelectronic elements in radiation detectors is reviewed. (author)

  7. Progress of nuclide tracing technique in the study of soil erosion in recent decade

    International Nuclear Information System (INIS)

In the last decade, nuclide tracing techniques have been widely employed in the investigation of soil erosion, ushering soil erosion studies into a period of rapid development. This paper reviews recent progress in using 137Cs, 210Pbex, 7Be, composite tracers and REE-INAA to study soil erosion rates, sedimentation rates, sediment sources and soil erosion processes, and surveys the existing research results. Trends for future development and open questions are also discussed. (authors)
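As one concrete example of how a 137Cs inventory change is converted to an erosion rate, the simple proportional model (commonly attributed to Walling and Quine) can be sketched as follows; the field values are illustrative assumptions:

```python
def erosion_rate_proportional(X_percent, d, bulk_density, years):
    """Proportional model for 137Cs-derived erosion:
    Y [t ha^-1 yr^-1] = 10 * d * B * X / (100 * T),
    with plough depth d [m], bulk density B [kg m^-3],
    inventory reduction X [%], and time since fallout T [yr]."""
    return 10.0 * d * bulk_density * X_percent / (100.0 * years)

# Illustrative: 20% inventory loss, 0.25 m plough layer,
# 1300 kg/m3 bulk density, 40 years since peak fallout.
Y = erosion_rate_proportional(20.0, 0.25, 1300.0, 40.0)   # -> 16.25 t/ha/yr
```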

  8. Systems Analysis Department annual progress report 1999

    DEFF Research Database (Denmark)

    2000-01-01

This report describes the work of the Systems Analysis Department at Risø National Laboratory during 1999. The department is undertaking research within Energy Systems Analysis, Energy, Environment and Development Planning - UNEP Centre, Safety, Reliability and Human Factors, and Technology Scenarios. The report includes summary statistics and lists of publications, committees and staff members.

  9. Systems Analysis Department annual progress report 1998

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif [eds.

    1999-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine Interaction and Technology Scenarios. The report includes lists of publications, lectures, committees and staff members. (au) 111 refs.

  10. Systems Analysis Department. Annual Progress Report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif [eds.

    2000-03-01

    This report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1999. The department is undertaking research within Energy Systems Analysis, Energy, Environment and Development Planning-UNEP Centre, Safety, Reliability and Human Factors, and Technology Scenarios. The report includes summary statistics and lists of publications, committees and staff members. (au)

  11. Systems Analysis department. Annual progress report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Petersen, Kurt E.

    1998-03-01

The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1997. The department is undertaking research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 110 refs.

  12. Progress report on the AMT analysis

    International Nuclear Information System (INIS)

ICF Resources Incorporated's analysis of the Alternative Minimum Tax (AMT) has examined its effect on the US oil and gas industry from several different perspectives, to estimate the effect of the three relief proposals and to better understand the source of the outcry about the AMT's 'inequities.' This report is a brief summary of the methodology and results to date. The complexity of the accounting mechanisms that comprise the AMT, and the disparity between these analytical conclusions and claims made by the oil and gas industry (principally the IPAA), have led this analysis through several distinct phases: project-level analysis, firm-level analysis, and demographic analysis. These analyses are described in detail.

  13. VLF radio propagation conditions. Computational analysis techniques

    International Nuclear Information System (INIS)

Complete text of publication follows. Very low frequency (VLF) radio waves propagate within the Earth-ionosphere waveguide with very little attenuation. Modifications of the waveguide geometry affect the propagation conditions and, hence, the attenuation. Changes in the ionosphere, such as the presence of the D-region during the day or the precipitation of energetic particles, are the main causes of this modification. Using narrowband receivers monitoring VLF transmitters, the amplitude and phase of these signals are recorded. Multivariate data analysis techniques, namely Principal Component Analysis (PCA) and Singular Spectrum Analysis (SSA), are applied to the data in order to determine the parameters, such as seasonal and diurnal changes, that drive the variation of these signals. Transient effects may then be easier to detect.
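A minimal sketch of the SSA step on a synthetic amplitude record (the window length, rank, and signal model are illustrative assumptions, not values from the abstract): the series is embedded in a Hankel trajectory matrix, decomposed by SVD, and the slow components are reconstructed by diagonal averaging.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic VLF-like amplitude record: slow drift + diurnal-style
# oscillation + noise (arbitrary units).
t = np.arange(400)
signal = 0.01 * t + np.sin(2 * np.pi * t / 24)
x = signal + rng.normal(0, 0.3, t.size)

# Singular Spectrum Analysis: embed, decompose, reconstruct.
L = 48                                   # window length
K = x.size - L + 1
traj = np.column_stack([x[i:i + L] for i in range(K)])   # L x K Hankel
U, s, Vt = np.linalg.svd(traj, full_matrices=False)

r = 3                                    # trend (1) + oscillation (2)
approx = (U[:, :r] * s[:r]) @ Vt[:r]

# Diagonal (Hankel) averaging turns the rank-r matrix back into a series.
recon = np.zeros(x.size)
count = np.zeros(x.size)
for i in range(L):
    for j in range(K):
        recon[i + j] += approx[i, j]
        count[i + j] += 1
recon /= count
```

The reconstructed series keeps the seasonal/diurnal structure while suppressing noise, which is what makes residual transients easier to spot.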

  14. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

Full Text Available Based on the movement process of throwing, and in order to further improve throwing technique in our country, this paper first illustrates the main factors that influence shot distance, via a combination of the equations of motion and geometrical analysis. It then gives the equation for the force that throwing athletes must bear during the throwing movement, and derives the velocity relationship between the joints during throwing and release from a kinetic analysis of the athletes' arms. The paper obtains the momentum relationship between the athletes' joints by means of rotational-inertia analysis, and then establishes a constrained particle dynamics equation from the Lagrange equation. The result shows that the momentum of the throw depends on the momentum of the athlete's wrist joint at release.
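The geometrical part of the shot-distance analysis reduces, in the drag-free case, to the standard projectile range with release height; a small sketch (the speeds and heights are illustrative, not from the paper):

```python
import math

def shot_range(v, theta_deg, h, g=9.81):
    """Range of a shot released at speed v [m/s], angle theta [deg],
    and height h [m] above the ground (air drag neglected)."""
    th = math.radians(theta_deg)
    vy = v * math.sin(th)
    return v * math.cos(th) / g * (vy + math.sqrt(vy ** 2 + 2 * g * h))

# A ~2 m release height pushes the optimum angle a few degrees below 45.
r41 = shot_range(13.5, 41, 2.0)
r45 = shot_range(13.5, 45, 2.0)
```

That the raised release point lowers the optimum angle below 45° is one of the "main influence factors" such geometrical analyses identify.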

  15. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
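The half-max edge detection described above can be sketched as follows: find the peak of the contrast profile and linearly interpolate where it crosses half the peak value. The Gaussian test profile is an illustrative stand-in for measured contrast, not flight data:

```python
import numpy as np

def half_max_width(x, y):
    """Width of a single-peaked profile at half its maximum, with linear
    interpolation of the two half-level crossings (the 'half-max' rule)."""
    ipk = int(np.argmax(y))
    half = y[ipk] / 2.0
    # walk left from the peak to the first sample at or below half level
    i = ipk
    while i > 0 and y[i] > half:
        i -= 1
    xl = np.interp(half, [y[i], y[i + 1]], [x[i], x[i + 1]])
    # walk right from the peak likewise
    j = ipk
    while j < len(y) - 1 and y[j] > half:
        j += 1
    xr = np.interp(half, [y[j], y[j - 1]], [x[j], x[j - 1]])
    return xr - xl

# Gaussian contrast profile: half-max width should be ~2.3548 * sigma.
x = np.linspace(-10, 10, 2001)
sigma = 2.0
y = np.exp(-x ** 2 / (2 * sigma ** 2))
w = half_max_width(x, y)
```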

  16. Forensic Analysis using Geological and Geochemical Techniques

    Science.gov (United States)

    Hoogewerff, J.

    2009-04-01

Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late 1800s, the last decade has seen a marked increase in geoscientists initiating concept studies using the latest analytical techniques, including the study of natural-abundance isotope variations, micro-analysis with laser ablation ICPMS, and geochemical mapping. Most of the concept studies have shown good potential, but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the EGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project and other projects and literature to illustrate the important issues at hand.

  17. Progress in ALCHEMI analysis of crystal structure

    International Nuclear Information System (INIS)

Atomic location by channeling-enhanced microanalysis (ALCHEMI) is an effective technique for clarifying the atomic configuration of multi-component compounds. Recent development of the theory of characteristic x-ray emission has made ALCHEMI a more reliable and versatile technique. In this revised ALCHEMI, the characteristic x-ray intensities are measured at various electron-incidence directions in a transmission electron microscope and are compared with x-ray intensities calculated from dynamical electron diffraction and inelastic scattering theories. In the present work, this technique was applied to thermoelectric materials. The occupation probabilities of Mn atoms on the Fe I and Fe II sites in the thermoelectric semiconductor Fe0.97Mn0.03Si2 of the β-FeSi2 structure were 0.434 and 0.574, respectively. As another example, the occupancy of Ce atoms on voids and the coordinates (z1, z2) of Sb atoms in Ce0.5Fe3NiSb12 of the skutterudite CoSb3 structure were determined to be 0.33 and (z1=0.336, z2=0.147), respectively. (Y.K.)
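The occupancy determination can be viewed, in highly idealized form, as a linear inverse problem: measured x-ray intensities at several incidence directions are fitted by site excitations weighted by occupancies. The numbers below are invented for illustration and stand in for the real dynamical-diffraction calculation:

```python
import numpy as np

# Idealized ALCHEMI bookkeeping: at each beam orientation k, the dopant
# x-ray intensity is modeled as I_k = sum_s E[k, s] * c_s, where E[k, s]
# is the computed excitation of site s at that orientation and c_s the
# occupancy to be determined (all values hypothetical).
E = np.array([[1.8, 0.4],      # orientation 1: site I strongly excited
              [0.5, 1.6],      # orientation 2: site II strongly excited
              [1.0, 1.0]])     # orientation 3: both sites excited equally
c_true = np.array([0.43, 0.57])
I = E @ c_true                 # "measured" intensities (noise-free here)

# Occupancies recovered by linear least squares over all orientations.
c_fit, *_ = np.linalg.lstsq(E, I, rcond=None)
```

Using more orientations than unknown occupancies overdetermines the fit, which is what makes the revised, multi-orientation ALCHEMI more reliable.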

  18. Clustering Analysis within Text Classification Techniques

    Directory of Open Access Journals (Sweden)

    Madalina ZURINI

    2011-01-01

Full Text Available The paper presents a personal approach to the main applications of classification in the knowledge-based society, by means of methods and techniques widely covered in the literature. Text classification is discussed in chapter two, where the main techniques used are described along with an integrated taxonomy. The transition is made through the concept of spatial representation. Building on elementary geometry and artificial-intelligence analysis, spatial representation models are presented. Using a parallel approach, the spatial dimension is introduced into the classification process, and the main clustering methods are described in an aggregated taxonomy. As an example, spam and ham words are clustered and spatially represented, and the concepts of spam, ham, and common (linkage) words are explained in the xOy space representation.
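The spam/ham example in the abstract amounts to placing each word at a point in the xOy plane (spam frequency on one axis, ham frequency on the other) and grouping words by position; a toy sketch with an invented corpus:

```python
# Tiny invented corpus: each word gets a point in the xOy plane whose
# coordinates are its frequency in spam messages (x) and ham messages (y).
spam = ["win cash now", "cash prize win", "claim prize now"]
ham = ["meeting at noon", "see you at the meeting", "noon now works"]

vocab = sorted(set(" ".join(spam + ham).split()))

def freq(word, docs):
    return sum(doc.split().count(word) for doc in docs)

points = {w: (freq(w, spam), freq(w, ham)) for w in vocab}

# Words used only in spam, only in ham, or in both ("common/linkage").
spam_words = {w for w, (x, y) in points.items() if x > 0 and y == 0}
ham_words = {w for w, (x, y) in points.items() if y > 0 and x == 0}
common = {w for w, (x, y) in points.items() if x > 0 and y > 0}
```

Real systems replace the raw counts with weighted vectors and a proper clustering algorithm, but the spatial intuition is the same.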

  19. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    Science.gov (United States)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1.) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI, from launch in late October 2006 to date, including estimates of their kinematic properties based on a variety of established techniques and more speculative approaches; (2.) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3.) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4.) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  20. Systems Analysis Department. Annual progress report 1996

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, H.; Olsson, C.; Petersen, K.E. [eds.

    1997-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1996. The department is undertaking research within Simulation and Optimisation of Energy Systems, Energy and Environment in Developing Countries - UNEP Centre, Integrated Environmental and Risk Management and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 131 refs.

  1. Vibration analysis techniques for rotating machineries

    International Nuclear Information System (INIS)

In this modern era, lives and standards of living are closely linked with machines. Today, higher quality, reliability and operational safety, in addition to a long service life, are expected of a machine. Meeting these demands requires knowledge of its dynamic behavior, which can be gained by employing a predictive maintenance strategy with regular condition inspection and early fault recognition in case of damage. Machine diagnosis by vibration analysis offers a cost-effective and reliable method of condition evaluation: causes can be located and corrective measures planned long before severe, direct or consequential damage and breakdown occur. Although this subject is multifarious, this paper provides some assistance in understanding the basics of vibration measurement, analysis and diagnosis of faults in rotating machinery. Machinery diagnosis through vibration measurement is an integral part of the in-service inspection programmes practiced in nuclear power plants and research reactors.
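A minimal sketch of the spectral side of vibration diagnosis: a synthetic signal containing a running-speed tone plus a weaker fault tone, both identified from the FFT amplitude spectrum. The frequencies, amplitudes, and threshold are illustrative assumptions:

```python
import numpy as np

fs = 2048                      # sample rate [Hz]
t = np.arange(fs) / fs         # one second of "vibration" data

# Healthy rotation at 50 Hz plus a weaker bearing-fault tone at 123 Hz.
x = 1.0 * np.sin(2 * np.pi * 50 * t) + 0.3 * np.sin(2 * np.pi * 123 * t)
x += np.random.default_rng(2).normal(0, 0.1, fs)

# Amplitude spectrum; with a 1 s record the bin spacing is exactly 1 Hz,
# and |rfft|/(N/2) recovers the amplitude of each sinusoid.
spec = np.abs(np.fft.rfft(x)) / (fs / 2)
freqs = np.fft.rfftfreq(fs, 1 / fs)

# Bins above a threshold flag the discrete frequency components.
peaks = freqs[spec > 0.15]
```

In practice the diagnostic step is matching such peaks against known fault frequencies (imbalance, misalignment, bearing defect orders) of the machine.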

  2. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

Astrophysical sources of gravitational waves fall broadly into three categories: (i) transients and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different data analysis strategy. In this talk, various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; optimally weighted cross-correlations for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
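Optimal filtering for a known waveform in white noise reduces to correlating the data against the template and locating the correlation peak; a toy sketch with an invented chirp (not a physical inspiral template):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4096
template = np.zeros(n)
# Toy "chirp" template: frequency sweeping upward, 512 samples long.
tt = np.arange(512) / 512
template[:512] = np.sin(2 * np.pi * (8 * tt + 24 * tt ** 2))

# Bury the chirp at a known offset in noise of comparable amplitude.
offset = 1500
data = rng.normal(0, 1.0, n)
data[offset:offset + 512] += template[:512]

# Matched filtering for white noise: slide the template along the data
# (a cross-correlation) and take the arrival time at the peak.
corr = np.correlate(data, template[:512], mode="valid")
recovered = int(np.argmax(corr))
```

Real searches additionally whiten the data and maximise over template banks, but the peak-of-correlation statistic is the same.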

  3. Dynamics and vibrations progress in nonlinear analysis

    CERN Document Server

    Kachapi, Seyed Habibollah Hashemi

    2014-01-01

Dynamical and vibratory systems are basically an application of mathematics and the applied sciences to the solution of real-world problems. Before being able to solve real-world problems, it is necessary to carefully study dynamical and vibratory systems and to solve all available problems, in the case of both linear and nonlinear equations, using analytical and numerical methods. It is of great importance to study nonlinearity in dynamics and vibration, because almost all applied processes act nonlinearly; on the other hand, nonlinear analysis of complex systems is one of the most important and complicated tasks, especially in engineering and applied sciences problems. There are probably a handful of books on nonlinear dynamics and vibrations analysis. Some of these books are written at a fundamental level that may not meet ambitious engineering program requirements. Others are specialized in certain fields of oscillatory systems, including modeling and simulations. In this book, we attempt to strike a balance between th...

  4. Progress on the CWU READI Analysis Center

    Science.gov (United States)

    Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C.

    2015-12-01

    Real-time GPS position streams are desirable for a variety of seismic monitoring and hazard mitigation applications. We report on progress in our development of a comprehensive real-time GPS-based seismic monitoring system for the Cascadia subduction zone. This system is based on 1 Hz point position estimates computed in the ITRF08 reference frame. Convergence from phase and range observables to point position estimates is accelerated using a Kalman filter based, on-line stream editor that produces independent estimations of carrier phase integer biases and other parameters. Positions are then estimated using a short-arc approach and algorithms from JPL's GIPSY-OASIS software with satellite clock and orbit products from the International GNSS Service (IGS). The resulting positions show typical RMS scatter of 2.5 cm in the horizontal and 5 cm in the vertical with latencies below 2 seconds. To facilitate the use of these point position streams for applications such as seismic monitoring, we broadcast real-time positions and covariances using custom-built aggregation-distribution software based on RabbitMQ messaging platform. This software is capable of buffering 24-hour streams for hundreds of stations and providing them through a REST-ful web interface. To demonstrate the power of this approach, we have developed a Java-based front-end that provides a real-time visual display of time-series, displacement vector fields, and map-view, contoured, peak ground displacement. This Java-based front-end is available for download through the PANGA website. We are currently analyzing 80 PBO and PANGA stations along the Cascadia margin and gearing up to process all 400+ real-time stations that are operating in the Pacific Northwest, many of which are currently telemetered in real-time to CWU. These will serve as milestones towards our over-arching goal of extending our processing to include all of the available real-time streams from the Pacific rim. In addition, we have

  5. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

A significant need in the effort to provide increased production quality is improved plant equipment monitoring capability. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the prime movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized.
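The amplitude-demodulation idea can be sketched with an FFT-based analytic signal: the envelope of a load-modulated supply current recovers the modulation, whose spectrum then reveals the load-variation frequency. All signal parameters here are illustrative, and this is a generic Hilbert-envelope sketch, not the patented ORNL method:

```python
import numpy as np

fs = 5000
t = np.arange(fs) / fs
# 60 Hz supply current, amplitude-modulated at 8 Hz by a varying load.
i = (1.0 + 0.2 * np.sin(2 * np.pi * 8 * t)) * np.sin(2 * np.pi * 60 * t)

def envelope(x):
    """Amplitude demodulation via the analytic signal (FFT Hilbert):
    zero the negative frequencies, double the positive ones."""
    X = np.fft.fft(x)
    h = np.zeros(len(x))
    h[0] = 1.0
    h[1:len(x) // 2] = 2.0
    h[len(x) // 2] = 1.0
    return np.abs(np.fft.ifft(X * h))

env = envelope(i)              # ~ 1 + 0.2 * sin(2*pi*8*t)

# The load-modulation frequency is the strongest non-DC line in the
# spectrum of the demodulated envelope.
spec = np.abs(np.fft.rfft(env - env.mean()))
mod_freq = np.fft.rfftfreq(len(env), 1 / fs)[np.argmax(spec)]
```

In a compressor application, signatures such as surge or rotating stall would appear as characteristic lines or bands in this envelope spectrum.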

  6. Safeguards information analysis: Progress, challenges and solutions

    International Nuclear Information System (INIS)

    While the IAEA's authority to verify the correctness and completeness of a State's declarations under its comprehensive safeguards agreement derives from the agreement itself, it is only with the provisions for broader access to information and locations available under an additional protocol that the IAEA is able to draw the safeguards conclusion regarding the absence of undeclared nuclear material and activities in the State. Under the State level concept, all relevant information about a State's nuclear activities is assessed to obtain as complete a picture as possible of the State's current and planned nuclear programme. The array of sources for information evaluation is both broad and diverse. Mainly, it encompasses information provided by States, information obtained from open sources, commercial satellite imagery, and inspectors' measurements. All the information is checked for internal consistency and consistency with information gathered during inspections and visits in the field. The 'information driven' approach has required an expansion of knowledge, expertise, information and analytical/evaluation skills. As the IAEA's corporate knowledge in exploiting new types of information has increased, so too has its capability for detecting proliferation indicators. In all areas concerned (open source, satellite imagery, consistency and trend analysis, nuclear trade analysis, environmental sampling, as well as State-declared information) remarkable improvements have been made with regard to methodologies, tools, expertise and skills. Although each of these areas has proven invaluable for the detection of certain undeclared activities/material, it is obvious that the strength of these methods is in their integration. Further qualitative and quantitative improvements will require the acquisition of additional specialist expertise, knowledge and technologies. The combination of all (new) technologies will be important for enhancing the IAEA's ability to detect

  7. Organic analysis progress report FY 1997

    Energy Technology Data Exchange (ETDEWEB)

    Clauss, S.A.; Grant, K.E.; Hoopes, V.; Mong, G.M.; Steele, R.; Bellofatto, D.; Sharma, A.

    1998-04-01

The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify the amount of particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at Pacific Northwest National Laboratory (PNNL) and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies are discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivatization gas chromatography/mass spectrometry (GC/MS) and GC/FID for quantitation, ion-pair chromatography (IPC), IC, and the cation exchange procedure for reducing the radioactivity of samples. The documentation of analytical procedures is included and discussed in Section 6.0, and Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples.

  8. Organic analysis progress report FY 1997

    International Nuclear Information System (INIS)

The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify the amount of particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at Pacific Northwest National Laboratory (PNNL) and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies are discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivatization gas chromatography/mass spectrometry (GC/MS) and GC/FID for quantitation, ion-pair chromatography (IPC), IC, and the cation exchange procedure for reducing the radioactivity of samples. The documentation of analytical procedures is included and discussed in Section 6.0, and Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples.

  9. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout to microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a drop tube furnace (DTF) at 1300°C, a 200 millisecond residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.
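Burnout itself is commonly computed from an ash-tracer mass balance, assuming ash is conserved between coal and char; a small sketch with illustrative ash fractions (not values from the paper):

```python
def burnout_ash_tracer(ash_coal, ash_char):
    """Burnout on a dry ash-free basis from the ash-tracer mass balance:
    with ash conserved, char mass per unit coal is ash_coal/ash_char, so
    B = 1 - ash_coal*(1 - ash_char) / (ash_char*(1 - ash_coal)).
    Both arguments are mass fractions in the range (0, 1)."""
    return 1.0 - ash_coal * (1.0 - ash_char) / (ash_char * (1.0 - ash_coal))

# Illustrative: coal with 10% ash yielding a char with 50% ash.
B = burnout_ash_tracer(0.10, 0.50)   # ~0.889, i.e. ~88.9% burnout
```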

  10. Risk factors for progressive ischemic stroke: a retrospective analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    BACKGROUND: Progressive ischemic stroke has higher fatality and disability rates than common cerebral infarction, so it is very significant to investigate the early predicting factors related to the occurrence of progressive ischemic stroke, the potential pathological mechanism, and the risk factors for early intervention, in order to prevent the occurrence of progressive ischemic stroke and ameliorate its outcome. OBJECTIVE: To analyze the possible related risk factors in patients with progressive ischemic stroke, so as to provide a reference for the prevention and treatment of progressive ischemic stroke. DESIGN: A retrospective analysis. SETTING: Department of Neurology, General Hospital of Beijing Coal Mining Group. PARTICIPANTS: Totally 280 patients with progressive ischemic stroke were selected from the Department of Neurology, General Hospital of Beijing Coal Mining Group from March 2002 to June 2006, including 192 males and 88 females, with a mean age of (62±7) years. They all accorded with the diagnostic standards for cerebral infarction set by the Fourth National Academic Meeting for Cerebrovascular Disease in 1995, were confirmed by CT or MRI, and were admitted within 24 hours after attack, with the neurological defect progressing gradually or aggravating in gradients within 72 hours after attack; the aggravation of neurological defect was defined as a decrease of more than 2 points in the neurological deficit score. Meanwhile, 200 inpatients with non-progressive ischemic stroke (135 males and 65 females) were selected as the control group. METHODS: After admission, a univariate analysis of variance was conducted using the factors of blood pressure, history of diabetes mellitus, fever, leukocytosis, levels of blood lipids, fibrinogen, blood glucose and plasma homocysteine, cerebral arterial stenosis, and CT symptoms of early infarction, and the significant factors were entered into a multivariate non-conditional logistic regression analysis. MAIN OUTCOME MEASURES
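The screen-then-regress workflow described above can be sketched in code. The sketch below is purely illustrative: the data are simulated, with glucose and fibrinogen standing in as two candidate risk factors, and the coefficients have no connection to the study's results. It fits a multivariate logistic regression by Newton-Raphson and reports odds ratios, where an odds ratio above 1 flags a factor as raising the risk of progression.

```python
import numpy as np

# Hypothetical illustration with simulated data; no values from the study.
rng = np.random.default_rng(0)
n = 400
glucose = rng.normal(6.0, 1.5, n)          # fasting blood glucose, mmol/L
fibrinogen = rng.normal(3.0, 0.8, n)       # plasma fibrinogen, g/L
# Simulate the outcome so that both factors raise the odds of progression.
logit = -8.0 + 0.8 * glucose + 0.9 * fibrinogen
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

X = np.column_stack([np.ones(n), glucose, fibrinogen])
beta = np.zeros(3)
for _ in range(15):                        # Newton-Raphson (IRLS) fit
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    W = p * (1.0 - p)
    beta += np.linalg.solve(X.T @ (X * W[:, None]), X.T @ (y - p))

odds_ratios = np.exp(beta[1:])             # OR > 1 flags a risk factor
print(np.round(odds_ratios, 2))
```

With both simulated effects positive, the recovered odds ratios come out above 1, mirroring how the study's significant univariate factors would appear in the multivariate model.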

  11. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis
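The corridor population estimate described above can be sketched without a full GIS: reduce each census block to a centroid point carrying a population count, then sum the blocks whose centroid lies within the corridor half-width of the route polyline. All coordinates and populations below are invented, not TIGER data.

```python
import math

# Toy corridor-population estimate (the paper uses TIGER block coverages in
# a GIS; here each block is a hypothetical centroid point with a population).
def dist_to_segment(p, a, b):
    """Shortest distance from point p to segment a-b (coordinates in miles)."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def corridor_population(route, blocks, half_width):
    """Sum block populations whose centroid is within half_width of the route."""
    total = 0
    for (x, y), pop in blocks:
        if min(dist_to_segment((x, y), a, b) for a, b in zip(route, route[1:])) <= half_width:
            total += pop
    return total

route = [(0, 0), (10, 0), (20, 5)]                                # highway polyline
blocks = [((1, 0.3), 120), ((5, 4.0), 300), ((12, 1.5), 80), ((19, 4.8), 50)]
print(corridor_population(route, blocks, 0.5))    # narrow 0.5-mile corridor -> 250
print(corridor_population(route, blocks, 20.0))   # wide 20-mile corridor -> 550
```

Sweeping `half_width` from 0.5 to 20 miles reproduces, in miniature, the paper's sensitivity of the population estimate to corridor width.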

  12. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  13. Liver Ultrasound Image Analysis using Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Smriti Sahu, Maheedhar Dubey, Mohammad Imroze Khan

    2012-12-01

    Full Text Available Liver cancer is the sixth most common malignant tumour and the third most common cause of cancer-related deaths worldwide. Chronic liver damage affects up to 20% of our population. It has many causes - viral infections (Hepatitis B and C), toxins, genetic, metabolic and autoimmune diseases. The rate of liver cancer in Australia has increased four-fold in the past 20 years. For detection and qualitative diagnosis of liver diseases, ultrasound (US) imaging is an easy-to-use and minimally invasive imaging modality. Medical images are often deteriorated by noise due to various sources of interference and other phenomena, known as speckle noise. It is therefore necessary to apply digital image processing techniques for smoothing or suppression of speckle noise in ultrasound images. This paper undertakes a study of three image enhancement techniques: the shock filter, Contrast Limited Adaptive Histogram Equalization (CLAHE) and spatial filtering. These smoothing techniques are compared using the performance metrics Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE). It has been observed that the spatial high-pass filter gives better performance than the others for liver ultrasound image analysis.
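The MSE and PSNR metrics used to rank the filters are straightforward to compute. The sketch below applies them to a synthetic smooth "phantom" corrupted with multiplicative speckle; the 3x3 mean filter is only a stand-in for the smoothing stage, not one of the paper's three filters.

```python
import numpy as np

# MSE/PSNR comparison on a synthetic speckled image (illustrative only).
def mse(ref, img):
    return float(np.mean((ref - img) ** 2))

def psnr(ref, img, peak=255.0):
    m = mse(ref, img)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 2.0 * np.pi, 64)
clean = 128.0 + 60.0 * np.outer(np.sin(x), np.cos(x))            # smooth phantom
# Speckle is modeled as multiplicative Gaussian noise, clipped to 8-bit range.
speckled = np.clip(clean * rng.normal(1.0, 0.2, clean.shape), 0.0, 255.0)

pad = np.pad(speckled, 1, mode="edge")                           # 3x3 mean filter
smoothed = sum(pad[i:i + 64, j:j + 64] for i in range(3) for j in range(3)) / 9.0

print(f"speckled: PSNR={psnr(clean, speckled):.1f} dB, MSE={mse(clean, speckled):.0f}")
print(f"smoothed: PSNR={psnr(clean, smoothed):.1f} dB, MSE={mse(clean, smoothed):.0f}")
```

A higher PSNR (lower MSE) after filtering is exactly the criterion by which the paper judges one enhancement technique better than another.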

  14. Advanced analysis techniques for uranium assay

    International Nuclear Information System (INIS)

    Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and on empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.
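The traditional calibration-curve method that the multiplicity technique is compared against amounts to fitting the doubles count rate against the 235U mass of known standards and inverting the fit for an unknown item. The sketch below shows that step with invented rates and masses; a linear calibration is assumed purely for illustration.

```python
import numpy as np

# Traditional calibration-curve step: fit doubles rate vs 235U mass on
# standards, then invert for an unknown. All numbers are invented.
masses = np.array([50.0, 100.0, 200.0, 400.0])       # 235U standards (g)
doubles = np.array([12.0, 25.5, 54.0, 118.0])        # measured doubles rates (1/s)
a, b = np.polyfit(masses, doubles, 1)                # linear calibration D = a*m + b

unknown_doubles = 80.0                               # rate measured on unknown item
m_est = (unknown_doubles - b) / a                    # invert the calibration
print(f"estimated 235U mass: {m_est:.0f} g")
```

The bias the abstract warns about enters precisely here: if the unknown item's coupling or geometry differs from the standards', the fitted curve no longer applies, which is what active multiplicity counting is designed to relax.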

  15. Performance Analysis of Acoustic Echo Cancellation Techniques

    Directory of Open Access Journals (Sweden)

    Rajeshwar Dass

    2014-07-01

    Full Text Available Mainly, adaptive filters are implemented in the time domain, which works efficiently in most applications. But in many applications the impulse response becomes too large, which increases the complexity of the adaptive filter beyond a level where it can no longer be implemented efficiently in the time domain. An example of where this can happen is acoustic echo cancellation (AEC). So there exists an alternative solution, i.e. to implement the filters in the frequency domain. AEC has many applications in a wide variety of problems in industrial operations, manufacturing and consumer products. Here in this paper, a comparative analysis of different acoustic echo cancellation techniques, i.e. the Frequency Domain Adaptive Filter (FDAF), Least Mean Square (LMS), Normalized Least Mean Square (NLMS) and Sign Error (SE) algorithms, is presented. The results are compared for different values of step size, and the performance of these techniques is measured in terms of Echo Return Loss Enhancement (ERLE), Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR).
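Of the compared techniques, NLMS is the simplest to sketch end to end. The toy below adapts an NLMS filter against a synthetic echo path and reports ERLE; the filter length, step size `mu`, and signals are illustrative choices, not the paper's experimental setup.

```python
import numpy as np

# Minimal NLMS echo canceller on synthetic signals (illustrative only).
rng = np.random.default_rng(2)
N, L, mu, eps = 4000, 16, 0.5, 1e-8
x = rng.normal(size=N)                               # far-end (loudspeaker) signal
h = rng.normal(size=L) * np.exp(-np.arange(L) / 4)   # unknown echo path
d = np.convolve(x, h)[:N]                            # echo picked up by microphone

w = np.zeros(L)                                      # adaptive filter weights
err = np.zeros(N)
for n in range(L, N):
    u = x[n - L + 1:n + 1][::-1]                     # most recent L input samples
    err[n] = d[n] - w @ u                            # residual echo after cancellation
    w += mu * err[n] * u / (u @ u + eps)             # normalized LMS update

# ERLE: echo power vs residual-echo power after convergence, in dB.
erle = 10.0 * np.log10(np.mean(d[-1000:] ** 2) / np.mean(err[-1000:] ** 2))
print(f"ERLE after adaptation: {erle:.1f} dB")
```

Rerunning with different `mu` values reproduces the paper's central comparison: larger steps converge faster but can raise the steady-state MSE.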

  16. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  17. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    This paper presents the different steps of the radio-labelling of a molecule for two purposes: the radio-immuno-analysis and the auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and obtention of labelled molecules: gamma emitters (125I, 57Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodation, tritium); purification of the labelled compound (dialysis, gel-filtering or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  18. Development and application of the electrochemical etching technique. Annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    1980-08-01

    This annual progress report documents further advances in the development and application of electrochemical etching of polycarbonate foils (ECEPF) for fast, intermediate, and thermal neutron dosimetry as well as alpha particle dosimetry. The fast (> 1.1 MeV) and thermal neutron dosimetry techniques were applied to a thorough investigation of the neutron contamination inherent in and about the primary x-ray beam of several medical therapy electron accelerators. Because of the small size of ECEPF dosimeters in comparison to other neutron meters, they have an unusually low perturbation of the radiation field under measurement. Due to this small size and the increased sensitivity of the ECEPF dosimeter over current techniques of measuring neutrons in a high photon field, the fast neutron contamination in the primary x-ray beam of all the investigated accelerators was measured with precision and found to be greater than that suggested by the other, more common, neutron dosimetry methods.

  19. Development and application of the electrochemical etching technique. Annual progress report

    International Nuclear Information System (INIS)

    This annual progress report documents further advances in the development and application of electrochemical etching of polycarbonate foils (ECEPF) for fast, intermediate, and thermal neutron dosimetry as well as alpha particle dosimetry. The fast (> 1.1 MeV) and thermal neutron dosimetry techniques were applied to a thorough investigation of the neutron contamination inherent in and about the primary x-ray beam of several medical therapy electron accelerators. Because of the small size of ECEPF dosimeters in comparison to other neutron meters, they have an unusually low perturbation of the radiation field under measurement. Due to this small size and the increased sensitivity of the ECEPF dosimeter over current techniques of measuring neutrons in a high photon field, the fast neutron contamination in the primary x-ray beam of all the investigated accelerators was measured with precision and found to be greater than that suggested by the other, more common, neutron dosimetry methods

  20. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  1. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  2. Progress of evaluation techniques for electromagnetic and mechanical properties of high temperature composite superconductors

    International Nuclear Information System (INIS)

    Remarkable progress in the development of high temperature superconductors (HTS) such as BSCCO-2223 tapes and YBCO coated conductors has been achieved in recent years, with very high engineering critical current densities (Je) reached in long conductor lengths. It is, however, necessary to realize simultaneously high strain tolerance of Je, low AC losses and high mechanical strength in order to apply them for practical uses. In the first part of the present review, some critical techniques to improve microstructures for achieving the total performance of BSCCO tapes as well as YBCO coated conductors are suggested. In the major part, the recent progress of evaluation techniques for mechano-electromagnetic properties is introduced. HTS conductors are typical composite materials consisting of essentially five components. Here an analytical technique is proposed to clarify the mechanical properties based on the rule of mixtures, while a quantitative experimental method to measure tensile properties is introduced. The critical current is very sensitive to strain, and its strain dependency can be divided into two regions. In the reversible region, the critical current decreases monotonously for BSCCO tapes. On the other hand, YBCO coated conductors show the so-called Ekin intrinsic behavior, where a maximum of critical current appears during the process of increasing tensile strain. In order to understand fully the strain dependence of critical current, it is absolutely necessary to elucidate the strain state exerted on the superconducting component in the composite. Recently, direct measurements of local strain have succeeded by means of diffraction techniques using neutron and synchrotron radiation. Their interesting results, including new science, are reported in the present review. (author)
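The rule-of-mixtures analysis mentioned above estimates a composite property as the volume-fraction-weighted sum of the component properties. The sketch below computes an axial Young's modulus this way; the component names, fractions and moduli are illustrative numbers, not data from the review.

```python
# Rule-of-mixtures (iso-strain, Voigt) estimate of the axial stiffness of a
# composite conductor. All fractions and moduli are illustrative only.
components = {               # name: (volume fraction, Young's modulus in GPa)
    "superconducting filaments": (0.30, 80.0),
    "silver matrix":             (0.55, 76.0),
    "alloy sheath":              (0.15, 110.0),
}
assert abs(sum(f for f, _ in components.values()) - 1.0) < 1e-9  # fractions sum to 1

E_axial = sum(f * E for f, E in components.values())
print(f"rule-of-mixtures axial modulus: {E_axial:.1f} GPa")      # -> 82.3 GPa
```

The same weighted sum, applied component by component, is how the review's analytical technique decomposes the measured tensile response of a five-component tape.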

  3. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations of each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations reported. In this work, the non-destructive IBA technique PIXE is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees relative to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  4. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  5. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing 'always connected' world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations, such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the impact of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  6. Applications of remote sensing techniques to the assessment of dam safety: A progress report

    International Nuclear Information System (INIS)

    Remote sensing detection and data collection techniques, combined with data from image analyses, have become effective tools that can be used for rapid identification, interpretation and evaluation of the geological and environmental information required in some areas of performance analysis of hydraulic dams. Potential geological hazards to dams such as faults, landslides and liquefaction, regional crustal warping or tilting, stability of foundation materials, flooding and volcanic hazards are applications in which remote sensing may aid analysis. Details are presented of remote sensing techniques, optimal times of data acquisition, interpreting techniques, and applications. Techniques include LANDSAT thematic mapper (TM), SPOT images, thermal infrared scanning, colour infrared photography, normal colour photography, panchromatic black and white, normal colour video, infrared video, airborne multi-spectral electronic imagery, airborne synthetic aperture radar, side scan sonar, and LIDAR (optical radar). 3 tabs

  7. Vector Autoregressive Techniques for Structural Analysis

    Directory of Open Access Journals (Sweden)

    Paul L. Fackler

    1988-03-01

    Full Text Available Vector Autoregressive (VAR) models which do not rely on a recursive model structure are discussed. Linkages to traditional dynamic simultaneous equations models are developed which emphasize the nature of the identifying restrictions that characterize VAR models. Explicit expressions for the Score and Information functions are derived, and their role in model identification, estimation and hypothesis testing is discussed.
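Structural VAR analysis starts from an estimated reduced form, and that estimation step is easy to sketch. The example below simulates a bivariate VAR(1), y_t = A y_{t-1} + e_t, and recovers the coefficient matrix by equation-by-equation least squares; the generating matrix and sample size are arbitrary choices for illustration.

```python
import numpy as np

# Reduced-form estimation of a simulated bivariate VAR(1) (illustrative).
rng = np.random.default_rng(3)
A_true = np.array([[0.5, 0.1],
                   [0.2, 0.4]])            # stable: eigenvalues 0.6 and 0.3
T = 2000
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + rng.normal(scale=0.1, size=2)

Y, X = y[1:], y[:-1]                       # regress y_t on y_{t-1}
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T   # OLS, equation by equation
resid = Y - X @ A_hat.T
Sigma_hat = resid.T @ resid / (T - 1)      # reduced-form error covariance
print(np.round(A_hat, 2))
```

The identifying restrictions the paper discusses enter afterwards, when `Sigma_hat` is decomposed to map reduced-form errors into structural shocks.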

  8. Modern Analysis Techniques for Spectroscopic Binaries

    CERN Document Server

    Hensberge, H

    2006-01-01

    Techniques to extract information from spectra of unresolved multi-component systems are reviewed, with emphasis on recent developments and practical aspects. We review the cross-correlation techniques developed to deal with such spectra, discuss the determination of the broadening function, and compare techniques to reconstruct component spectra. Recent results obtained by separating or disentangling the component spectra are summarized. An evaluation is made of possible indeterminacies and of random and systematic errors in the component spectra.

  9. Analysis Methods for Progressive Damage of Composite Structures

    Science.gov (United States)

    Rose, Cheryl A.; Davila, Carlos G.; Leone, Frank A.

    2013-01-01

    This document provides an overview of recent accomplishments and lessons learned in the development of general progressive damage analysis methods for predicting the residual strength and life of composite structures. These developments are described within their State-of-the-Art (SoA) context and the associated technology barriers. The emphasis of the authors is on developing these analysis tools for application at the structural level. Hence, modeling of damage progression is undertaken at the mesoscale, where the plies of a laminate are represented as a homogeneous orthotropic continuum. The aim of the present effort is to establish the ranges of validity of available models, to identify technology barriers, and to establish the foundations of future investigation efforts. Such are the necessary steps towards accurate and robust simulations that can replace some of the expensive and time-consuming "building block" tests that are currently required for the design and certification of aerospace structures.

  10. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications-now expanded and revised.This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.This edi

  11. An Analysis of Pyramidal Image Fusion Techniques

    OpenAIRE

    Meek, T. R.

    1999-01-01

    This paper discusses the application of multiresolution image fusion techniques to synthetic aperture radar (SAR) and Landsat imagery. Results were acquired through the development and application of image fusion software to test images. The test images were fused using six image fusion techniques that are combinations of three types of image decomposition algorithms (ratio of low pass [RoLP] pyramids, gradient pyramids, and morphological pyramids) and two types of fusion algorithms (se...
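The decompose-combine-reconstruct pattern shared by all six techniques can be shown with a toy two-level pyramid. The sketch below is not one of the paper's algorithms: it uses 2x2 block-mean downsampling instead of a proper Gaussian kernel, a select-max rule on the detail level, and averaging on the coarse level, purely to illustrate the structure.

```python
import numpy as np

# Toy two-level pyramid fusion (illustrative; not the paper's algorithms).
def down(img):                      # 2x2 block-mean downsampling
    return img.reshape(img.shape[0] // 2, 2, img.shape[1] // 2, 2).mean(axis=(1, 3))

def up(img):                        # nearest-neighbour upsampling
    return img.repeat(2, axis=0).repeat(2, axis=1)

def fuse(a, b):
    la, lb = a - up(down(a)), b - up(down(b))            # detail (Laplacian-like) levels
    detail = np.where(np.abs(la) >= np.abs(lb), la, lb)  # select-max on detail
    base = (down(a) + down(b)) / 2.0                     # average the coarse level
    return up(base) + detail                             # reconstruct fused image

rng = np.random.default_rng(4)
a = rng.random((8, 8))              # stand-ins for co-registered SAR/Landsat patches
b = rng.random((8, 8))
fused = fuse(a, b)
print(fused.shape)                  # -> (8, 8)
```

A sanity check on any such scheme is that fusing an image with itself reconstructs the image, since the decomposition is invertible.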

  12. Recent Progresses in Nanobiosensing for Food Safety Analysis.

    Science.gov (United States)

    Yang, Tao; Huang, Huifen; Zhu, Fang; Lin, Qinlu; Zhang, Lin; Liu, Junwen

    2016-01-01

    With increasing adulteration, food safety analysis has become an important research field. Nanomaterials-based biosensing holds great potential in designing highly sensitive and selective detection strategies necessary for food safety analysis. This review summarizes various function types of nanomaterials, the methods of functionalization of nanomaterials, and recent (2014-present) progress in the design and development of nanobiosensing for the detection of food contaminants including pathogens, toxins, pesticides, antibiotics, metal contaminants, and other analytes, which are sub-classified according to various recognition methods of each analyte. The existing shortcomings and future perspectives of the rapidly growing field of nanobiosensing addressing food safety issues are also discussed briefly. PMID:27447636

  13. Multiuser detection and independent component analysis-Progress and perspective

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The latest progress in the multiuser detection and independent component analysis (ICA) is reviewed systematically. Then two novel classes of multiuser detection methods based on ICA algorithms and feedforward neural networks are proposed. Theoretical analysis and computer simulation show that ICA algorithms are effective to detect multiuser signals in code-division multiple-access (CDMA) system. The performances of these methods are not identical entirely in various channels, but all of them are robust, efficient, fast and suitable for real-time implementations.

  14. Recent Progresses in Nanobiosensing for Food Safety Analysis

    Science.gov (United States)

    Yang, Tao; Huang, Huifen; Zhu, Fang; Lin, Qinlu; Zhang, Lin; Liu, Junwen

    2016-01-01

    With increasing adulteration, food safety analysis has become an important research field. Nanomaterials-based biosensing holds great potential in designing highly sensitive and selective detection strategies necessary for food safety analysis. This review summarizes various function types of nanomaterials, the methods of functionalization of nanomaterials, and recent (2014–present) progress in the design and development of nanobiosensing for the detection of food contaminants including pathogens, toxins, pesticides, antibiotics, metal contaminants, and other analytes, which are sub-classified according to various recognition methods of each analyte. The existing shortcomings and future perspectives of the rapidly growing field of nanobiosensing addressing food safety issues are also discussed briefly. PMID:27447636

  15. Cochlear implant simulator for surgical technique analysis

    Science.gov (United States)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  16. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation of 40K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  17. Gene expression analysis of relapsing–remitting, primary progressive and secondary progressive multiple sclerosis

    DEFF Research Database (Denmark)

    Ratzer, R; Søndergaard, Helle Bach; Christensen, Jeppe Romme; Börnsen, Lars Svend; Borup, Rasmus; Sørensen, Per Soelberg; Sellebjerg, Finn Thorup

    2013-01-01

    Previous studies of multiple sclerosis (MS) have indicated differences in the pathogenesis of relapsing-remitting (RRMS), secondary progressive (SPMS) and primary progressive (PPMS) disease.

  18. Orthokeratology to control myopia progression: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Yuan Sun

    Full Text Available To evaluate the clinical treatment effects of orthokeratology in slowing the progression of myopia. Several well-designed controlled studies have investigated the effects of orthokeratology in school-aged children; we conducted this meta-analysis to better evaluate the existing evidence. Relevant studies were identified in the Medline and Embase databases without language limitations. The main outcomes included axial length and vitreous chamber depth, reported as the mean ± standard deviation. The results were pooled and assessed with a fixed-effects model analysis. Subgroup analyses were performed according to geographical location and study design. Of the seven eligible studies, all reported axial length changes after 2 years, while two studies reported vitreous chamber depth changes. The pooled estimates indicated that the change in axial length in the ortho-k group was 0.27 mm (95% confidence interval [CI]: 0.22, 0.32) less than in the control group. Myopic progression was reduced by approximately 45%. The combined results revealed that the difference in vitreous chamber depth between the two groups was 0.22 mm (95% CI: 0.14, 0.31). None of the studies reported severe adverse events. The overall findings suggest that ortho-k can slow myopia progression in school-aged children.
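    The fixed-effects pooling used in such meta-analyses can be sketched with inverse-variance weighting. The per-study effects and standard errors below are hypothetical illustrations, not this meta-analysis's data:

```python
# Inverse-variance fixed-effect pooling of per-study mean differences.
# Study effects (mm) and standard errors below are hypothetical.

def fixed_effect_pool(effects, std_errors):
    """Return (pooled effect, pooled standard error)."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = (1.0 / sum(weights)) ** 0.5
    return pooled, pooled_se

# Hypothetical 2-year axial-length differences (ortho-k minus control)
effects = [0.25, 0.30, 0.27]   # mm
ses = [0.04, 0.05, 0.03]       # per-study standard errors
est, se = fixed_effect_pool(effects, ses)
ci = (est - 1.96 * se, est + 1.96 * se)   # approximate 95% CI
```

    With real study data, the pooled standard error is what yields the 95% confidence intervals reported above.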

  19. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    Science.gov (United States)

    Mirizzi, F.

    2014-02-01

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to ensure satisfactory performance and lifetime. In the design of experimental facilities such as tokamaks, which mainly aim at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed to deliver electrical energy steadily to commercial grids, so RAMI aspects will assume absolute relevance from the initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in the paper.
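    The availability side of a RAMI analysis typically starts from per-component MTBF/MTTR figures combined by a series rule. A minimal sketch, with entirely hypothetical numbers (not DEMO design data):

```python
# Steady-state (inherent) availability from MTBF/MTTR, and the series
# combination rule for a chain of subsystems. All figures hypothetical.

def availability(mtbf, mttr):
    """Inherent availability of one repairable component."""
    return mtbf / (mtbf + mttr)

def series_availability(components):
    """A series system is up only when every component is up."""
    a = 1.0
    for mtbf, mttr in components:
        a *= availability(mtbf, mttr)
    return a

# Hypothetical (MTBF, MTTR) pairs in hours for three subsystems
subsystems = [(5000.0, 24.0), (2000.0, 48.0), (8000.0, 12.0)]
a_sys = series_availability(subsystems)
```

    In a real RAMI study the series rule is only a first cut; redundancy and maintenance strategies change the combination logic.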

  20. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method incorporating edge detection, Markov Random Fields (MRF), and watershed segmentation and merging techniques is presented for performing image segmentation and edge detection. An edge detection technique is first applied to obtain a Difference In Strength (DIS) map, computed for each pixel to mark all edges (weak or strong) in the image; this map serves as prior knowledge about likely region boundaries for the subsequent MRF step. An initial segmentation is obtained using K-means clustering with the minimum-distance rule. The region process is then modeled by an MRF, in which the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels, yielding an image that contains both edge and region information for regions of different intensity. Gradient values are calculated and the segmentation is refined with the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated, and the final edge map is obtained by a merging process based on averaged intensity means. Common edge detectors applied to the MRF-segmented image are used for comparison. The result is one closed boundary per actual region in the image.
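    The gradient computation underlying a DIS-style edge map can be sketched with a Sobel operator; this is a generic illustration in pure Python, not the authors' exact implementation:

```python
# Edge-strength map via Sobel gradient magnitudes on a tiny grayscale
# image (list of lists); interior pixels only, no libraries needed.

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def gradient_magnitude(img):
    """Return per-pixel gradient magnitude (borders left at 0)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# 6x6 test image: dark left half, bright right half -> vertical edge
img = [[0, 0, 0, 9, 9, 9] for _ in range(6)]
dis = gradient_magnitude(img)
```

    Thresholding such a map into weak and strong edges is what provides the prior knowledge the MRF step consumes.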

  1. Recent progress of surface analysis (AES, XPS, and TOF-SIMS) and their application to corrosion analysis

    International Nuclear Information System (INIS)

    Auger Electron Spectroscopy (AES), X-ray Photoelectron Spectroscopy (XPS), and Time-of-Flight Secondary Ion Mass Spectrometry (TOF-SIMS) are surface analysis techniques which provide atomic- and molecular-level surface chemical information. They are widely used for failure analysis, quality control, and research and development of advanced materials and devices. In this review, we overview the recent progress of the commercial apparatus, and also highlight their improved sensitivity and depth profiling capabilities. We also introduce their recent application in corrosion science. (author)

  2. Progressive Failure Analysis on the Single Lap Bonded Joints

    Directory of Open Access Journals (Sweden)

    Kadir TURAN

    2010-03-01

    Full Text Available In this study, the failure of the single-lap bonded joint, used to join two composite plates with an adhesive, is investigated experimentally and numerically. In the joint, epoxy resin is used as the adhesive, and four-layered carbon-fiber-reinforced epoxy composite plates are used as the adherends. The numerical study is performed in the ANSYS software, which uses the finite element method. Numerical failure loads are obtained by progressive failure analysis with material property degradation rules. In the failure analysis, the Hashin failure criterion is used for the composite plates and the maximum principal stress criterion for the adhesive. The effects of adhesive thickness, overlap length and plate width on the joint strength are investigated numerically. The results show that the failure load is affected by the bond face area, and are presented in graphs and tables.
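    The in-plane tensile modes of the Hashin criterion used for the composite plies can be sketched as follows; the stresses and strengths are hypothetical values, not the study's material data, and a full implementation would also cover the compressive modes:

```python
# Simplified 2D Hashin failure indices for a unidirectional ply.
# All stress and strength values below (MPa) are hypothetical.

def hashin_fiber_tension(s11, tau12, Xt, S):
    """Failure index >= 1.0 indicates fiber tensile failure."""
    return (s11 / Xt) ** 2 + (tau12 / S) ** 2

def hashin_matrix_tension(s22, tau12, Yt, S):
    """Failure index >= 1.0 indicates matrix tensile failure."""
    return (s22 / Yt) ** 2 + (tau12 / S) ** 2

fi_fiber = hashin_fiber_tension(s11=900.0, tau12=40.0, Xt=1500.0, S=80.0)
fi_matrix = hashin_matrix_tension(s22=35.0, tau12=40.0, Yt=40.0, S=80.0)
failed = fi_matrix >= 1.0   # the matrix mode governs in this example
```

    In a progressive failure analysis, each ply whose index reaches 1.0 has its corresponding stiffnesses degraded before the load step is repeated.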

  3. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  4. Hybrid chemical and nondestructive-analysis technique

    Energy Technology Data Exchange (ETDEWEB)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities.

  5. Hybrid chemical and nondestructive analysis technique

    International Nuclear Information System (INIS)

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  6. Statistical Analysis Techniques for Small Sample Sizes

    Science.gov (United States)

    Navard, S. E.

    1984-01-01

    The problem of small sample sizes encountered when analyzing space-flight data is examined. Because of the small amount of data available, careful analyses are essential to extract the maximum amount of information with acceptable accuracy. Statistical analysis of small samples is described. The background material necessary for understanding statistical hypothesis testing is outlined, and the various tests which can be done on small samples are explained. Emphasis is on the underlying assumptions of each test and on considerations needed to choose the most appropriate test for a given type of analysis.
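    A core tool for such small samples is the one-sample t test, whose statistic can be computed directly; the measurements below are hypothetical:

```python
# One-sample t statistic for a small sample (pure Python).
import math

def t_statistic(sample, mu0):
    """Return (t, degrees of freedom) for H0: mean == mu0."""
    n = len(sample)
    xbar = sum(sample) / n
    var = sum((x - xbar) ** 2 for x in sample) / (n - 1)  # unbiased
    return (xbar - mu0) / math.sqrt(var / n), n - 1

data = [5.1, 4.9, 5.3, 5.2, 4.8]   # hypothetical measurements
t, dof = t_statistic(data, mu0=5.0)
```

    The key assumption being emphasized above is that the underlying population is approximately normal; with n = 5 this cannot be verified from the data and must be justified externally.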

  7. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of the knowledge available at an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often required for making informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, so new technologies for text analysis are being sought. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  8. Progress Toward the Analysis of the Kinetic Stabilizer Concept

    Energy Technology Data Exchange (ETDEWEB)

    Post, R F; Byers, J A; Cohen, R H; Fowler, T K; Ryutov, D D; Tung, L S

    2005-02-08

    The Kinetic Stabilizer (K-S) concept [1] represents a means for stabilizing axisymmetric mirror and tandem-mirror (T-M) magnetic fusion systems against MHD interchange instability modes. Magnetic fusion research has given us examples of axisymmetric mirror confinement devices in which radial transport rates approach the classical 'Spitzer' level, i.e. situations in which turbulence, if present at all, is at too low a level to adversely affect the radial transport [2,3,4]. If such a low-turbulence condition could be achieved in a T-M system it could lead to a fusion power system that would be simpler, smaller, and easier to develop than one based on closed-field confinement, e.g., the tokamak, where the transport is known to be dominated by turbulence. However, since conventional axisymmetric mirror systems suffer from the MHD interchange instability, the key to exploiting this new opportunity is to find a practical way to stabilize this mode. The K-S represents one avenue to achieving this goal. The starting point for the K-S concept is a theoretical analysis by Ryutov [5]. He showed that a MHD-unstable plasma contained in an axisymmetric mirror cell can be MHD-stabilized by the presence of a low-density plasma on the expanding field lines outside the mirrors. If this plasma communicates well electrically with the plasma in the mirror cell, then this exterior plasma can stabilize the interior, confined plasma. This stabilization technique was conclusively demonstrated in the Gas Dynamic Trap (GDT) experiment [6] at Novosibirsk, Russia, at mirror-cell plasma beta values of 40 percent. The GDT operates in a high collisionality regime. Thus the effluent plasma leaking through the mirrors, though much lower in density than that of the confined plasma, is still high enough to satisfy the stabilization criterion. This would not, however, be the case in a fusion T-M with axisymmetric plug and central cell fields. In such a case the effluent plasma would be far

  9. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
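    The Monte Carlo parameter variation idea can be sketched with a toy unfold step (a single calibration scale factor standing in for the real unfold algorithm); all numbers are hypothetical:

```python
# Propagate a one-sigma Gaussian calibration error through a toy
# "unfold" by repeated sampling; report mean flux and spread.
import random
import statistics

random.seed(42)  # deterministic demo

def unfold(voltage, calibration):
    """Stand-in for the unfold algorithm: flux = voltage / calibration."""
    return voltage / calibration

voltage = 2.0                       # measured signal (arbitrary units)
cal_mean, cal_sigma = 0.5, 0.025    # calibration and its 1-sigma error

fluxes = [unfold(voltage, random.gauss(cal_mean, cal_sigma))
          for _ in range(1000)]
flux = statistics.mean(fluxes)
err = statistics.stdev(fluxes)      # one-sigma error bar estimate
```

    The real technique varies every channel's calibration and unfold parameter simultaneously over the thousand trial voltage sets, but the statistical machinery is the same.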

  10. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  11. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia, from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  12. Methodological progresses in Markovian availability analysis and applications

    International Nuclear Information System (INIS)

    The Markovian model applied to reliability analysis is well known as an effective tool whenever dependencies affect the probabilistic behaviour of a system's components. Its ability to follow the dynamical evolution of systems allows human actions to be included in the temporal evolution (inspections and maintenance, including human failure probabilities). The starting point has been the Sstagen-Mmarela code. Although this code already makes much progress towards reducing the size of Markovian matrices (by merging Markov processes of systems exhibiting symmetries), there is still an imperative need to reduce memory requirements. This implies, as a first step of any realistic analysis, a modularization of the studied system into subsystems, which can then be 'coupled'. The methodology is applied to the auxiliary feedwater injection of Doel 3. (orig./HSCH)
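    For a single repairable component, the simplest Markov availability model has two states (up/down) with failure rate λ and repair rate μ; the steady-state balance equation π_up·λ = π_down·μ gives availability in closed form. The rates below are hypothetical:

```python
# Two-state (up/down) continuous-time Markov availability model.
# Balance: pi_up * lam = pi_down * mu, with pi_up + pi_down = 1.

def steady_state_availability(lam, mu):
    """Long-run probability of being in the 'up' state."""
    return mu / (lam + mu)

# Hypothetical rates per hour
lam = 1.0 / 1000.0   # one failure per 1000 h on average
mu = 1.0 / 10.0      # 10 h mean repair time
a = steady_state_availability(lam, mu)
```

    Real systems of the kind analyzed above couple many such states, which is exactly why the matrix-size and memory issues discussed in the abstract arise.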

  13. Psychoanalytic technique and 'analysis terminable and interminable'.

    Science.gov (United States)

    Sandler, J

    1988-01-01

    Some of the implications for psychoanalytic technique of the papers given at the plenary sessions of the Montreal Congress are considered. Emphasis is placed on the role of affects in development and in current psychic functioning. Motivation for unconscious wishes arises from many sources, and affects should not only be thought of as drive derivatives. There is a substantial gap between the (largely) implicit clinico-technical theories in the analytic work presented, which do in fact show great sensitivity to the patients' affects, and the formal 'official' general psychoanalytic theory used. This discrepancy in our theories should be faced. Freud's tripartite structural theory of the mind (the 'second topography') seems now to have limitations for clinical purposes. PMID:3063676

  14. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received considerable attention in the past few years. It is well known that the cross-correlation function between measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a global time-domain operational modal identification scheme and a frequency-domain scheme that work from output-only data by coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
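    The cross-correlation estimate at the heart of such output-only identification can be sketched as follows, using a synthetic damped sinusoid in place of measured responses:

```python
# Unbiased cross-correlation estimate between two response channels.
# For white-noise-excited linear systems it has the same decaying
# sinusoidal form as the impulse response. Signals here are synthetic.
import math

def cross_correlation(x, y, max_lag):
    """R_xy[k] = (1/(N-k)) * sum_n x[n]*y[n+k], for k = 0..max_lag."""
    n = len(x)
    return [sum(x[i] * y[i + k] for i in range(n - k)) / (n - k)
            for k in range(max_lag + 1)]

# Noise-free damped sinusoid: one "mode" at 5 Hz, sampled at 100 Hz
fs, f, decay = 100.0, 5.0, 0.5
x = [math.exp(-decay * t / fs) * math.sin(2 * math.pi * f * t / fs)
     for t in range(500)]
r = cross_correlation(x, x, max_lag=100)
```

    The correlation sequence oscillates at the modal frequency (period 20 samples here) while decaying, which is why it can be fed directly into conventional time-domain modal estimators.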

  15. Accelerator based techniques for aerosol analysis

    International Nuclear Information System (INIS)

    At the 3 MV Tandetron accelerator of the LABEC laboratory of INFN (Florence, Italy) an external beam facility is fully dedicated to PIXE-PIGE measurements of elemental composition of atmospheric aerosols. Examples regarding recent monitoring campaigns, performed in urban and remote areas, both on a daily basis and with high time resolution, as well as with size selection, will be presented. It will be evidenced how PIXE can provide unique information in aerosol studies or can play a complementary role to traditional chemical analysis. Finally a short presentation of 14C analysis of the atmospheric aerosol by Accelerator Mass Spectrometry (AMS) for the evaluation of the contributions from either fossil fuel combustion or modern sources (wood burning, biogenic activity) will be given. (author)

  16. Comparison of Commonly Used Accident Analysis Techniques for Manufacturing Industries

    Directory of Open Access Journals (Sweden)

    IRAJ MOHAMMADFAM

    2015-10-01

    Full Text Available The adverse consequences of major accident events have led to the development of accident analysis techniques to investigate accidents thoroughly. However, each technique has its own advantages and shortcomings, which makes it very difficult to find a single technique capable of analyzing all types of accidents. Comparing accident analysis techniques therefore helps reveal their capabilities in different circumstances so that the most suitable one can be chosen. In this research, the techniques CBA and AABF were compared with Tripod β in order to determine the superior technique for the analysis of major accidents in manufacturing industries. In the first step, the comparison criteria were developed using the Delphi method. Afterwards, the relative importance of each criterion was determined qualitatively, and the qualitative values were then converted to quantitative values using fuzzy triangular numbers. Finally, TOPSIS was used to prioritize the techniques in terms of the preset criteria. The results of the study showed that Tripod β is superior to CBA and AABF. It is highly recommended to compare all available accident analysis techniques against proper criteria in order to select the best one, since an improper choice of accident analysis technique may lead to misguided results.
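    The TOPSIS ranking step can be sketched as below; the decision matrix and weights are hypothetical, not the study's Delphi/fuzzy data, and all criteria are treated as benefit criteria for simplicity:

```python
# Minimal TOPSIS: rank alternatives by closeness to the ideal solution.
# Decision matrix rows = alternatives, columns = benefit criteria.
import math

def topsis(matrix, weights):
    """Return closeness scores in [0, 1]; higher is better."""
    n_crit = len(matrix[0])
    # Vector-normalize each column, then apply criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(n_crit)]
    v = [[weights[j] * row[j] / norms[j] for j in range(n_crit)]
         for row in matrix]
    ideal = [max(col) for col in zip(*v)]
    anti = [min(col) for col in zip(*v)]
    scores = []
    for row in v:
        d_pos = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ideal)))
        d_neg = math.sqrt(sum((a - b) ** 2 for a, b in zip(row, anti)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Rows: Tripod beta, CBA, AABF; three hypothetical criteria
scores = topsis([[9, 8, 7], [6, 7, 5], [5, 6, 6]],
                weights=[0.5, 0.3, 0.2])
best = scores.index(max(scores))   # index 0 corresponds to Tripod beta
```

    The study's full procedure additionally converts qualitative Delphi judgments to weights via fuzzy triangular numbers before this ranking step.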

  17. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  18. ANALYSIS AND COMPARATIVE STUDY OF SEARCHING TECHNIQUES

    OpenAIRE

    Yuvraj Singh Chandrawat*

    2015-01-01

    We live in the age of technology, and it is quite obvious that it is advancing endlessly day by day. In this technical era, researchers are focusing on the development of existing technologies. Software engineering is the dominant branch of computer science that deals with the development and analysis of software. The objective of this study is to analyze and compare the existing searching algorithms (linear search and binary search). In this paper, we will discuss both thes...
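    The two algorithms compared in the study can be sketched side by side, with comparison counts making the O(n) versus O(log n) difference concrete:

```python
# Linear vs. binary search on a sorted list, counting comparisons.

def linear_search(seq, target):
    """Return (index or -1, number of comparisons made)."""
    for i, v in enumerate(seq):
        if v == target:
            return i, i + 1
    return -1, len(seq)

def binary_search(seq, target):
    """Requires sorted input; return (index or -1, comparisons)."""
    lo, hi, comps = 0, len(seq) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comps += 1
        if seq[mid] == target:
            return mid, comps
        if seq[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comps

data = list(range(0, 2000, 2))   # 1000 sorted even numbers
i_lin, c_lin = linear_search(data, 1998)
i_bin, c_bin = binary_search(data, 1998)
```

    For this worst-case target, linear search needs 1000 comparisons while binary search needs about log2(1000) ≈ 10, which is the gap the paper analyzes.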

  19. DNA ANALYSIS OF RICIN USING RAPD TECHNIQUE

    OpenAIRE

    Martin Vivodík; Želmíra Balážová; Zdenka Gálová

    2014-01-01

    Castor (Ricinus communis L.) is an important plant for production of industrial oil. The systematic evaluation of the molecular diversity encompassed in castor inbreds or parental lines offers an efficient means of exploiting the heterosis in castor as well as for management of biodiversity. The aim of this work was to detect genetic variability among the set of 30 castor genotypes using 5 RAPD markers. Amplification of genomic DNA of 30 genotypes, using RAPD analysis, yielded 35 fragments, w...

  20. LIFECYCLE ANALYSIS AS THE CORPORATE ENVIRONMENTAL RESPONSIBILITY ASSESSMENT TECHNIQUE

    OpenAIRE

    Bojan Krstic, Milica Tasic, Vladimir Ivanovic

    2015-01-01

    Lifecycle analysis is one of the techniques for assessing the impact of enterprise on the environment, by monitoring environmental effects of the product along its lifecycle. Since the cycle can be seen in stages (extraction of raw materials, raw materials processing, final product production, product use and end of use of the product), the analysis can be applied to all or only some parts of the aforementioned cycle, hence the different variants of this technique. The analysis itself is defi...

  1. PROGRESS IN PROTEOME ANALYTICAL TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    解建勋; 蒲小平; 李玉珍; 李长龄

    2001-01-01

    The proteome represents the protein pattern of a species, an organism, a cell, an organelle, or even a body fluid, determined quantitatively at a certain moment and under precisely defined conditions. Proteome research techniques are important tools in the post-genome era. Quantitative separation and analysis of the proteins in a proteome involve many techniques, including sample preparation, two-dimensional (2D) gel electrophoresis, capillary electrophoresis, chromatographic techniques, mass spectrometry, and so on. 2D gel electrophoresis is currently a quite mature method that provides enough separation space for several thousand components and can separate protein mixtures within a few hours. The combined use of various analysis techniques and the automation of instrumentation will be the trend in this field.

  2. Multivariate techniques of analysis for ToF-E recoil spectrometry data

    Energy Technology Data Exchange (ETDEWEB)

    Whitlow, H.J.; Bouanani, M.E.; Persson, L.; Hult, M.; Jonsson, P.; Johnston, P.N. [Lund Institute of Technology, Solvegatan, (Sweden), Department of Nuclear Physics; Andersson, M. [Uppsala Univ. (Sweden). Dept. of Organic Chemistry; Ostling, M.; Zaring, C. [Royal institute of Technology, Electrum, Kista, (Sweden), Department of Electronics; Johnston, P.N.; Bubb, I.F.; Walker, B.R.; Stannard, W.B. [Royal Melbourne Inst. of Tech., VIC (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Multivariate statistical methods are being developed by the Australian-Swedish Recoil Spectrometry Collaboration for quantitative analysis of the wealth of information in Time of Flight (ToF) and energy dispersive Recoil Spectrometry. An overview is presented of progress made in the use of multivariate techniques for energy calibration, separation of mass-overlapped signals and simulation of ToF-E data. 6 refs., 5 figs.

  3. Multivariate techniques of analysis for ToF-E recoil spectrometry data

    International Nuclear Information System (INIS)

    Multivariate statistical methods are being developed by the Australian-Swedish Recoil Spectrometry Collaboration for quantitative analysis of the wealth of information in Time of Flight (ToF) and energy dispersive Recoil Spectrometry. An overview is presented of progress made in the use of multivariate techniques for energy calibration, separation of mass-overlapped signals and simulation of ToF-E data. 6 refs., 5 figs

  4. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  5. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions

  6. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  7. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  8. Progress in thick-film pad printing technique for solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Hahne, P.; Hirth, E.; Reis, I.E. [Fraunhofer Institute for Solar Energy Systems ISE, Oltmannstr. 5, D-79100 Freiburg (Germany); Schwichtenberg, K. [Gebrueder Maerklin and Cie GmbH, Stuttgarter Str. 55-57, D-73033 Goeppingen (Germany); Richtering, W.; Horn, F.M. [Macromolecular Chemistry, Albert-Ludwigs-University Freiburg, Stefan-Meier-Str. 31, D-79104 Freiburg (Germany); Eggenweiler, U. [Kristallografisches Institut, Albert-Ludwigs-University Freiburg, Hebelstr. 25, D-79104 Freiburg (Germany)

    2001-01-01

    The aim of this work was to study the suitability of pad printing in connection with fine-line printing on solar cells. Pad printing is a kind of gravure offset printing technique that offers the possibility of simple, economic, and high-throughput production of fine lines down to 32 μm, even on uneven surfaces, which is not possible with traditional screen printing (Hahne et al., Proceedings of the Second World Conference on Photovoltaic Solar Energy Conversion, Vienna, 1998, p. 1646). The different inks and silicone rubber pads have been characterised by several methods, including thermal analysis and rheological, hardness, and surface-tension measurements. Simple solar cells on multicrystalline wafers with rapid thermal sintering show efficiencies of up to 13.4%.

  9. Key Point Based Data Analysis Technique

    Science.gov (United States)

    Yang, Su; Zhang, Yong

    In this paper, a new framework for data analysis based on the "key points" in data distribution is proposed. Here, the key points contain three types of data points: bridge points, border points, and skeleton points, where our main contribution is the bridge points. For each type of key points, we have developed the corresponding detection algorithm and tested its effectiveness with several synthetic data sets. Meanwhile, we further developed a new hierarchical clustering algorithm SPHC (Skeleton Point based Hierarchical Clustering) to demonstrate the possible applications of the key points acquired. Based on some real-world data sets, we experimentally show that SPHC performs better compared with several classical clustering algorithms including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.
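SPHC itself is not publicly specified beyond this abstract, but the baseline comparisons it cites are standard; a minimal sketch of the single-link versus complete-link hierarchical baselines on a toy two-blob data set (scipy, invented data, not the paper's benchmarks):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Two well-separated Gaussian blobs: a toy stand-in for the paper's
# synthetic data sets (SPHC itself is not reproduced here).
rng = np.random.default_rng(0)
blob_a = rng.normal(loc=0.0, scale=0.3, size=(50, 2))
blob_b = rng.normal(loc=5.0, scale=0.3, size=(50, 2))
data = np.vstack([blob_a, blob_b])

# Baseline methods named in the abstract: single-link and complete-link.
for method in ("single", "complete"):
    tree = linkage(data, method=method)
    labels = fcluster(tree, t=2, criterion="maxclust")
    # Both baselines recover the two blobs on this easy example.
    print(method, len(set(labels)))
```

On such well-separated data every baseline succeeds; the abstract's claim is that SPHC wins on harder, real-world distributions.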

  10. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; Moseley, S. H.; Porst, J.-P.; Porter, F. S.; Sadleir, J. E.; Smith, S. J.

    2016-07-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or when the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used a 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is a factor of two better than that achievable with digital optimal filters.
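As a rough illustration of the PCA step described (a toy simulation, not the authors' pipeline or data), mean-subtracted pulse records can be decomposed by SVD, and the projection onto the leading component tracks pulse height:

```python
import numpy as np

# Toy stand-in for TES pulse records: an exponential pulse template with
# per-record amplitude jitter and white noise (not the authors' real data).
rng = np.random.default_rng(1)
t = np.arange(512)
template = np.exp(-t / 100.0) - np.exp(-t / 10.0)   # generic pulse shape
amps = 1.0 + 0.05 * rng.standard_normal(200)        # "pulse height" variation
records = np.outer(amps, template) + 0.01 * rng.standard_normal((200, 512))

# PCA via SVD of the mean-subtracted record matrix: each record is
# decomposed onto orthogonal components ordered by variance.
mean_pulse = records.mean(axis=0)
u, s, vt = np.linalg.svd(records - mean_pulse, full_matrices=False)

# The projection onto the leading component tracks pulse height.
scores = (records - mean_pulse) @ vt[0]
corr = np.corrcoef(scores, amps)[0, 1]
print(abs(corr))  # close to 1: the first PC encodes pulse height
```

In the real analysis further components carry arrival time, shape changes, and temperature drift, which is exactly the extra descriptive information the abstract highlights.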

  11. Analysis and calibration techniques for superconducting resonators

    Science.gov (United States)

    Cataldo, Giuseppe; Wollack, Edward J.; Barrentine, Emily M.; Brown, Ari D.; Moseley, S. Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.

  12. Adhesive Characterization and Progressive Damage Analysis of Bonded Composite Joints

    Science.gov (United States)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2014-01-01

    The results of an experimental/numerical campaign aimed to develop progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.
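The DIC-to-cohesive-law reduction mentioned above follows a standard J-integral identity; in illustrative notation (δ* for the crack-tip opening displacement measured by DIC, σ for the adhesive traction; the symbols are generic, not taken from the paper):

```latex
J(\delta^{*}) = \int_{0}^{\delta^{*}} \sigma(\delta)\,\mathrm{d}\delta
\qquad\Longrightarrow\qquad
\sigma(\delta^{*}) = \frac{\partial J}{\partial \delta^{*}}
```

so numerically differentiating the measured J versus opening-displacement record yields the traction-separation (cohesive) law that feeds the finite element model.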

  13. Medical Image Analysis Using Unsupervised and Supervised Classification Techniques

    OpenAIRE

    Prof. V. Joseph Peter; Dr. M. Karnan

    2013-01-01

    The evolution of digital computers as well as the development of modern theories for learning and information processing leads to the emergence of Computational Intelligence (CI) engineering. Liver surgery remains a difficult challenge in which preoperative data analysis and strategy definition may play a significant role in the success of the procedure. Extraction of liver fibrosis is done using image enhancement with various filtering techniques, unsupervised clustering techniqu...

  14. COMPARATIVE STUDY OF CLUSTERING TECHNIQUES IN MULTIVARIATE DATA ANALYSIS

    OpenAIRE

    Sabba Ruhi; Md. Shamim Reza

    2015-01-01

    At present, clustering techniques are a standard tool in several exploratory pattern-analysis, grouping, decision-making, and machine-learning situations, including data mining, document retrieval, image segmentation, pattern recognition, and the field of artificial intelligence. In this study we have compared five different types of clustering techniques such as Fuzzy clustering, K-Means clustering, Hierarc...

  15. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and of neural networks to the e+jets channel.

  16. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically. The effici...
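The Random Decrement idea referenced above can be sketched in a few lines; this is the generic textbook form (level-crossing triggering followed by segment averaging), assumed here for illustration rather than taken from the thesis:

```python
import numpy as np

# Minimal sketch of the Random Decrement technique (assumed standard form,
# not the thesis's implementation): average signal segments that start
# whenever the response up-crosses a trigger level. Under broadband random
# excitation the average approximates a free decay of the structure.
rng = np.random.default_rng(2)
n = 20000
x = np.zeros(n)
# Toy lightly damped single-DOF response driven by white noise (AR(2) model).
f, zeta, dt = 2.0, 0.02, 0.01
w = 2 * np.pi * f
a1 = 2 * np.exp(-zeta * w * dt) * np.cos(w * dt * np.sqrt(1 - zeta**2))
a2 = -np.exp(-2 * zeta * w * dt)
e = rng.standard_normal(n)
for i in range(2, n):
    x[i] = a1 * x[i - 1] + a2 * x[i - 2] + e[i]

level = x.std()          # trigger: positive-slope crossings of one std
seg_len = 400
starts = [i for i in range(1, n - seg_len)
          if x[i - 1] < level <= x[i]]
rd = np.mean([x[i:i + seg_len] for i in starts], axis=0)
print(len(starts))  # rd starts near the trigger level, then decays
```

Modal frequencies and damping ratios are then extracted from `rd` as if it were a measured free-decay record, which is the connection to modal parameters the thesis develops.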

  17. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. The methods to be developed are bulk analysis and particle analysis. In the bulk analysis, an Inductively Coupled Plasma Mass Spectrometer or a Thermal Ionization Mass Spectrometer is used to measure nuclear materials after chemical treatment of the sample. In the particle analysis, an Electron Probe Micro Analyzer and a Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out, and construction will be completed by the end of March 2001. (author)

  18. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    OpenAIRE

    Mohammed naved Khan

    2013-01-01

    Academic and business researchers have for long debated on the most appropriate data analysis techniques that can be employed in conducting empirical researches in the domain of services marketing. On the basis of an exhaustive review of literature, the present paper attempts to provide a concise and schematic portrayal of generally followed data analysis techniques in the field of services quality literature. Collectively, the extant literature suggests that there is a growing trend among re...

  19. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination on the basis of available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plans. With this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for revealing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio, but their disadvantages are completeness and complexity.
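The index-based selection described above can be sketched as a weighted scoring exercise; the index names come from the abstract, while the weights and ratings below are invented placeholders, not values from the study:

```python
# Illustrative sketch of the index-based ranking the abstract describes.
# The indexes are taken from the text; the weights and 1-5 ratings are
# invented placeholders (higher rating = better on that index).
INDEXES = ["dynamic capability", "completeness", "achievability",
           "detail", "signal/noise ratio", "complexity", "implementation cost"]

# Hypothetical ratings for two candidate SSA combinations.
combos = {
    "PHA+FMEA+FTA+Markov": [2, 4, 2, 4, 3, 2, 3],
    "DFM+simulation-based": [5, 3, 4, 4, 4, 2, 2],
}
weights = [0.2, 0.2, 0.15, 0.15, 0.15, 0.075, 0.075]  # analyst-chosen

def score(ratings):
    """Weighted sum of per-index ratings (higher is better)."""
    return sum(w * r for w, r in zip(weights, ratings))

ranked = sorted(combos, key=lambda c: score(combos[c]), reverse=True)
print(ranked[0])
```

An analyst would choose weights to reflect project priorities (e.g., weighting dynamic capability heavily for a highly interactive digital I and C design) and rank candidate combinations accordingly.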

  20. The San Pedro Mártir Open Cluster Survey: Progress, Techniques, Preliminary Results

    Science.gov (United States)

    Schuster, W.; Michel, R.; Dias, W.; Tapia-Peralta, T.; Vázquez, R.; Macfarland, J.; Chavarría, C.; Santos, C.; Moitinho, A.

    2007-05-01

    A CCD UBVRI survey of northern open clusters is being undertaken at San Pedro Mártir, Mexico, always performed using the same instrumental setup (telescope, CCD, filters), reduction methods, and system of standards (Landolt). To date more than 300 clusters (mostly unstudied previously) have been observed, and about half the data reduced using aperture-photometry and PSF techniques. Our analysis procedures are being refined by studying in detail a small subset of these clusters. For example, the heavily reddened clusters Be80 and Be95 are being examined in the color-color diagrams (B-V,U-B) and (B-V,R-I) to better understand the problems of curvature and variable reddening. For clusters for which our U data reach the F-type stars, such as NGC2192 and NGC7296, techniques are being examined for estimating both the reddening E(B-V) and the metallicity [Fe/H] via the (U-B) excess. If the clusters also have "red clump" stars, such as NGC1798 and Do02, these procedures can be iterated between the clump and main-sequence stars to establish even better values of E(B-V) and [Fe/H]. Finally, color-magnitude diagrams, such as (B-V,V) and (V-I,V), are being employed together with the Schmidt-Kaler colors and Padova isochrones to obtain distances and ages for these clusters. A Java-based computer program is being developed to help in the visualization and analysis of these photometric data. This system is capable of displaying each cluster simultaneously in different color-color and color-magnitude diagrams and has an interactive way to identify a star, or group of stars, in one diagram and to see where it falls in the other diagrams, facilitating the elimination of field stars and the recognition of cluster features. This program is capable of displaying up to 16 different diagrams for one cluster and processing up to 20 clusters at the same time. Our aims are the following: (1) a common UBVRI photometric scale for open clusters, (2) an atlas of color
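For the (U-B)-excess step, one classical technique (assumed here purely for illustration; the survey's own calibration may well differ) is the Johnson Q method for early-type stars, where Q is reddening-free under a standard extinction law:

```python
# Classical (U-B)-excess sketch (illustrative only, not the survey's own
# calibration): the Johnson Q parameter for O/B stars.
def johnson_q(b_v, u_b, slope=0.72):
    """Reddening-free index Q = (U-B) - slope*(B-V); the slope is the
    canonical E(U-B)/E(B-V) ratio of the standard extinction law."""
    return u_b - slope * b_v

def color_excess(b_v, u_b):
    """E(B-V) for an early-type star via the empirical relation
    (B-V)0 = 0.332*Q (valid only for O/B stars)."""
    q = johnson_q(b_v, u_b)
    return b_v - 0.332 * q

# A star observed at (B-V)=0.20, (U-B)=-0.45:
print(round(color_excess(0.20, -0.45), 3))  # → 0.397
```

Because Q is unchanged by reddening, sliding an observed early-type star back along the reddening vector in the (B-V, U-B) plane gives E(B-V) directly, which is the starting point for the iterative clump/main-sequence refinement the abstract describes.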

  1. Hyphenated techniques and their applications in natural products analysis.

    Science.gov (United States)

    Sarker, Satyajit D; Nahar, Lutfun

    2012-01-01

    A hyphenated technique is one in which a separation technique is coupled with an online spectroscopic detection technology, e.g., GC-MS, LC-PDA, LC-MS, LC-FTIR, LC-NMR, LC-NMR-MS, and CE-MS. Recent advances in hyphenated analytical techniques have remarkably widened their applications to the analysis of complex biomaterials, especially natural products. This chapter focuses on the applications of hyphenated techniques to pre-isolation and isolation of natural products, dereplication, online partial identification of compounds, chemotaxonomic studies, chemical fingerprinting, quality control of herbal products, and metabolomic studies, and presents specific examples. Particular emphasis is given to hyphenated techniques that involve LC as the separation tool. PMID:22367902

  2. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks, and eliminating weaknesses.

  3. Basic Sequence Analysis Techniques for Use with Audit Trail Data

    Science.gov (United States)

    Judd, Terry; Kennedy, Gregor

    2008-01-01

    Audit trail analysis can provide valuable insights to researchers and evaluators interested in comparing and contrasting designers' expectations of use and students' actual patterns of use of educational technology environments (ETEs). Sequence analysis techniques are particularly effective but have been neglected to some extent because of real…

  4. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks, and eliminating weaknesses.

  5. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté; its execution demands a high level of technique from the performer during rotation. Performing this element requires not only good physical condition but also mastery of correct technique by the dancer. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. The analysis employed the method of stereoscopic images together with theoretical analysis.

  6. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes

    Science.gov (United States)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  7. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses the fuzzy multiple regression analysis techniques with fuzzy concepts to manage the software risks in a software project and mitigating risk with software process improvement. Top ten software risk factors in analysis phase and thirty risk management techni...

  8. Diffusion tensor analysis of corpus callosum in progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Shoichi; Makino, Takahiro; Shirai, Wakako; Hattori, Takamichi [Department of Neurology, Graduate School of Medicine, Chiba University (Japan)

    2008-11-15

    Progressive supranuclear palsy (PSP) is a neurodegenerative disease featuring parkinsonism, supranuclear ophthalmoplegia, dysphagia, and frontal lobe dysfunction. The corpus callosum, which consists of many commissure fibers, probably reflects cerebral cortical function. Several previous reports showed atrophy or diffusion abnormalities of the anterior corpus callosum in PSP patients, but the partitioning method used in those studies was based on data obtained from nonhuman primates. In this study, we performed a diffusion tensor analysis using a new partitioning method for the human corpus callosum. Seven consecutive patients with PSP were compared with 29 age-matched patients with Parkinson's disease (PD) and 19 age-matched healthy control subjects. All subjects underwent diffusion tensor magnetic resonance imaging, and the corpus callosum was partitioned into five areas on the mid-sagittal plane according to a recently established topography of the human corpus callosum (CC1-prefrontal area, CC2-premotor and supplementary motor area, CC3-motor area, CC4-sensory area, CC5-parietal, temporal, and occipital area). Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) were measured in each area and differences between groups were analyzed. In the PSP group, FA values were significantly decreased in CC1 and CC2, and ADC values were significantly increased in CC1 and CC2. Receiver operating characteristic analysis showed excellent reliability of FA and ADC analyses of CC1 for differentiating PSP from PD. The anterior corpus callosum corresponding to the prefrontal, premotor, and supplementary motor cortices is affected in PSP patients. This analysis can be an additional test for further confirmation of the diagnosis of PSP.
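The two scalar measures used in the study have standard definitions in terms of the three eigenvalues of the diffusion tensor; a minimal sketch with invented eigenvalues (the definitions are textbook, the values are not from the paper):

```python
import numpy as np

# Standard scalar maps from the three diffusion-tensor eigenvalues.
def adc(evals):
    """Apparent diffusion coefficient = mean diffusivity (l1 + l2 + l3)/3."""
    return float(np.mean(evals))

def fractional_anisotropy(evals):
    """FA = sqrt(3/2) * ||lambda - mean|| / ||lambda||, bounded in [0, 1]."""
    ev = np.asarray(evals, dtype=float)
    md = ev.mean()
    return float(np.sqrt(1.5 * np.sum((ev - md) ** 2) / np.sum(ev ** 2)))

# White-matter-like eigenvalues (units of 1e-3 mm^2/s), purely illustrative:
evals = [1.7, 0.3, 0.2]
print(round(fractional_anisotropy(evals), 3), round(adc(evals), 3))  # → 0.836 0.733
```

Demyelination or axonal loss in the anterior callosum drives the eigenvalues toward isotropy, which lowers FA and raises ADC, exactly the CC1/CC2 pattern reported for the PSP group.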

  9. Diffusion tensor analysis of corpus callosum in progressive supranuclear palsy

    International Nuclear Information System (INIS)

    Progressive supranuclear palsy (PSP) is a neurodegenerative disease featuring parkinsonism, supranuclear ophthalmoplegia, dysphagia, and frontal lobe dysfunction. The corpus callosum, which consists of many commissure fibers, probably reflects cerebral cortical function. Several previous reports showed atrophy or diffusion abnormalities of the anterior corpus callosum in PSP patients, but the partitioning method used in those studies was based on data obtained from nonhuman primates. In this study, we performed a diffusion tensor analysis using a new partitioning method for the human corpus callosum. Seven consecutive patients with PSP were compared with 29 age-matched patients with Parkinson's disease (PD) and 19 age-matched healthy control subjects. All subjects underwent diffusion tensor magnetic resonance imaging, and the corpus callosum was partitioned into five areas on the mid-sagittal plane according to a recently established topography of the human corpus callosum (CC1-prefrontal area, CC2-premotor and supplementary motor area, CC3-motor area, CC4-sensory area, CC5-parietal, temporal, and occipital area). Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) were measured in each area and differences between groups were analyzed. In the PSP group, FA values were significantly decreased in CC1 and CC2, and ADC values were significantly increased in CC1 and CC2. Receiver operating characteristic analysis showed excellent reliability of FA and ADC analyses of CC1 for differentiating PSP from PD. The anterior corpus callosum corresponding to the prefrontal, premotor, and supplementary motor cortices is affected in PSP patients. This analysis can be an additional test for further confirmation of the diagnosis of PSP

  10. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  11. Design, data analysis and sampling techniques for clinical research

    OpenAIRE

    Karthik Suresh; Thomas, Sanjeev V.; Geetha Suresh

    2011-01-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inferences from their data. Improper application of study design and data analysis may yield insufficient and improper results and conclusions. Converting a medical problem into a statistical hypothesis with an appropriate methodological and logical design, and then back-translating the statistical results into relevant medical knowledge, is a real challenge. This article explains...

  12. Microarray Analysis Techniques Singular Value Decomposition and Principal Component Analysis

    CERN Document Server

    Wall, M E; Rocha, L M; Wall, Michael E.; Rechtsteiner, Andreas; Rocha, Luis M.

    2002-01-01

    This chapter describes gene expression analysis by Singular Value Decomposition (SVD), emphasizing initial characterization of the data. We describe SVD methods for visualization of gene expression data, representation of the data using a smaller number of variables, and detection of patterns in noisy gene expression data. In addition, we describe the precise relation between SVD analysis and Principal Component Analysis (PCA) when PCA is calculated using the covariance matrix, enabling our descriptions to apply equally well to either method. Our aim is to provide definitions, interpretations, examples, and references that will serve as resources for understanding and extending the application of SVD and PCA to gene expression analysis.
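The precise SVD/PCA relation the chapter refers to can be checked numerically in a few lines (toy data; the claim is that covariance-based PCA eigenvalues equal the squared singular values of the centered data divided by n-1, with matching eigenvectors up to sign):

```python
import numpy as np

# Toy "expression matrix": 20 samples x 5 variables (invented data).
rng = np.random.default_rng(3)
X = rng.standard_normal((20, 5)) @ rng.standard_normal((5, 5))
Xc = X - X.mean(axis=0)                 # center each variable

# PCA route: eigendecomposition of the covariance matrix.
cov = Xc.T @ Xc / (Xc.shape[0] - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # ascending eigenvalue order

# SVD route: right singular vectors of the centered data matrix.
u, s, vt = np.linalg.svd(Xc, full_matrices=False)

# Squared singular values / (n-1) equal the covariance eigenvalues, and the
# leading right singular vector matches the top eigenvector up to sign.
print(np.allclose(s**2 / (Xc.shape[0] - 1), eigvals[::-1]))
top_match = min(np.linalg.norm(vt[0] - eigvecs[:, -1]),
                np.linalg.norm(vt[0] + eigvecs[:, -1]))
print(top_match)
```

This equivalence is why the chapter's SVD-based descriptions carry over verbatim to covariance-matrix PCA.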

  13. The analysis of training needs: Methods and techniques

    OpenAIRE

    V. Carbone

    2002-01-01

    After placing the phenomenon within its conceptual framework, the following section presents the development of policies to anticipate training needs in European countries. The next stage consists in interpreting needs analysis as a planning tool to aid those involved in training, and it is with reference to this definition that the most widely used techniques for TNA are addressed, examining the factors that determine the choice of the proper technique. Subsequently, attention will be...

  14. Targeting Ion Beam Analysis techniques for gold artefacts

    OpenAIRE

    Demortier, Guy

    2012-01-01

    The present study discusses the best experimental conditions for the quantitative analysis of gold jewellery artefacts by ion beam techniques (PIXE, RBS, PIGE and NRA). Special attention is given to the detection of enhancement or depletion below the surface, down to 10 microns, without any sampling or destruction. PIXE is certainly the most interesting technique for this purpose and the optimal geometrical arrangement of the experiment is described: orientation of the incident beam relative ...

  15. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on and a brief review of other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  16. Learning Progressions and Teaching Sequences: A Review and Analysis

    Science.gov (United States)

    Duschl, Richard; Maeng, Seungho; Sezen, Asli

    2011-01-01

    Our paper is an analytical review of the design, development and reporting of learning progressions and teaching sequences. Research questions are: (1) what criteria are being used to propose a "hypothetical learning progression/trajectory" and (2) what measurements/evidence are being used to empirically define and refine a "hypothetical learning…

  17. An integrated technique for the analysis of skin bite marks.

    Science.gov (United States)

    Bernitz, Herman; Owen, Johanna H; van Heerden, Willie F P; Solheim, Tore

    2008-01-01

    The high number of murder, rape, and child abuse cases in South Africa has led to increased numbers of bite mark cases being heard in high courts. Objective analysis to match perpetrators to bite marks at crime scenes must be able to withstand vigorous cross-examination to be of value in conviction of perpetrators. An analysis technique is described in four stages, namely determination of the mark to be a human bite mark, pattern association analysis, metric analysis and comparison with the population data, and illustrated by a real case study. New and accepted techniques are combined to determine the likelihood ratio of guilt expressed as one of a range of conclusions described in the paper. Each stage of the analysis adds to the confirmation (or rejection) of concordance between the dental features present on the victim and the dentition of the suspect. The results illustrate identification to a high degree of certainty. PMID:18279256
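The likelihood-ratio reasoning described can be sketched generically (the paper's own conclusion scale is not reproduced here; independence between dental features is an assumption made purely for illustration):

```python
# Generic sketch of how per-feature likelihood ratios combine into an overall
# likelihood ratio of concordance. Independence between dental features is an
# illustrative assumption; the per-feature LR values are invented.
def combined_lr(feature_lrs):
    """Product of per-feature LRs:
    P(evidence | same dentition) / P(evidence | different dentition)."""
    total = 1.0
    for lr in feature_lrs:
        total *= lr
    return total

def posterior_odds(prior_odds, feature_lrs):
    """Bayes' rule in odds form: posterior odds = prior odds x combined LR."""
    return prior_odds * combined_lr(feature_lrs)

# Three concordant features, each 4x more probable if the suspect made the mark:
print(combined_lr([4.0, 4.0, 4.0]))  # → 64.0
```

Each analysis stage in the paper (pattern association, metric comparison against population data) contributes evidence of this kind, and the accumulated ratio is then mapped onto the verbal conclusion scale.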

  18. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting product and the mold at the same time, considering the thermal contraction of the casting and the thermal expansion of the mold. An analysis considering contact between the casting and the mold enables precise prediction of the stress distribution and of defects such as hot tearing. However, it is difficult to generate an FEM mesh for the interface of the casting and the mold. Moreover, the mesh for the mold domain consumes considerable computational time and memory due to the large number of elements. Consequently, we proposed the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to model the contact between the casting and the mold. In general, the volume of the mold is much larger than that of the casting part, so the proposed technique greatly reduces the number of elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and it gave satisfactory results.

  19. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
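The age equation stated in the first sentence is simple enough to write out directly (units below are illustrative):

```python
# The luminescence age equation from the abstract:
# age = accumulated (palaeo)dose / annual dose rate.
def luminescence_age(palaeodose_gy, dose_rate_gy_per_ka):
    """Age in ka from the palaeodose (Gy) and annual dose rate (Gy/ka)."""
    return palaeodose_gy / dose_rate_gy_per_ka

# A sample that accumulated 30 Gy at 2.5 Gy/ka is 12 ka old:
print(luminescence_age(30.0, 2.5))  # → 12.0
```

The nuclear analyses discussed in the record (minor and trace element concentrations of U, Th, and K) feed the denominator, which is why radioactive disequilibrium at a site directly biases the derived age.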

  20. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  1. Dynamic analysis of large structures by modal synthesis techniques.

    Science.gov (United States)

    Hurty, W. C.; Hart, G. C.; Collins, J. D.

    1971-01-01

    Several criteria that may be used to evaluate the merits of some of the existing techniques for the dynamic analysis of large structures which involve division into substructures or components are examined. These techniques make use of component displacement modes to synthesize global systems of generalized coordinates and, for that reason, they have come to be known as modal synthesis or component mode methods. Two techniques have been found to be particularly useful: the modal synthesis method with fixed attachment modes, and the modal synthesis method with free attachment modes. These two methods are treated in detail, and general flow charts are presented for guidance in computer programming.
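    A minimal sketch of the reduction machinery behind these methods: a truncated set of component modes serves as a basis of generalized coordinates, and the stiffness (and mass) matrices are projected onto it. This single-component, unit-mass example only stands in for the full fixed/free attachment-mode formulations described above.

```python
import numpy as np

def chain_stiffness(n, k=1.0):
    """Stiffness matrix of n unit masses joined by springs, fixed at both ends."""
    return 2 * k * np.eye(n) - k * (np.eye(n, k=1) + np.eye(n, k=-1))

def reduced_frequencies(K, n_modes):
    """Project K onto the lowest n_modes eigenvectors (the generalized
    coordinates); with unit masses the mass matrix projects to identity."""
    _, phi = np.linalg.eigh(K)
    T = phi[:, :n_modes]          # truncated modal basis
    K_red = T.T @ K @ T           # reduced stiffness matrix
    return np.sqrt(np.linalg.eigvalsh(K_red))

K = chain_stiffness(10)
full_low = np.sqrt(np.linalg.eigvalsh(K))[:3]  # lowest 3 frequencies, full model
reduced = reduced_frequencies(K, 3)            # same frequencies from 3 coordinates
```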

  2. IAEA progress report II - Study of archeological objects using PIXE analytical technique

    International Nuclear Information System (INIS)

    This is the second IAEA progress report, covering the period 2006-2007 (CRP number F23023). After adopting the PIXE one-run measurement using the Al funny filter as X-ray absorber, which was described in the first progress report, two studies on ceramics were undertaken in order to characterize them on the basis of their chemical composition. The first concerned the characterization of 38 sherds from the locality of Ch'him (south of Beirut) that could support future studies of ceramic provenance; these samples are considered reference materials since they come from the kiln and workshop of the excavated site. The second study, detailed in the current report, concerned excavated pottery from Beirut suspected to belong to a North Syrian production. (author)

  3. Instrumental neutron activation analysis of archaeological ceramics: Progress and challenges

    International Nuclear Information System (INIS)

    Instrumental neutron activation analysis (INAA) has become a widely used technique in the characterization of archaeological ceramics. Early applications, primarily developmental in nature, have been replaced by studies that attempt to place the derived analytical data in a larger archaeological and behavioural context. If archaeology is to increasingly exploit the great potential of INAA to address a broad range of social and cultural issues involving ceramics, greater attention must be paid to the various sources of variation that contribute to the chemical composition of pottery. In addition, as all research takes place within a social context, efforts must be made to increase the communication between archaeologists and other scientists, as this will contribute to the fulfilment of the archaeological research objectives. (author)

  4. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, such as polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also discussed, along with other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and contaminants including pesticides and antibiotics. The possibility of applying CE in food control laboratories, where the composition of food and food products is analysed, is of great importance. The CE technique may be used to control technological processes in the food industry and to identify the numerous compounds present in food. Owing to its numerous advantages, the CE technique is successfully used in routine food analysis.

  5. Characterisation of solar cells by ion beam analysis techniques

    International Nuclear Information System (INIS)

    Several ion beam analysis techniques were applied to the characterisation of amorphous (a-Si) and polycrystalline silicon solar cells. The thickness and composition of thin layers in thin-film a-Si cells were analysed by RBS (Rutherford backscattering) using a 5 MeV Li beam and by ERDA (elastic recoil detection analysis) using a 12 MeV C beam. The nuclear microprobe technique IBIC (ion beam induced charge) was used to image the charge collection efficiency of EFG (edge-defined film-fed grown) silicon in an attempt to correlate charge loss with the spatial distribution of structural defects in the material. (author)

  6. Microanalysis of dissolved gases by the gas chromatography technique

    International Nuclear Information System (INIS)

    A technique which allows the quantitative analysis of small concentrations of dissolved gases such as CO2 and H2, in the order of 10⁻⁶-10⁻³ M, is discussed. A Toepler pump coupled in tandem to a gas chromatograph was used for the extraction, separation and quantification. The method can also be applied to the analysis of other gases such as CO, CH4 and CH3-CH3. The technique may find application in fields such as radiation chemistry, oceanography and environmental studies. (author)

  7. Recent progress on HYSPEC, and its polarization analysis capabilities

    Directory of Open Access Journals (Sweden)

    Winn Barry

    2015-01-01

    HYSPEC is a high-intensity, direct-geometry time-of-flight spectrometer at the Spallation Neutron Source, optimized for the measurement of excitations in small single-crystal specimens, with optional polarization analysis capabilities. The incident neutron beam is monochromated by a Fermi chopper with short, straight blades, and is then vertically focused by Bragg scattering onto the sample position by either a highly oriented pyrolytic graphite (unpolarized) or a Heusler (polarized) crystal array. Neutrons are detected by a bank of 3He tubes that can be positioned over a wide range of scattering angles about the sample axis. HYSPEC entered the user program in February 2013 for unpolarized experiments and already supports a vibrant research program. Polarization analysis will be accomplished by using the Heusler crystal array to polarize the incident beam, and either a 3He spin filter or a supermirror wide-angle polarization analyser to analyse the scattered beam. The 3He spin filter employs the spin-exchange optical pumping technique. A 60∘ wide-angle 3He cell that matches the detector coverage will be used for polarization analysis; the polarized gas in this post-sample cell is designed to be periodically and automatically refreshed with an adjustable pressure of polarized gas, optically pumped in a separate cell and then transferred to the wide-angle cell. The supermirror analyser has 960 supermirror polarizers distributed over 60∘ and has been characterized at the Swiss Spallation Neutron Source. The current status of the instrument and the development of its polarization analysis capabilities are presented.

  8. Analysis of progression of severe accident in Indian PHWRs

    International Nuclear Information System (INIS)

    In India a wide variety of nuclear reactors are in operation and in different stages of construction. The mainstay of the Indian nuclear power programme today is the pressurised heavy water reactor (PHWR). There are 13 operating PHWRs and several others in different stages of construction, each of either 220 MWe or 540 MWe capacity. For authorization, the Atomic Energy Regulatory Board requires safety analysis reports, which consist of detailed analyses of all design basis accidents; in addition, severe accident analysis is needed for the accident management programme. This paper describes an analysis of a severe accident in a 220 MWe Indian PHWR caused by a loss of coolant accident (LOCA) coincident with loss of the emergency core cooling system and loss of the moderator heat sink. Initially, within about 60 seconds, most of the coolant blows out of the primary heat transport system and the reactor is tripped, but in the absence of emergency core cooling, heat removal from the fuel bundles is very poor and the bundles begin to heat up. The only mode of heat transfer is radiative, from fuel bundle to pressure tube and from pressure tube to calandria tube, followed by convective heat transfer from the calandria tube to the relatively cold moderator in which the reactor channels are immersed. With the moderator heat sink unavailable, the moderator heats up and eventually boils, and the moderator level in the calandria slowly falls. Soon the channels are uncovered and the temperatures of the pressure tube and calandria tube shoot up. Since the structural elements of the reactor channel are made of zircaloy, whose mechanical properties deteriorate rapidly with temperature, the channel gives way under the weight of the fuel and falls into the remaining moderator. This process continues till all the moderator is evaporated, leading to damage to the entire
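    The fuel-to-pressure-tube radiation step described above follows the Stefan-Boltzmann law; a minimal sketch with illustrative temperatures and a lumped effective emissivity (not plant data):

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiative_flux(t_hot_k, t_cold_k, effective_emissivity):
    """Net radiative heat flux (W/m^2) between two surfaces, with geometry
    and surface properties lumped into one effective emissivity."""
    return effective_emissivity * SIGMA * (t_hot_k ** 4 - t_cold_k ** 4)

# E.g. a 1000 K surface facing a 500 K surface, effective emissivity 1:
flux = radiative_flux(1000.0, 500.0, 1.0)
```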

  9. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, "Windows Forensic Analysis Toolkit" has been updated by Harlan Carvey to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline analysis, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online, consisting of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition" (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems, contains Lessons from the Field, Case Studies, and War Stories, and features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  10. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies and organizations with diverse technical backgrounds was formed to analyze the design and recommend improvements. The recommendations, if incorporated, yield a 38% system-wide savings and a shipping container that is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas in both the government and the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it? What does it do? What does it cost? What else will do the task? What would that cost? Using logic and a disciplined approach, Value Analysis arrives at a design that performs the necessary functions at high quality and the lowest overall cost.

  11. No evidence of real progress in treatment of acute pain, 1993–2012: scientometric analysis

    Directory of Open Access Journals (Sweden)

    Correll DJ

    2014-04-01

    Darin J Correll, Kamen V Vlassakov, Igor Kissin Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA Abstract: Over the past 2 decades, many new techniques and drugs for the treatment of acute pain have achieved widespread use. The main aim of this study was to assess the progress in their implementation using scientometric analysis. The following scientometric indices were used: 1) popularity index, representing the share of articles on a specific technique (or drug) relative to all articles in the field of acute pain; 2) index of change, representing the degree of growth in publications on a topic compared to the previous period; and 3) index of expectations, representing the ratio of the number of articles on a topic in the top 20 journals relative to the number of articles in all (>5,000) biomedical journals covered by PubMed. Publications on specific topics (ten techniques and 21 drugs) were assessed during four time periods (1993–1997, 1998–2002, 2003–2007, and 2008–2012). In addition, to determine whether the status of routine acute pain management has improved over the past 20 years, we analyzed surveys designed to be representative of the national population that reflected direct responses of patients reporting pain scores. By the 2008–2012 period, the popularity index had reached a substantial level (≥5%) only with techniques or drugs introduced 30–50 years ago or more (epidural analgesia, patient-controlled analgesia, nerve blocks, epidural analgesia for labor or delivery, bupivacaine, and acetaminophen). In 2008–2012, promising (although modest) changes in the index of change and index of expectations were found only with dexamethasone. Six national surveys conducted over the past 20 years demonstrated an unacceptably high percentage of patients experiencing moderate or severe pain, with not even a trend toward outcome improvement. Thus
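    The three indices defined above can be computed directly from publication counts; a sketch with hypothetical counts (the percent scaling is an assumption, the definitions follow the abstract):

```python
def popularity_index(topic_articles, field_articles):
    """Share (%) of articles on a topic among all acute-pain articles."""
    return 100.0 * topic_articles / field_articles

def index_of_change(current_period, previous_period):
    """Growth (%) in publications on a topic vs. the previous period."""
    return 100.0 * (current_period - previous_period) / previous_period

def index_of_expectations(top20_articles, all_journal_articles):
    """Ratio (%) of articles in the top 20 journals to those in all journals."""
    return 100.0 * top20_articles / all_journal_articles

# Hypothetical counts for one technique in one 5-year window:
pop = popularity_index(50, 1000)      # just meets the >=5% threshold
chg = index_of_change(120, 100)
exp = index_of_expectations(12, 300)
```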

  12. Visual and statistical analysis of {sup 18}F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. Ten raters analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA were high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)
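    Fleiss' kappa, used above to quantify inter-rater concordance, can be computed from a subjects-by-categories count table; a self-contained sketch (toy ratings, not the study data):

```python
def fleiss_kappa(ratings):
    """Fleiss' kappa for a table where ratings[i][j] is the number of raters
    assigning subject i to category j (same number of raters per subject)."""
    n_subjects = len(ratings)
    n_raters = sum(ratings[0])
    total = n_subjects * n_raters
    # Mean observed agreement over subjects
    p_bar = sum(
        (sum(c * c for c in row) - n_raters) / (n_raters * (n_raters - 1))
        for row in ratings
    ) / n_subjects
    # Chance agreement from overall category proportions
    p_e = sum((sum(row[j] for row in ratings) / total) ** 2
              for j in range(len(ratings[0])))
    return (p_bar - p_e) / (1.0 - p_e)

# Three raters, two categories: perfect agreement gives kappa = 1.
kappa_perfect = fleiss_kappa([[3, 0], [0, 3], [3, 0]])
```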

  13. Data analysis techniques for nuclear and particle physicists

    CERN Document Server

    Pruneau, Claude

    2016-01-01

    This is an advanced data analysis textbook for scientists specializing in the areas of particle physics, nuclear physics, and related subfields. As a practical guide for robust, comprehensive data analysis, it focuses on realistic techniques to explain instrumental effects. The topics are relevant for engineers, scientists, and astroscientists working in the fields of geophysics, chemistry, and the physical sciences. The book serves as a reference for more senior scientists while being eminently accessible to advanced undergraduate and graduate students.

  14. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for a nuclear power plant's safety-critical networks. A comprehensive survey of current methodologies for communication network reliability is necessary. The major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying the reliability of communication networks, and preliminary models for assessing the reliability of safety-critical communication networks.
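    One standard quantity such algorithms target is two-terminal reliability: the probability that two nodes remain connected when each link fails independently. A brute-force enumeration sketch, tractable only for small networks (the topology and failure probability below are hypothetical):

```python
from itertools import product

def two_terminal_reliability(nodes, edges, s, t, p_up):
    """Probability that s and t stay connected when each edge works
    independently with probability p_up (exhaustive state enumeration)."""
    reliability = 0.0
    for state in product([True, False], repeat=len(edges)):
        prob = 1.0
        adj = {n: [] for n in nodes}
        for (u, v), up in zip(edges, state):
            prob *= p_up if up else 1.0 - p_up
            if up:
                adj[u].append(v)
                adj[v].append(u)
        seen, stack = {s}, [s]          # depth-first search from s
        while stack:
            for m in adj[stack.pop()]:
                if m not in seen:
                    seen.add(m)
                    stack.append(m)
        if t in seen:
            reliability += prob
    return reliability

# Two parallel links, each 90% reliable: 1 - 0.1**2 = 0.99
r_parallel = two_terminal_reliability([0, 1], [(0, 1), (0, 1)], 0, 1, 0.9)
```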

  15. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
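    The effect described above can be illustrated with the classical Engesser-type correction for a shear-deformable column, in which the Euler load is reduced by the transverse shear stiffness kGA. This is a generic column formula offered only to show why neglecting shear inflates buckling predictions; it is not the fluted-core cylinder analysis itself.

```python
import math

def euler_load(E, I, L):
    """Euler buckling load of a pinned-pinned column (no shear deformation)."""
    return math.pi ** 2 * E * I / L ** 2

def shear_corrected_load(E, I, L, kGA):
    """Engesser-type correction: finite transverse shear stiffness kGA
    lowers the buckling load below the Euler prediction."""
    p_e = euler_load(E, I, L)
    return p_e / (1.0 + p_e / kGA)

# With unit properties and L = pi, the Euler load is 1; kGA = 1 halves it.
p_euler = euler_load(1.0, 1.0, math.pi)
p_shear = shear_corrected_load(1.0, 1.0, math.pi, 1.0)
```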

  16. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  17. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  18. Detection and analysis of explosives by nuclear techniques

    International Nuclear Information System (INIS)

    In today's global environment of international terrorism, there is a well recognised need for sensitive and specific techniques for the detection and analysis of explosives. Sensitivity is needed for the detection of small amounts of explosives hidden in, or mixed with, post-explosion residues. Specificity is needed in order to avoid response to non-explosive substances. The conventional techniques for the detection of explosives, based on vapour detection either by sniffer dogs or by instrumental vapour detectors, have limitations because of their inability to detect plastic bonded explosives, whose vapour pressures are significantly lower than those of pure explosives. The paper reviews the various nuclear techniques, such as thermal, fast and pulsed fast neutron activation analysis, which are used for the detection of pure explosives, mixtures of explosives, and also aluminized and plastic bonded explosives. (author)

  19. Review of geographic processing techniques applicable to regional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, R.C.

    1988-02-01

    Since the early 1970s regional environmental studies have been carried out at the Oak Ridge National Laboratory using computer-assisted techniques. This paper presents an overview of some of these past experiences and the capabilities developed at the Laboratory for processing, analyzing, and displaying geographic data. A variety of technologies have resulted such as computer cartography, image processing, spatial modeling, computer graphics, data base management, and geographic information systems. These tools have been used in a wide range of spatial applications involving facility siting, transportation routing, coal resource analysis, environmental impacts, terrain modeling, inventory development, demographic studies, water resource analyses, etc. The report discusses a number of topics dealing with geographic data bases and structures, software and processing techniques, hardware systems, models and analysis tools, data acquisition techniques, and graphical display methods. Numerous results from many different applications are shown to aid the reader interested in using geographic information systems for environmental analyses. 15 refs., 64 figs., 2 tabs.

  20. Analysis of Dynamic Road Traffic Congestion Control (DRTCC) Techniques

    Directory of Open Access Journals (Sweden)

    Pardeep Mittal

    2015-10-01

    Dynamic traffic light control at intersections has become one of the most active research areas in the development of intelligent transportation systems (ITS). Owing to the consistent growth in urbanization and traffic congestion, a system is required that can control the timings of traffic lights dynamically, based on accurate measurement of traffic on the road. In this paper, the techniques that have been developed to automate traffic lights are analysed, and their efficacy is evaluated using MATLAB software. After comparison of the artificial intelligence techniques, it is found that the image mosaicking technique is quite effective (in terms of improving moving time and reducing waiting time) for controlling traffic signals to relieve congestion on the road.

  1. Neutron noise analysis techniques in nuclear power reactors

    International Nuclear Information System (INIS)

    The main techniques used in neutron noise analysis of BWR and PWR nuclear reactors are reviewed. Several applications, such as the control of vibrations in both reactor types, the determination of two-phase flow parameters in BWRs, and stability control in BWRs, are discussed in some detail. The paper contains many experimental results obtained by the main author. (author)

  2. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme Dynamics of Structures sponsored by the Danish Technical Research Council. The planned

  3. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    Science.gov (United States)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.
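    A minimal sketch of PCA as used above: centre the data, diagonalize the covariance matrix, and project onto the leading eigenvectors. The abundance vectors below are toy values, not the meteorite data.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project rows of X onto the leading principal axes of the
    mean-centred data (eigenvectors of the covariance matrix)."""
    Xc = X - X.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(eigvals)[::-1][:n_components]
    return Xc @ eigvecs[:, order]

# Two toy "compositional groups"; PC1 should separate them.
X = np.array([[1.0, 2.0, 0.5],
              [1.1, 2.1, 0.4],
              [5.0, 0.2, 3.0],
              [4.9, 0.3, 3.1]])
scores = pca_scores(X)
```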

  4. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.

  5. Role of nuclear analytical techniques in environmental analysis

    International Nuclear Information System (INIS)

    Nuclear analytical techniques play an important role in protecting human health from biological, chemical and radiological hazards in the environment. They are highly useful in areas of environmental science such as air pollution, environmental chemistry, environmental management, environmental toxicology, industrial hygiene, marine pollution and water quality. The socio-economic needs of all nations are closely linked to their industrial activities, with the energy sector as the driving force; these activities must be made sustainable without overtaxing nature's ability to meet present and future needs. At the same time, it should be realized that most industrial production options carry some degree of environmental impact, in the form of increased concentrations in air, soil, vegetation, surface and ground water resources, marine coastal zones, sediment, etc., which can lead to potential health and environmental risks through inhalation and ingestion pathways. In nuclear facilities, most harmful effects have been minimized by control measures such as shielding, engineered safety systems and environmental surveillance, and remain in check owing to the strict implementation of a comprehensive, continuous and stringent regulatory system. Nuclear analytical techniques are based on the utilization of certain properties of the nucleus and on phenomena associated with ionizing radiation; analytical techniques that use nuclear instrumentation are also called nuclear analytical techniques. They are divided into two categories: direct methods, which include beta counting, gamma spectrometry, alpha spectrometry and emanometry; and indirect methods, which include instrumental neutron activation analysis, radiochemical neutron activation analysis, prompt gamma analysis and charged particle activation analysis

  6. Driving forces of change in environmental indicators an analysis based on divisia index decomposition techniques

    CERN Document Server

    González, Paula Fernández; Presno, Mª José

    2014-01-01

    This book addresses several index decomposition analysis methods to assess progress made by EU countries in the last decade in relation to energy and climate change concerns. Several applications of these techniques are carried out in order to decompose changes in both energy and environmental aggregates. In addition to this, a new methodology based on classical spline approximations is introduced, which provides useful mathematical and statistical properties. Once a suitable set of determinant factors has been identified, these decomposition methods allow the researcher to quantify the respec
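    The index decomposition methods the book describes can be illustrated with a minimal additive LMDI-I (logarithmic mean Divisia index) sketch. The two-factor split of an emissions aggregate into an activity effect and an intensity effect, with entirely hypothetical numbers, looks like this:

```python
from math import log

def logmean(a, b):
    """Logarithmic mean L(a, b) = (a - b) / ln(a / b), with L(a, a) = a."""
    return a if a == b else (a - b) / log(a / b)

# Hypothetical data: emissions V = Q * I (activity Q times emission intensity I)
Q0, I0 = 100.0, 0.50   # base year
Q1, I1 = 120.0, 0.45   # final year
V0, V1 = Q0 * I0, Q1 * I1

w = logmean(V1, V0)                  # LMDI weight
activity_effect = w * log(Q1 / Q0)   # contribution of activity growth
intensity_effect = w * log(I1 / I0)  # contribution of intensity improvement

# The additive LMDI-I decomposition is exact: the effects sum to the total change
assert abs((activity_effect + intensity_effect) - (V1 - V0)) < 1e-9
```

The exactness of the residual-free decomposition is what distinguishes LMDI from older Laspeyres-type index methods.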

  7. Profile likelihood ratio analysis techniques for rare event signals

    CERN Document Server

    Billard, J

    2013-01-01

    The Cryogenic Dark Matter Search (CDMS) II uses crystals operated at millikelvin temperatures to search for dark matter. We present the details of the profile likelihood analysis of the 140.2 kg-day exposure from the final data set of the CDMS II Si detectors, which revealed three WIMP-candidate events. We found that this result favors a WIMP+background hypothesis over the known-background-only hypothesis at the 99.81% confidence level. This paper describes the profile likelihood analysis of the CDMS II Si data and discusses such analysis techniques in the scope of rare event searches.
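    The likelihood-ratio machinery behind such results can be sketched in miniature. The following is not the CDMS analysis (which profiles over detector nuisance parameters) but a toy counting experiment: a Poisson likelihood with a known expected background, a fitted signal strength, and a significance from Wilks' theorem; all numbers are hypothetical.

```python
from scipy.stats import poisson, chi2
from scipy.optimize import minimize_scalar

n_obs = 7      # hypothetical observed counts
b = 2.0        # known expected background

def nll(s):
    """Negative log-likelihood for a non-negative signal strength s."""
    return -poisson.logpmf(n_obs, b + max(s, 0.0))

# Profile the signal strength, then compare to the background-only point s = 0
fit = minimize_scalar(nll, bounds=(0.0, 50.0), method="bounded")
q0 = 2.0 * (nll(0.0) - fit.fun)       # likelihood-ratio test statistic

# Wilks: q0 ~ chi2(1) under the background-only hypothesis (asymptotically);
# the factor 1/2 is the usual one-sided convention for a bounded signal
p_value = 0.5 * chi2.sf(q0, df=1)
confidence = 1.0 - p_value
```

In a real rare-event search the asymptotic chi-squared approximation is often replaced by toy Monte Carlo, since the event counts are small.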

  8. Investigation of total reflection X-ray fluorescence analysis technique

    International Nuclear Information System (INIS)

    Total-reflection X-ray fluorescence spectrometry (TRXF) is known for its high sensitivity, down to the pg level or sub-ppb level, respectively. The spectrometry is therefore considered a most competitive tool in trace element analysis. The TRXF technique was investigated in the laboratory, but with a small isotopic X/γ source chosen as the exciting source instead of the usual X-ray tube. The preliminary experiments showed that the total-reflection condition can be established and that the analytical sensitivity of TRXF is higher than that of normal X-ray analysis

  9. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items... used in this thesis are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, dairies, fruits...

  10. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses the TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the detector readings; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analysing the TLD response were investigated and compared for dose values in this interval. These techniques include thermal pre-treatment, and different kinds of glow-curve analysis methods were investigated. The results showed the necessity of developing specific software that permits automatic background subtraction from the glow curve of each dosemeter. This software was developed and is being tested. Preliminary results show that the software increases the response reproducibility. (author)

  11. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
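    The advantage of smoothing-based sensitivity measures over linear regression can be shown with a minimal numpy sketch of locally weighted (LOESS-style) regression. This is an illustration in the spirit of the procedures described, not the stepwise implementation of the paper: the output depends on one input through a pure quadratic, so its linear correlation is near zero even though the input dominates the output.

```python
import numpy as np

def loess(x, y, frac=0.2):
    """Simple LOESS: locally weighted linear regression with tricube weights."""
    n = len(x)
    k = max(int(frac * n), 2)
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]               # k nearest neighbours of x[i]
        h = d[idx].max()
        w = (1 - (d[idx] / h) ** 3) ** 3      # tricube weights
        a, b = np.polyfit(x[idx], y[idx], 1, w=np.sqrt(w))
        fitted[i] = a * x[i] + b
    return fitted

def smoothed_r2(x, y):
    """Fraction of output variance explained by a LOESS smooth on one input."""
    fit = loess(x, y)
    return 1.0 - np.var(y - fit) / np.var(y)

rng = np.random.default_rng(0)
n = 500
x1 = rng.uniform(-1, 1, n)                # influential input (nonlinear effect)
x2 = rng.uniform(-1, 1, n)                # non-influential input
y = x1 ** 2 + 0.05 * rng.normal(size=n)   # zero *linear* correlation with x1

s1 = smoothed_r2(x1, y)   # large: x1 drives y, albeit nonlinearly
s2 = smoothed_r2(x2, y)   # near zero
```

A rank or linear regression would score both inputs near zero here; the smoothed measure recovers the true importance ordering.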

  12. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    Science.gov (United States)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation of protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. The analysis of these phosphorylation cascades will provide new insights into their roles in many biological processes. Unfortunately, the existing methods are of limited use for analyzing cascade activity. We therefore suggest a microfluidic isoelectric focusing technique (μIEF) for the analysis of cascade activity. Using this technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. In addition, an inhibition assay for kinase activity and the analysis of a real sample have also been conducted. The results indicate that μIEF is an excellent means for studies of phosphorylation cascade activity.

  13. Research progress on the brewing techniques of new-type rice wine.

    Science.gov (United States)

    Jiao, Aiquan; Xu, Xueming; Jin, Zhengyu

    2017-01-15

    As a traditional alcoholic beverage, Chinese rice wine (CRW), with its high nutritional value and unique flavor, has been popular in China for thousands of years. Although traditional production methods had been used without change for centuries, numerous technological innovations in recent decades have greatly impacted the CRW industry. However, reviews of the technological progress in this field are relatively few. This article aims to provide a brief summary of recent developments in new brewing technologies for making CRW. Based on a comparison between the conventional methods and the innovative technologies of CRW brewing, three principal aspects are summarized: the innovation of raw material pretreatment, the optimization of fermentation, and the reform of sterilization technology. Furthermore, by comparing the advantages and disadvantages of these methods, various issues related to the prospects of the CRW industry are addressed. PMID:27542505

  14. Reduced Incidence of Slowly Progressive Heymann Nephritis in Rats Immunized With a Modified Vaccination Technique

    Directory of Open Access Journals (Sweden)

    Arpad Z. Barabas

    2006-01-01

    A slowly progressive Heymann nephritis (SPHN) was induced in three groups of rats by weekly injections of a chemically modified renal tubular antigen in an aqueous medium. A control group of rats received the chemically unmodified version of the antigen in an aqueous solution. One group of SPHN rats was pre- and post-treated with weekly injections of modified immune complexes (MICs: immune complexes [ICs] containing sonicated ultracentrifuged [u/c] rat kidney fraction 3 [rKF3] antigen and IgM antibodies specific against the antigen, at slight antigen excess). One group of SPHN rats was post-treated with MICs 3 weeks after the induction of the disease, and one group of SPHN animals received no treatment. The control group of rats received pre- and post-treatment with sonicated u/c rKF3.

  15. Treatment planning of adhesive additive rehabilitations: the progressive wax-up of the three-step technique.

    Science.gov (United States)

    Vailati, Francesca; Carciofo, Sylvain

    2016-01-01

    A full-mouth rehabilitation should be correctly planned from the start by using a diagnostic wax-up to reduce the potential for remakes, increased chair time, and laboratory costs. However, determining the clinical validity of an extensive wax-up can be complicated for clinicians who lack the experience of full-mouth rehabilitations. The three-step technique is a simplified approach that has been developed to facilitate the clinician's task. By following this technique, the diagnostic wax-up is progressively developed to the final outcome through the interaction between patient, clinician, and laboratory technician. This article provides guidelines aimed at helping clinicians and laboratory technicians to become more proactive in the treatment planning of full-mouth rehabilitations, by starting from the three major parameters of incisal edge position, occlusal plane position, and the vertical dimension of occlusion. PMID:27433550

  16. Progress in development of a technique to measure the axial thermal diffusivity of irradiated reactor fuel pellets

    Energy Technology Data Exchange (ETDEWEB)

    Hutcheon, R.; Mouris, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    1997-07-01

    A new technique, based on pulsed high-energy (~12 MeV) electron-beam heating, is being developed for measuring the thermal diffusivity of irradiated reactor fuel. This paper reports on the continuing development work required to establish a practical technique for irradiated materials at high temperatures (1000 to 1500 °C). This includes studies of the influence of thermocouple surface contact resistance, of the sheath and the pellet mounting system, of internal cracks in the pellet, and of the chamber atmosphere. Calibrations with a NIST standard and measurements on fresh UO2 were done. Progress during the past year in these various areas is reviewed, and initial experiments with a specimen of high-burnup CANDU fuel are discussed. (author)
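    For orientation, pulse-heating diffusivity measurements of this general kind reduce, in the idealized adiabatic-slab case, to the classical Parker flash-method relation between sample thickness and the half-rise time of the rear-face temperature. The numbers below are hypothetical, and this textbook estimate omits the contact-resistance and heat-loss corrections the report is concerned with.

```python
# Parker flash-method relation: alpha = 0.1388 * L^2 / t_half, where t_half is
# the time for the rear face to reach half of its maximum temperature rise.
L_m = 0.002      # pellet thickness (m), hypothetical
t_half = 0.35    # half-rise time (s), hypothetical

alpha = 0.1388 * L_m ** 2 / t_half   # thermal diffusivity (m^2/s)
```

The resulting value (order 1e-6 m²/s) is in the range typical of oxide fuel, which is why the half-rise time must be resolved to fractions of a second.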

  17. Nondestructive analysis of oil shales with PGNAA technique

    International Nuclear Information System (INIS)

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge(Li) spectra were processed by an LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K show a generally good linear proportionality of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used, where the response curve was not linear but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improving neutron self-moderation in sample boxes of rich oil shales, as compared to poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by a microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed gradually with larger and larger amounts of water
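    The response-curve step described above, fitting net peak area against known weight percent and then inverting the fit for unknown samples, can be sketched as follows. The calibration points are hypothetical, standing in for the chemically analyzed reference samples.

```python
import numpy as np

# Hypothetical calibration points for one gamma line of one element:
# weight percent (from chemical analysis) vs. net prompt-gamma peak area
wt_pct    = np.array([0.5, 1.0, 2.0, 3.0, 4.0])
peak_area = np.array([210., 405., 830., 1190., 1620.])   # counts

# Linear response curve: area = slope * wt% + intercept
slope, intercept = np.polyfit(wt_pct, peak_area, 1)

def predict_wt_pct(area):
    """Invert the response curve to estimate concentration from a peak area."""
    return (area - intercept) / slope

estimate = predict_wt_pct(1000.0)
```

For hydrogen, where the abstract notes the response is nonlinear, the same inversion would use a fitted curve rather than a straight line.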

  18. Nondestructive analysis of oil shales with PGNAA technique

    Energy Technology Data Exchange (ETDEWEB)

    Maly, J.; Bozorgmanesh, H.

    1984-02-01

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge(Li) spectra were processed by an LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K show a generally good linear proportionality of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used, where the response curve was not linear but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improving neutron self-moderation in sample boxes of rich oil shales, as compared to poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by a microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed gradually with larger and larger amounts of water.

  19. A Review on Clustering and Outlier Analysis Techniques in Datamining

    Directory of Open Access Journals (Sweden)

    S. Koteeswaran

    2012-01-01

    Problem statement: The modern world makes use of physical, biological and social systems more effectively through advanced computerized techniques. A great amount of data is generated by such systems; this leads to a paradigm shift from classical modeling and analyses based on first principles to developing models and the corresponding analyses directly from data. The ability to extract useful hidden knowledge from these data, and to act on that knowledge, is becoming increasingly important in today's competitive world. Approach: The entire process of applying a computer-based methodology, including new techniques, for discovering knowledge from data is called data mining. There are two primary goals in data mining: prediction and classification. The large data sets involved in data mining require clustering and outlier analysis to reduce the data as well as to retain only useful data. Results: This study reviews implementation techniques and recent research on clustering and outlier analysis. Conclusion: The study aims to provide a review of clustering and outlier analysis techniques, and the discussion will guide researchers in improving their research directions.
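    The pairing of clustering with distance-based outlier analysis that the review surveys can be demonstrated in a few lines: cluster the data, then flag points unusually far from their assigned centroid. The data and threshold rule below are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(1)
# Two hypothetical clusters plus one planted far-away outlier
cluster_a = rng.normal(loc=[0, 0], scale=0.3, size=(50, 2))
cluster_b = rng.normal(loc=[5, 5], scale=0.3, size=(50, 2))
outlier = np.array([[20.0, 20.0]])
data = np.vstack([cluster_a, cluster_b, outlier])

# Deterministic k-means: initial centroids taken from one point of each cluster
centroids, labels = kmeans2(data, data[[0, 50]], minit="matrix")

# Distance-based outlier analysis: flag points far from their own centroid
dist = np.linalg.norm(data - centroids[labels], axis=1)
threshold = dist.mean() + 3.0 * dist.std()
outlier_mask = dist > threshold
```

Removing the flagged points and re-clustering is the usual next step, since a single extreme point can noticeably drag a centroid.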

  20. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Academic and business researchers have long debated the most appropriate data analysis techniques for empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in researchers' orientation towards sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.
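    The "basic yet powerful" procedures the paper defends, the t-test and correlation, take only a few lines with scipy. The service-quality scores below are simulated, purely to illustrate the calls.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical service-quality ratings from two customer segments
segment_a = rng.normal(loc=4.2, scale=0.5, size=80)
segment_b = rng.normal(loc=3.8, scale=0.5, size=80)

# Two-sample t-test: do the segments differ in mean perceived quality?
t_stat, p_value = stats.ttest_ind(segment_a, segment_b)

# Pearson correlation: perceived quality vs. overall satisfaction
quality = rng.uniform(1, 5, 100)
satisfaction = 0.8 * quality + rng.normal(scale=0.3, size=100)
r, p_corr = stats.pearsonr(quality, satisfaction)
```

Before reaching for structural equation modeling, a significant t-statistic and a strong correlation like these already answer many practical service-quality questions.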

  1. New Progress in High-Precision and High-Resolution Seismic Exploration Technique in Coal Industry of China

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    In the past twenty years, the proportion of coal in primary-energy consumption in China has generally been between 71.3% and 76.5%. The output of coal was 1.374 billion tons in 1996 and 1.21 billion tons in 1998, ranking first in the world. Coal in China is now mined mainly by mechanized methods, which were planned to reach 80% in major state-owned coal mines by 2000 according to government planning (Li et al., 1998; Tang Dejin, 1998). Compared with the USA and Australia, China has more complex coal geological structures. Building on the high-resolution seismic technique used in coal exploration, a new seismic technique with high precision and high resolution (2-D and 3-D) has been developed for detecting small geological structures in coal mine construction and production, to meet the needs of the large-scale popularization of mechanized coal mining in China. The technique is low in cost and requires a relatively short period of exploration, with high precision and a wide range of applications. In the middle of the 1980s it began to be used in pre-mining coal exploration on a trial basis, and it reached its peak in the 1990s; it has made significant progress in providing high-precision geological results for the construction and production of the coal industry in China, and is still in the ascendant. This paper discusses recent progress and the exploration capability and application range of the technique.

  2. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires. Such failures generally do not represent a great risk to personnel. Repairs needed to maintain the reliability of these vessels may require extensive interruption of operation, which in turn considerably impacts the profitability of the unit. Therefore the condition, progression and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D laser scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspect the equipment to generate maintenance or inspection recommendations, and compare results with previous inspections and baseline data. Until recently, coke drum structural analysis was traditionally performed by analyzing Stress Concentration Factors (SCF) through finite element analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, a new strain analysis technique, the Plastic Strain Index (PSI), was developed. This method, which is based on the API 579/ASME FFS standard failure limit, represents the state of the art in coke drum bulging severity assessment and has an excellent correlation with failure history. (author)

  3. Simulation of microsegregation and the solid/liquid interface progression in the concentric solidification technique

    International Nuclear Information System (INIS)

    A concentric solidification technique was employed to simulate experimentally the segregation of alloying elements during solidification at the centerline of continuously cast steel. Microstructural development of low carbon steel upon solidification has been observed in situ in a laser-scanning confocal microscope. Microscopic analyses following in situ observations, demonstrate that segregation occurring at steel slabs can reasonably be simulated by the use of the concentric solidification technique. The validity of these experimental simulations has been correlated with mathematical analyses using the Thermo-Calc and DICTRA (Diffusion Controlled Transformation) modeling tools. The effect of cooling rate on the sequence of events during solidification of Fe–0.18%C and Fe–4.2 wt%Ni peritectic alloys was studied and compared with the experimental observations

  4. Simulation of microsegregation and the solid/liquid interface progression in the concentric solidification technique

    Science.gov (United States)

    Aminorroaya, Sima; Reid, Mark; Dippenaar, Rian

    2011-03-01

    A concentric solidification technique was employed to simulate experimentally the segregation of alloying elements during solidification at the centerline of continuously cast steel. Microstructural development of low carbon steel upon solidification has been observed in situ in a laser-scanning confocal microscope. Microscopic analyses following in situ observations, demonstrate that segregation occurring at steel slabs can reasonably be simulated by the use of the concentric solidification technique. The validity of these experimental simulations has been correlated with mathematical analyses using the Thermo-Calc and DICTRA (Diffusion Controlled Transformation) modeling tools. The effect of cooling rate on the sequence of events during solidification of Fe-0.18%C and Fe-4.2 wt%Ni peritectic alloys was studied and compared with the experimental observations.

  5. EFFECT OF PROGRESSIVE MUSCLE RELAXATION TECHNIQUE ON PAIN RELIEF DURING LABOR

    OpenAIRE

    M. Bagharpoosh; G. Sangestani; M. Goodarzi

    2006-01-01

    Labor pain is a cause of stress and suffering, for which many women seek methods of relief. The aim of this study was to determine the effect of relaxation techniques on pain relief during labor. This study was carried out on 62 pregnant women referred to Fatemieh hospital (Hamadan, Iran) during their labor. They were selected using convenience sampling and were divided randomly into two groups. The first group (control) received routine ward care during labor, and the second ...

  6. Types of Maize Virus Diseases and Progress in Virus Identification Techniques in China

    Institute of Scientific and Technical Information of China (English)

    Cui Yu; Zhang Ai-hong; Ren Ai-jun; Miao Hong-qin

    2014-01-01

    There are a total of more than 40 reported maize viral diseases worldwide. Six of them have reportedly occurred in China: maize rough dwarf disease, maize dwarf mosaic disease, maize streak dwarf disease, maize crimson leaf disease, maize wallaby ear disease and corn lethal necrosis disease. This paper reviews their occurrence and distribution as well as virus identification techniques, in order to provide a basis for virus identification and diagnosis in corn production.

  7. Progress report on reversal and substitute element technique for thread calibration on CMMs

    DEFF Research Database (Denmark)

    Carmignato, Simone; Larsen, Erik; Sobiecki, Rene; De Chiffre, Leonardo

    This report is made as part of the EASYTRAC project, an EU project under the programme Competitive and Sustainable Growth (Contract No. G6RD-CT-2000-00188), coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low uncertainty calibrations on coordinate measuring machines. For this ... Germany and Tampere University of Technology (TUT) - Finland. The present report describes the feasibility and preliminary results of applying a reversal and substitute element technique for thread calibration.

  8. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor and then removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced X-ray emission and synchrotron-produced X-ray fluorescence, are also briefly discussed
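    The quantitative step of NAA is often done by the relative (comparator) method: the unknown is irradiated and counted alongside a standard of known content, so neutron flux, cross section and detector efficiency cancel in the ratio. The half-life, count data and masses below are hypothetical.

```python
from math import exp, log

HALF_LIFE_H = 2.58                 # hypothetical activation-product half-life (h)
LAMBDA = log(2) / HALF_LIFE_H      # decay constant (1/h)

def decay_corrected_rate(counts, count_time_h, decay_time_h):
    """Count rate corrected back to the end of irradiation."""
    return counts / count_time_h * exp(LAMBDA * decay_time_h)

# Standard containing a known 10 ug of the element, and the unknown sample,
# counted at different decay times after a common irradiation
std_rate    = decay_corrected_rate(12000, 0.5, 1.0)
sample_rate = decay_corrected_rate(18000, 0.5, 2.0)

mass_std_ug = 10.0
mass_sample_ug = mass_std_ug * sample_rate / std_rate
```

Because both activities are referred to the same end-of-irradiation instant, the mass follows from a simple ratio, which is what makes the comparator method so robust in practice.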

  9. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    OpenAIRE

    Nanhay Singh; Achin Jain; Ram Shringar Raw

    2013-01-01

    Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on a web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and taking an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper...

  10. Rain Attenuation Analysis using Synthetic Storm Technique in Malaysia

    Science.gov (United States)

    Lwas, A. K.; Islam, Md R.; Chebil, J.; Habaebi, M. H.; Ismail, A. F.; Zyoud, A.; Dao, H.

    2013-12-01

    A generated rain attenuation time series plays an important role in investigating rain fade characteristics in the absence of real fade measurements. A suitable conversion technique can be applied to a measured rain rate time series to produce rain attenuation data that can be utilized to understand rain fade characteristics. This paper focuses on the applicability of the synthetic storm technique (SST) for converting measured rain rate data to a rain attenuation time series. Its performance is assessed for time series generation over a tropical location, Kuala Lumpur, in Malaysia. Preliminary analysis shows that the SST gives satisfactory results in estimating the rain attenuation time series from rain rate measurements over this region.

  11. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.
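    The correlation computation whose error the paper analyzes is, at its core, alignment by cross-correlation: the argmax of the correlation between a noisy particle image and a reference estimates their relative shift. A one-dimensional FFT-based sketch (with a Gaussian blob standing in for a particle, and hypothetical noise) follows.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 256
# Gaussian blob as a stand-in reference "particle" (1-D for brevity)
template = np.exp(-0.5 * ((np.arange(n) - 100) / 4.0) ** 2)
true_shift = 17
observed = np.roll(template, true_shift) + 0.01 * rng.normal(size=n)

# Circular cross-correlation via FFT; its argmax estimates the shift
xcorr = np.fft.ifft(np.fft.fft(observed) * np.conj(np.fft.fft(template))).real
estimated_shift = int(np.argmax(xcorr))

aligned = np.roll(observed, -estimated_shift)
```

With colored rather than white noise, the correlation peak acquires exactly the kind of systematic bias the abstract describes, which is why a correction formula is needed.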

  12. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU nuclear power plants (NPPs) were established by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database for each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system to diagnose abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)
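    Identifying the dynamic characteristics of reactor internals from noise signals typically starts with spectral estimation: a structural resonance appears as a peak in the power spectral density, and a shift of that peak indicates a change of state. A minimal sketch using Welch's method on a simulated signal (an 8 Hz resonance buried in broadband noise, all values hypothetical):

```python
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(4)
fs = 200.0                        # sampling rate (Hz)
t = np.arange(0, 60.0, 1 / fs)    # 60 s of simulated neutron-noise signal

# A weak 8 Hz tone stands in for the vibration signature of an internal component
signal = 0.5 * np.sin(2 * np.pi * 8.0 * t) + rng.normal(size=t.size)

freqs, psd = welch(signal, fs=fs, nperseg=1024)
peak_freq = freqs[np.argmax(psd)]
```

Trending such peak frequencies over time against a baseline database is the essence of noise-based integrity monitoring: a loosened component lowers its resonance frequency.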

  13. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  14. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Charlton, William S

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
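The Bayesian comparison step described above, matching a measured isotopic ratio against a database of calculated ratios to infer burnup, can be sketched in a few lines. Everything below (the ratio-versus-burnup curve, the measured value, its uncertainty) is invented for illustration and is not output of NOVA or Monteburns:

```python
import numpy as np

# Hypothetical database: an isotopic ratio calculated on a burnup grid
# (a fake, monotone reactor-physics curve stands in for Monteburns output).
burnup_grid = np.linspace(1.0, 50.0, 50)                  # GWd/tU
db_ratio = 0.02 * burnup_grid / (1 + 0.01 * burnup_grid)  # illustrative curve

measured, sigma = 0.55, 0.02  # hypothetical measured ratio and 1-sigma error

# Bayesian update on the grid: flat prior, Gaussian measurement likelihood.
prior = np.ones_like(burnup_grid) / burnup_grid.size
likelihood = np.exp(-0.5 * ((db_ratio - measured) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum()

burnup_map = burnup_grid[np.argmax(posterior)]  # maximum a posteriori burnup
print(f"inferred burnup ≈ {burnup_map:.1f} GWd/tU")
```

In practice the database spans several fuel parameters (burnup, fuel type, age) and several measured ratios, so the posterior is taken over a multidimensional grid rather than a single axis.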

  15. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels

  16. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all of the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability, because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
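The difference between a deterministic and a probabilistic power analysis can be illustrated with a toy Monte Carlo sketch: propagate distributions on a few input variables through a simplified power model and report a capability distribution instead of a single value. The model and all distributions below are invented for illustration and have no relation to SPACE:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # Monte Carlo samples

# Hypothetical uncertain inputs for a toy solar-array power model.
array_area = 100.0                        # m^2, treated as fixed
solar_flux = rng.normal(1367.0, 5.0, n)   # W/m^2, solar constant variation
efficiency = rng.normal(0.14, 0.005, n)   # cell conversion efficiency
degradation = rng.uniform(0.95, 1.00, n)  # aging/contamination factor

# Each sample is one "deterministic" run; together they form a distribution.
power = array_area * solar_flux * efficiency * degradation  # W

mean_kw = power.mean() / 1000
p5, p95 = np.percentile(power, [5, 95]) / 1000
print(f"power capability ≈ {mean_kw:.1f} kW (90% band {p5:.1f}-{p95:.1f} kW)")
```

The fast probabilistic methods mentioned in the abstract exist precisely because brute-force Monte Carlo like this is too expensive for very large models; the output format (a capability distribution with percentiles) is the same.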

  17. Large areas elemental mapping by ion beam analysis techniques

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) XYZ computer-controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large-range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sorts of samples. Due to its particular characteristics, this is a unique device for multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion-beam analysis with the advantage of automatic rastering.

  18. Path Analysis and Causal Analysis: Variations of a Multivariate Technique of Measuring Student Politics

    Science.gov (United States)

    Braungart, Richard G.

    1975-01-01

    This paper tests a multivariate theory of family status, socialization, and student politics employing two different methodological techniques: (1) the popular path analysis method as compared with (2) a modified causal analysis approach. Results reveal that both techniques appear to be a reliable check on one another. (Editor/PG)

  19. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...

  20. Pulsed Photonuclear Assessment (PPA) Technique: CY 04 Year-end Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    J.L. Jones; W.Y. Yoon; K.J. Haskell; D.R. Norman; J.M. Zabriskie; J.W. Sterbentz; S.M. Watson; J.T. Johnson; B.D. Bennett; R.W. Watson; K. L. Folkman

    2005-05-01

    Idaho National Laboratory (INL), along with Los Alamos National Laboratory (LANL) and Idaho State University’s Idaho Accelerator Center (IAC), are developing an electron accelerator-based, photonuclear inspection technology for the detection of smuggled nuclear material within air-, rail-, and especially, maritime-cargo transportation containers. This CY04 report describes the latest progress in the development of the Pulsed Photonuclear Assessment (PPA) nuclear material inspection system, such as: (1) the identification of an optimal range of electron beam energies for interrogation applications, (2) the development of a new “cabinet safe” electron accelerator (i.e., Varitron II) to assess “cabinet safe-type” operations, (3) the numerical and experimental validation responses of nuclear materials placed within selected cargo configurations, (4) the fabrication and utilization of calibration pallets for inspection technology performance verification, (5) the initial technology integration of basic radiographic “imaging/mapping” with induced neutron and gamma-ray detection, (6) the characterization of electron beam-generated photon sources for optimal performance, (7) the development of experimentally determined Receiver-Operating-Characteristic curves, and (8) several other system component assessments. This project is supported by the Department of Homeland Security and is a technology component of the Science & Technology Active Interrogation Portfolio entitled “Photofission-based Nuclear Material Detection and Characterization.”

  1. Free liquid membranes - a novel progressive concept of phase interfaces for electrically enhanced microextraction techniques

    Czech Academy of Sciences Publication Activity Database

    Kubáň, Pavel; Boček, Petr

    Grupo VLS Print Solution, 2014 - (Guzman, N.; Taveres, M.). s. 64-64 [ITP & LACE 2014. International Symposium on Electro- and Liquid Phase-Separation Techniques /21./ and Latin-American Symposium on Biotechnology, Biomedical, Biopharmaceutical, and Industrial Applications of Capillary Electrophoresis and Microchip Technology /20./. 04.10.2014-08.10.2014, Natal] R&D Projects: GA ČR(CZ) GA13-05762S Institutional support: RVO:68081715 Keywords : free liquid membranes * microextractions * complex samples Subject RIV: CB - Analytical Chemistry, Separation

  2. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  3. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques generally focus on two things: obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group the methods in two: those which assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those which assume dependence between the variables and thus use an estimate of the within-class covariance matrix, which also estimates the correlations between...
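The two groups of methods can be contrasted numerically: on data with correlated features, a discriminant rule built on the full within-class covariance exploits correlation that a diagonal ("independence") rule discards. A minimal synthetic sketch (simulated data, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two classes with strongly correlated features; means differ along x1 only.
cov = np.array([[1.0, 0.8], [0.8, 1.0]])
L = np.linalg.cholesky(cov)
X0 = rng.standard_normal((200, 2)) @ L.T
X1 = rng.standard_normal((200, 2)) @ L.T + np.array([1.5, 0.0])
X = np.vstack([X0, X1])
y = np.r_[np.zeros(200), np.ones(200)]

def lda_predict(X, means, Sw):
    """Classify by Mahalanobis distance under a shared covariance Sw."""
    Sinv = np.linalg.inv(Sw)
    d = [np.einsum('ij,jk,ik->i', X - m, Sinv, X - m) for m in means]
    return np.argmin(d, axis=0)

means = [X0.mean(0), X1.mean(0)]
Sw_full = np.cov(np.vstack([X0 - means[0], X1 - means[1]]).T)
Sw_diag = np.diag(np.diag(Sw_full))  # independence assumption

acc_full = (lda_predict(X, means, Sw_full) == y).mean()
acc_diag = (lda_predict(X, means, Sw_diag) == y).mean()
print(f"full-covariance LDA: {acc_full:.2f}, diagonal LDA: {acc_diag:.2f}")
```

In high dimensions the trade-off reverses when the full covariance cannot be estimated reliably, which is exactly the regime the sparse/regularized methods in the paper address.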

  4. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    Science.gov (United States)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from - 125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to - 170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
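The final step described above, combining flash-method diffusivity with specific heat and density, is the relation λ = a·ρ·c_p. The room-temperature PTFE values below are typical literature-order figures used only to illustrate the arithmetic; the measured properties in the paper are temperature-dependent:

```python
# Thermal conductivity from laser-flash data: lambda = a * rho * c_p.
a = 0.12e-6         # m^2/s, thermal diffusivity (illustrative value)
rho = 2160.0        # kg/m^3, density (illustrative value)
cp = 1000.0         # J/(kg K), specific heat (illustrative value)

lam = a * rho * cp  # W/(m K)
print(f"thermal conductivity ≈ {lam:.3f} W/(m·K)")
```

The product lands near 0.26 W/(m·K), consistent with the low conductivity expected of an unfilled fluoropolymer.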

  5. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    OpenAIRE

    Demyanenko DV; Demyanenko VG; Breusova SV

    2016-01-01

    Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flowers in order to make amendments to existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and...

  6. Ion beam analysis techniques applied to large scale pollution studies

    International Nuclear Information System (INIS)

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs

  7. Analysis of Dynamic Road Traffic Congestion Control (DRTCC) Techniques

    OpenAIRE

    Pardeep Mittal; Yashpal Singh,; Yogesh Sharma

    2015-01-01

    Dynamic traffic light control at intersections has become one of the most active research areas in the development of intelligent transportation systems (ITS). Due to the consistent growth in urbanization and traffic congestion, a system was required which can control the timings of traffic lights dynamically with accurate measurement of traffic on the road. In this paper, an analysis of all the techniques that have been developed to automate traffic lights is presented. The efficacy...

  8. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

    Calcium hardness of water samples has been determined using a method based upon the Energy Dispersive X-ray Fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.
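EDXRF quantification of an element like Ca is commonly done via an external calibration curve: regress net peak intensity on standards of known concentration, then invert for an unknown. A generic sketch with entirely hypothetical counts and concentrations (not data from this study):

```python
import numpy as np

# Hypothetical calibration standards: known Ca concentrations (ppm) and
# the net Ca K-alpha peak counts measured for each by EDXRF.
conc_std = np.array([0.1, 1.0, 10.0, 50.0, 100.0])             # ppm Ca
counts_std = np.array([12.0, 105.0, 1010.0, 5050.0, 10100.0])  # net counts

# Linear calibration: counts = slope * concentration + intercept.
slope, intercept = np.polyfit(conc_std, counts_std, 1)

# Invert the calibration for an unknown water sample.
counts_unknown = 2500.0
ca_ppm = (counts_unknown - intercept) / slope
print(f"Ca concentration ≈ {ca_ppm:.1f} ppm")
```

Real EDXRF work adds matrix-effect corrections and background subtraction, but the calibration-and-invert structure is the same.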

  9. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  10. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
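The transfer-function idea above, treating the rear-side temperature as the front-side energy flux convolved with the calorimeter's thermal impulse response, lends itself to a short FFT sketch. The impulse response, beam pulse, and regularization constant below are all synthetic assumptions, not the calorimeter's measured response:

```python
import numpy as np

n = 1024
t = np.arange(n) * 0.01                # s, sample times

# Assumed thermal impulse response of the calorimeter (exponential decay,
# normalized to unit area): a stand-in for the measured transfer function.
h = np.exp(-t / 0.2)
h /= h.sum()

# Synthetic front-side energy flux: a square beam pulse.
flux = np.zeros(n)
flux[100:200] = 1.0

# "Measured" rear-side signal = flux convolved with the impulse response.
temp = np.fft.ifft(np.fft.fft(flux) * np.fft.fft(h)).real

# Deconvolve in the frequency domain with a small regularization term
# (Wiener-style) to keep noise-dominated high frequencies from blowing up.
H = np.fft.fft(h)
eps = 1e-3
flux_rec = np.fft.ifft(np.fft.fft(temp) * np.conj(H) / (np.abs(H) ** 2 + eps)).real

plateau = flux_rec[120:180].mean()     # should recover the pulse height ~1
baseline = flux_rec[400:600].mean()    # should recover ~0 away from the pulse
print(f"recovered pulse height ≈ {plateau:.3f}, baseline ≈ {baseline:.3f}")
```

The choice of `eps` is the practical knob the abstract alludes to when discussing signal noise and digital processing: too small and noise is amplified, too large and the beamlets are smeared.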

  11. Nuclear Spectral Analysis with Nonlinear Robust Fitting Techniques

    International Nuclear Information System (INIS)

    A new approach to nuclear spectral analysis based on nonlinear robust fitting techniques is described. Some of the fundamental differences in this approach from the more common methods of nuclear spectral analysis involving local peak searches are discussed. Although it requires knowledgeable interactive operation for best results and is computationally intensive, nuclear spectral analysis with nonlinear robust fitting has been shown to be capable of exceptional sensitivity in detecting weak radionuclides in the presence of strong interference and in noisy spectra, sparse spectra, and low-resolution spectra. This increased sensitivity is due to the simultaneous optimization of all the data for all the free variables of the analysis and the iterative construction of a well-determined continuum spanning the entire spectrum

  12. Nuclear Spectral Analysis with Nonlinear Robust Fitting Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lasche, G.P.; Coldwell, R.L.

    2001-06-17

    A new approach to nuclear spectral analysis based on nonlinear robust fitting techniques is described. Some of the fundamental differences in this approach from the more common methods of nuclear spectral analysis involving local peak searches are discussed. Although it requires knowledgeable interactive operation for best results and is computationally intensive, nuclear spectral analysis with nonlinear robust fitting has been shown to be capable of exceptional sensitivity in detecting weak radionuclides in the presence of strong interference and in noisy spectra, sparse spectra, and low-resolution spectra. This increased sensitivity is due to the simultaneous optimization of all the data for all the free variables of the analysis and the iterative construction of a well-determined continuum spanning the entire spectrum.
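The simultaneous optimization over the whole spectrum, with a continuum fitted jointly with the peaks, can be illustrated with an iteratively reweighted least-squares (IRLS) fit using Huber weights. This is a generic robust-fitting sketch on a synthetic spectrum, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
ch = np.arange(512, dtype=float)       # channel numbers

# Synthetic spectrum: linear continuum + one Gaussian peak + noise,
# plus a single outlier channel (e.g., a pile-up spike).
peak = np.exp(-0.5 * ((ch - 200) / 4.0) ** 2)
truth = 50.0 + 0.1 * ch + 30.0 * peak
spec = truth + rng.normal(0.0, 3.0, ch.size)
spec[350] += 200.0                     # outlier

# Fit continuum (intercept, slope) and peak amplitude simultaneously,
# down-weighting outlier channels with Huber weights (IRLS).
A = np.column_stack([np.ones_like(ch), ch, peak])  # design matrix
w = np.ones_like(ch)
for _ in range(10):
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], spec * sw, rcond=None)
    r = spec - A @ coef
    s = 1.4826 * np.median(np.abs(r))  # robust scale estimate (MAD)
    w = np.minimum(1.0, 1.345 * s / np.maximum(np.abs(r), 1e-12))

print(f"continuum: {coef[0]:.1f} + {coef[1]:.3f}*ch, peak amplitude: {coef[2]:.1f}")
```

Because continuum and peak are fitted together over all channels, a weak peak is not hostage to a locally misestimated background, which is the sensitivity argument made in the abstract.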

  13. Recent progress in structural integrity assessment techniques for components subject to service-induced degradation

    International Nuclear Information System (INIS)

    Nuclear power plant components are exposed to a wide range of environmental and loading conditions which can cause degradation over time. Aging embrittlement, erosion-corrosion, irradiation embrittlement, stress corrosion cracking, and corrosion fatigue are examples of aging mechanisms which could reduce structural margins in reactor components. The degradation effects from these mechanisms have been seen more frequently with the aging of the early nuclear plants. Since there is a strong incentive for keeping these older plants running for longer periods of time without compromising safety, proper plant management to minimize damage from degradation mechanisms is extremely important. Structural margin assessment, monitoring, and maintenance are important elements of such a management plan. Significant progress has recently been made in the understanding, evaluation and monitoring of these degradation mechanisms. This has also led to new requirements in the ASME Code design basis for nuclear plants. The current state of understanding and new developments in the ASME Code to address some of these degradation mechanisms are covered in this paper. Cast stainless steels used in pump casings and valve bodies have been known to experience thermal aging embrittlement at reactor operating temperatures. Recent predictive models of thermal aging effects on material toughness, developed at Argonne National Laboratory, are reviewed and applied to assess ASME Code structural margins of a reactor pump casing. A recent ASME Code Case provides methods for the evaluation and acceptance criteria for reactor pressure vessels having ductile fracture toughness values reduced below the requirements of 10CFR50 due to irradiation embrittlement. Background and application of this code case to an older BWR vessel is described. The occurrence of stress corrosion cracking in austenitic stainless steel piping highlighted the need for evaluation methods for structural margin assessment in piping

  14. History of activation analysis technique with charged particles in Uzbekistan

    International Nuclear Information System (INIS)

    Research on activation analysis with charged particles (CPAA) was started immediately after construction of the 150-cm cyclotron U-150 began in the 1960s. The CPAA laboratory, organized on the basis of the cyclotron and the neutron generator NG-200 (later I-150) in 1971, existed up to the end of 1985. We have used ion beams of these devices to elaborate two types of nuclear analysis techniques: 1. Delayed Nuclear Analysis (DNA), involving Charged Particle Activation Analysis (CPAA) and Fast Neutron Activation Analysis (FNAA); 2. Prompt Nuclear Analysis (PNA), involving the spectrometry of particle-induced X-ray emission (PIXE). DNA using accelerators has the following subdivisions: 1. Proton Activation Analysis (PAA); 2. Deuteron Activation Analysis (DAA); 3. 3He Activation Analysis (3HeAA); 4. 4He Activation Analysis (4HeAA or α-AA); 5. Fast Neutron Activation Analysis (FNAA). PAA and DAA found wide application and were used to achieve good sensitivity in the determination of more than 20 chemical elements in some materials of high purity. For example, we have applied these techniques for the determination of Li, B, C, N, O and F at levels of 10⁻⁸-10⁻¹⁰ g/g in different high purity semiconductors (Si, SiC, Ge, AsGa, InP et al.), nonferrous metals (Li, Be, Zr, Nb, Mo, Ta, W, Re, Al, Ti etc.), nonconductive materials (different glasses, optical materials, diamonds et al.) and environmental objects (soil, plants, water). The techniques provided good results on the determination of B, C and N contents and others. 3HeAA and 4HeAA were generally used to determine O and C contents in semiconductors and metals of high purity. We have elaborated rapid radiochemical techniques for the separation of short-lived positron emitters. For example, for the separation of 15O, formed by the nuclear reaction 16O(3He,α)15O, the reducing fusion technique was used. Radionuclide 11C was separated chemically by the oxidisation of samples in the

  15. New laser spectroscopic technique for stable-isotope ratio analysis

    International Nuclear Information System (INIS)

    A new approach to stable-isotope ratio analysis based on atomic hyperfine structure is demonstrated. This laser spectroscopic scheme is virtually interference-free. A minor constituent in a complex matrix can be selectively analyzed without extensive sample preparation. A single-frequency tunable cw ring dye laser is used as the excitation source and a demountable cathode discharge is used as the atomizer and detector. Samples are electrodeposited on the demountable cathode and hyperfine profiles are collected by optogalvanic detection. By spectral deconvolution, the relative abundances of all isotopes present can be determined with good accuracy and precision. The technique is demonstrated for copper contents as low as 1.6 ppm, using the atomic hyperfine structure of the Cu I 578.2 nm non-resonance transition. It is also successfully tested for the analysis of copper isotopes in human blood. The sensitivity of Doppler-free polarization spectroscopy in atomic flames is shown to be competitive with other sensitive laser techniques such as the fluorescence spectrometric methods. Improved detectability of polarization rotation and excellent suppression of flame background noise enable this method to achieve detection limits at the parts-per-trillion level for sodium and 37 ppb for barium. The spectral resolution is suitable for isotopic analysis, and the technique offers excellent selectivity and minimum spectral interference

  16. The Analysis of the Thematic Progression Patterns in "The Great Learning"

    Institute of Scientific and Technical Information of China (English)

    王利娜; 金俊淑

    2007-01-01

    This paper briefly introduces the thematic progression patterns in Systemic-Functional Grammar, and then analyzes their application in "The Great Learning", one of the classics of Confucius and his disciples. The analysis of the thematic progression patterns of "The Great Learning" is meaningful for both understanding and appreciating the text.

  17. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    Directory of Open Access Journals (Sweden)

    Alexander Hexemer

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  18. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    Science.gov (United States)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  19. Human Capital Investment and an Analysis of Its Progressive Profit

    Institute of Scientific and Technical Information of China (English)

    张德平; 孙诚

    2004-01-01

A skilled labor force, cultivated by investing funds and time in education, is undoubtedly essential to the operation of sophisticated machines in production, but also to the creation of new ideas and methods in production and other economic activities, and ultimately to the progressive increase of material capital. Strengthening investment in human capital and enriching the stock of human capital is therefore of primary importance, especially for China, in the 21st century.

  20. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) of the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and the time-history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure), and in doing so to assess the applicability of traditional sensitivity analysis techniques.

  1. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

Full text: We present a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. Biomarkers (fatty acids, hydrocarbons and sterols) were isolated in sufficient quantity for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, a sufficient quantity of each target individual compound (>50 μgC) must be separated and recovered. Yields of target compounds ranged over C14 to C40 n-alkanes, with approximately 80% recovery for higher molecular weight compounds above C30. Compound-specific radiocarbon analysis of organic compounds, like compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in marine systems. Using the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific, which showed its potential as a chronology tool for estimating sediment age from organic matter in paleoceanographic studies, in areas where sufficient planktonic foraminifera for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  2. Progress in bionic information processing techniques for an electronic nose based on olfactory models

    Institute of Scientific and Technical Information of China (English)

    LI Guang; FU Jun; ZHANG Jia; ZHENG JunBao

    2009-01-01

As a novel bionic analytical technique, the electronic nose, inspired by the mechanisms of the biological olfactory system and integrated with modern sensing, electronic and pattern recognition technology, has been widely used in many areas. Moreover, recent basic research findings in biological olfaction, combined with computational neuroscience, promote its development both in methodology and in application. In this review, the basic information processing principles of biological and artificial olfaction are summarized and compared, and four olfactory models and their applications to electronic noses are presented. Finally, a chaotic olfactory neural network is detailed, and the utilization of several biologically oriented learning rules and its spatiotemporal dynamic properties for electronic noses are discussed. Integrating the various phenomena of biological olfaction and their mechanisms into an electronic nose context for information processing will not only make electronic noses more bionic, but also let them perform better than conventional methods. However, many problems remain, which should be solved by further cooperation between theorists and engineers.

  3. Techniques for getting the most from an evaluation: Review of methods and results for attributing progress, non-energy benefits, net to gross, and cost-benefit

    International Nuclear Information System (INIS)

As background for several evaluation and attribution projects, the authors conducted research on best practices in a few key areas of evaluation. We focused on techniques used in measuring market progress, enhanced techniques for attributing net energy impacts, and examination of omitted program effects, particularly net non-energy benefits. The research involved a detailed literature review, interviews with program managers and evaluators across the US, and refinement of techniques used by the authors in their evaluation work. The object of the research was to uncover successful (and unsuccessful) approaches to key aspects of evaluation work. The research identified tracking areas that are becoming more commonly used by agencies to assess progress in the market. In addition, the authors' detailed research on a number of impact and attribution evaluations has led to recommendations on key practices that we believe comprise elements of best practice for assessing attributable program effects. Specifically, we have identified a number of useful steps to improve the attribution of impacts to program interventions. Information on techniques for attribution/causality work for a number of programs is presented, including market transformation programs that rely on marketing, advertising, training, and mid-stream incentives and work primarily with a network of participating mid-market actors. The project methods and results are presented and include: theory-based evaluation, indicators, and hypothesis testing; enhanced measurement of free riders, spillover, and other effects, and attribution of impacts using distributions and ranges of measure and intervention impacts, rather than less reliable point estimates; attribution of program-induced non-energy benefits; net-to-gross, benefit-cost analysis, and incorporation of scenario/risk analysis of results; comparison of net-to-gross results across program types to explore patterns and

  4. Metal trace analysis by PIXE and PDMS techniques

    International Nuclear Information System (INIS)

The risk to human health posed by exposure to aerosols depends on the intake pattern, the mass concentration and the speciation of the elements present in airborne particles. In this work, plasma desorption mass spectrometry (PDMS) was used as a technique complementary to particle-induced X-ray emission (PIXE) to characterize aerosol samples collected in the environment. The PIXE technique identifies the elements present in the sample and determines their mass concentrations; PDMS was used to identify the speciation of these elements. The aerosol samples were collected using a 6-stage cascade impactor (CI) at two sites in Rio de Janeiro City. One is an island (Fundao Island) in Guanabara Bay close to an industrial zone; the other, in Gavea, is a residential zone close to a lagoon and the seashore. The mass median aerodynamic diameter (MMAD) measured indicated that the airborne particulates were in the fine fraction of the aerosols collected at both locations. In order to identify the contribution of seawater particles from Guanabara Bay to the aerosols, seawater samples were also collected at Fundao Island. The samples were analyzed by the PIXE and PDMS techniques. The analysis of the results suggests that the aerosols differ between the two sampling sites and that Guanabara Bay seawater particles contribute to the aerosols collected at Fundao Island. PIXE allows identification and quantification of elements heavier than Na (Z=11), while PDMS allows identification of the organic and inorganic compounds present in the samples; used as complementary techniques, they provide important information for aerosol characterization

  5. Privacy-Preserving Data Analysis Techniques by using different modules

    Directory of Open Access Journals (Sweden)

    Payal P. Wasankar

    2013-11-01

Competing parties who hold private data may collaboratively conduct privacy-preserving distributed data analysis (PPDA) tasks to learn beneficial data models or analysis results. For example, different credit card companies may try to build better models for credit card fraud detection through PPDA tasks. Similarly, competing companies in the same industry may try to combine their sales data to build models that predict future sales. In many of these cases, the competing parties have different incentives. Although certain PPDA techniques guarantee that nothing other than the final analysis result is revealed, it is impossible to verify whether or not the participating parties are truthful about their private input data.
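The aggregation step underlying many PPDA tasks (though not the truthfulness question the record raises) can be illustrated with a minimal additive secret-sharing sum; the modulus, party inputs and dealer-style share distribution below are toy assumptions:

```python
import random

def share(value, n_parties, modulus=2**61 - 1, rng=random.Random(7)):
    """Split an integer into n additive shares that sum to value (mod modulus)."""
    shares = [rng.randrange(modulus) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % modulus)
    return shares

def secure_sum(private_inputs, modulus=2**61 - 1):
    """Each party splits its input into shares, one per party. Each party
    publishes only the sum of the shares it received, so no individual
    input is revealed to any single party; the share-sums combine to the total."""
    n = len(private_inputs)
    all_shares = [share(v, n, modulus) for v in private_inputs]
    partial = [sum(all_shares[i][j] for i in range(n)) % modulus for j in range(n)]
    return sum(partial) % modulus

# three parties combine sales figures without revealing them individually
print(secure_sum([120, 340, 55]))  # → 515
```

Real PPDA protocols add secure channels and malicious-party protections; this sketch shows only why the arithmetic hides individual inputs.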

  6. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

One of the difficulties that goal-oriented requirements analysis encounters is that the efficiency of goal refinement depends on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systematized analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integrating goal and problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  7. Summary of recent AAEC research on noise analysis techniques

    International Nuclear Information System (INIS)

The research establishment of the AAEC has, over the last decade, developed a comprehensive data analysis facility capable of dealing with random signals in the range 0 to 300 kHz. All the conventional spectral and correlation functions can be estimated from either analogue or digital signals. This facility, together with a wide range of available sensors, has been used to detect, record and analyse data derived from real and simulated nuclear plant. Although the main emphasis of the work has been to develop an experimental capability and acquire basic skills in noise analysis techniques, some application has been made to real-life practical problems. The following is a brief summary of work carried out at Lucas Heights during the period 1977 to 1980. The comments are intentionally concise, as references to detailed papers are given

  8. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e., the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
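For a minimal two-node network, the uncertain-evidence computation can be sketched with Jeffrey's rule: mix the hard-evidence posteriors weighted by the user-specified distribution over the evidence node's states. The network, node names and numbers below are hypothetical, not from the paper:

```python
def posterior_given_evidence(prior_h, p_e_given_h, e_state):
    """Exact posterior P(H | E=e) for a two-node network H -> E (Bayes' rule)."""
    joint = {h: prior_h[h] * p_e_given_h[h][e_state] for h in prior_h}
    z = sum(joint.values())
    return {h: p / z for h, p in joint.items()}

def posterior_soft_evidence(prior_h, p_e_given_h, q_e):
    """Jeffrey's rule: the evidence node E is observed only as a distribution
    q(e); mix the hard-evidence posteriors weighted by q."""
    result = {h: 0.0 for h in prior_h}
    for e_state, weight in q_e.items():
        post = posterior_given_evidence(prior_h, p_e_given_h, e_state)
        for h in result:
            result[h] += weight * post[h]
    return result

prior = {"faulty": 0.1, "ok": 0.9}
sensor = {"faulty": {"alarm": 0.9, "quiet": 0.1},
          "ok":     {"alarm": 0.2, "quiet": 0.8}}
# the sensor reading is itself uncertain: 70% alarm, 30% quiet
print(posterior_soft_evidence(prior, sensor, {"alarm": 0.7, "quiet": 0.3}))
```

With a hard reading ({"alarm": 1.0}) the result collapses to the ordinary Bayes posterior, which is one way to sanity-check such an implementation.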

  9. Monitoring the progress of build-up formation in fatty alcohol plant pipelines using gamma-ray scanning techniques

    International Nuclear Information System (INIS)

A study was conducted to monitor the progress of material build-up formation in fatty alcohol plant pipelines using gamma-ray absorption techniques. The investigation was performed periodically at a few selected locations defined as critical areas. Before performing a scan, the intensity of the gamma rays through the clean pipe must be determined as a reference. From the gamma-ray absorption principle, the initial intensity of the radiation and the intensity after it passes through a material differ, so the thickness of the build-up in the pipeline can be determined. Based on this early information about the actual condition of the build-up formation, a more effective maintenance schedule can be planned, and the maintenance cost due to build-up formation can be minimised. (Author)
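The thickness determination follows from the Beer-Lambert absorption law, I = I0·exp(-μx); a minimal sketch, with hypothetical count rates and attenuation coefficient:

```python
import math

def buildup_thickness(i_clean, i_scan, mu_linear):
    """Estimate deposit thickness (cm) from gamma-ray transmission.

    i_clean:   count rate through the clean pipe (the reference scan)
    i_scan:    count rate through the pipe with build-up
    mu_linear: linear attenuation coefficient of the deposit (1/cm)

    From I = I0 * exp(-mu * x), the extra absorber thickness is
    x = ln(I0 / I) / mu.
    """
    return math.log(i_clean / i_scan) / mu_linear

# hypothetical numbers: reference 1200 cps, scan 800 cps, mu = 0.2 /cm
print(round(buildup_thickness(1200.0, 800.0, 0.2), 3))  # → 2.027
```

A scan equal to the reference gives zero thickness, which is the clean-pipe baseline the procedure establishes first.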

  10. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  11. Soil Data Analysis Using Classification Techniques and Soil Attribute Prediction

    Directory of Open Access Journals (Sweden)

    Jay Gholap

    2012-05-01

Agricultural research has benefited from technical advances such as automation and data mining. Today, data mining is used in a vast range of areas, and many off-the-shelf data mining products and domain-specific data mining applications are available, but data mining on agricultural soil datasets is a relatively young research field. The large amounts of data that are nowadays virtually harvested along with the crops have to be analyzed and should be used to their full extent. This research aims at the analysis of a soil dataset using data mining techniques. It focuses on the classification of soil using the various algorithms available. Another important purpose is to predict untested attributes using regression techniques, and to implement automated soil sample classification.
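A classification step of this kind can be sketched with a simple k-nearest-neighbour vote over numeric soil attributes; the attributes, training samples and class labels below are invented for illustration, not from the paper's dataset:

```python
import math
from collections import Counter

def knn_classify(samples, labels, query, k=3):
    """Classify a soil sample from numeric attributes by majority vote
    among its k nearest neighbours (Euclidean distance)."""
    dists = sorted(
        (math.dist(s, query), lab) for s, lab in zip(samples, labels)
    )
    votes = Counter(lab for _, lab in dists[:k])
    return votes.most_common(1)[0][0]

# hypothetical training data: (pH, organic carbon %, sand %) per sample
samples = [(6.5, 1.2, 40), (6.8, 1.0, 45), (5.2, 3.1, 20),
           (5.0, 2.8, 25), (7.9, 0.4, 70), (8.1, 0.3, 75)]
labels = ["loam", "loam", "clay", "clay", "sandy", "sandy"]

print(knn_classify(samples, labels, (5.1, 2.9, 22)))  # → clay
```

In practice the attributes would be scaled before computing distances; the raw values here keep the sketch short.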

  12. Comparative Analysis of Partial Occlusion Using Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    N.Nallammal

    2013-04-01

This paper presents a comparison of face recognition techniques under partial occlusion, showing which technique produces the better total success rate. Partial-occlusion face recognition is especially useful for people part of whose face is scarred or damaged and thus needs to be covered; either the top part/eye region or the bottom part of the face is recognized, respectively. The partial face information is tested with Principal Component Analysis (PCA), Non-negative Matrix Factorization (NMF), Local NMF (LNMF) and Spatially Confined NMF (SFNMF). The comparative results show a recognition rate of 95.17% with r = 80 using SFNMF for the bottom face region; the eye region achieves 95.12% with r = 10 using LNMF.

  13. Gamma absorption technique in elemental analysis of composite materials

    International Nuclear Information System (INIS)

Highlights: ► Application of the gamma-ray absorption technique in elemental analysis. ► Determination of the elemental composition of some bronze and gold alloys. ► Determination of some heavy elements in water. - Abstract: Expressions for calculating the elemental concentrations of composite materials based on a gamma absorption technique are derived. These expressions provide quantitative information about the elemental concentrations of materials. Calculations are carried out to estimate the concentrations of copper and gold in some bronze and gold alloys. The method was also applied to estimate the concentrations of some heavy elements in a water matrix, highlighting the differences with photon attenuation measurements. Theoretical mass attenuation coefficient values were obtained using the WinXCom program. A high-resolution gamma-ray spectrometer based on a high-purity germanium (HPGe) detector was employed to measure the attenuation of a strongly collimated monoenergetic gamma beam through the samples.
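For a two-component material, such expressions rest on the mixture rule for mass attenuation coefficients; a minimal sketch, with illustrative (not tabulated) coefficient values for a bronze-like Cu/Sn alloy:

```python
def weight_fraction(mu_measured, mu_a, mu_b):
    """Two-component mixture rule for mass attenuation coefficients:
        mu_mix = w_a * mu_a + (1 - w_a) * mu_b
    Solve for the weight fraction w_a of component A."""
    return (mu_measured - mu_b) / (mu_a - mu_b)

# illustrative mass attenuation coefficients (cm^2/g) at one gamma energy;
# real values would come from a database such as WinXCom
mu_cu, mu_sn = 0.0589, 0.0963       # bronze modelled as Cu + Sn
mu_alloy = 0.0650                    # deduced from the measured beam attenuation
w_cu = weight_fraction(mu_alloy, mu_cu, mu_sn)
print(round(w_cu, 3))  # → 0.837
```

The measured mu_alloy itself would be obtained from the collimated-beam transmission and the sample's areal density; only the final algebraic step is shown here.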

  14. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  15. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

The implementation of experimental techniques for the characterisation of cultural heritage materials must take several requirements into account. The complexity of these historical materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, owing to the precious nature of artworks, it is necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry, as well as ion beam analysis (IBA), are analytical tools at the service of cultural heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and inorganic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author)

  16. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

Flow injection analysis is increasingly used as a process-control analytical technique in many industries. It involves injecting the sample at a constant rate into a steadily flowing stream of reagent and passing the mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and their gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than ±0.5%. With certain modifications, this technique is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within ±1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of ±5%. (author). 4 refs., 2 figs., 1 tab
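A colorimetric response is converted to a concentration through a calibration line; a sketch using ordinary least squares, with invented standards and absorbances rather than the paper's data:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b for a calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# hypothetical calibration: U(VI) standards (mg/ml) vs absorbance at 410 nm
conc = [50.0, 100.0, 200.0, 300.0, 360.0]
absorbance = [0.11, 0.21, 0.40, 0.61, 0.72]
a, b = fit_line(conc, absorbance)

# invert the line to read back the concentration of an unknown sample
unknown = (0.35 - b) / a
print(round(unknown, 1))
```

The quoted ±0.5% precision would be assessed from repeat injections of the same standard, a step omitted from this sketch.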

  17. Post Buckling Progressive Failure Analysis of Composite Laminated Stiffened Panels

    Science.gov (United States)

    Anyfantis, Konstantinos N.; Tsouvalis, Nicholas G.

    2012-06-01

    The present work deals with the numerical prediction of the post buckling progressive and final failure response of stiffened composite panels based on structural nonlinear finite element methods. For this purpose, a progressive failure model (PFM) is developed and applied to predict the behaviour of an experimentally tested blade-stiffened panel found in the literature. Failure initiation and propagation is calculated, owing to the accumulation of the intralaminar failure modes induced in fibre reinforced composite materials. Hashin failure criteria have been employed in order to address the fiber and matrix failure modes in compression and tension. On the other hand, the Tsai-Wu failure criterion has been utilized for addressing shear failure. Failure detection is followed with the introduction of corresponding material degradation rules depending on the individual failure mechanisms. Failure initiation and failure propagation as well as the post buckling ultimate attained load have been numerically evaluated. Final failure behaviour of the simulated stiffened panel is due to sudden global failure, as concluded from comparisons between numerical and experimental results being in good agreement.

  18. An Archetypal Analysis on The Pilgrim’s Progress

    Institute of Scientific and Technical Information of China (English)

    杨洛琪

    2014-01-01

John Bunyan (1628-1688) is one of the most remarkable figures in 17th-century English literature. He is famous for his authorship of The Pilgrim's Progress and is one of the world's most widely read Christian writers. This thesis attempts to use archetypal theory to analyze the archetypes of Christian culture in The Pilgrim's Progress. According to the theory of archetype, Bunyan's use of biblical images and themes can be called archetypes; these archetypes involve two aspects of the work, the biblical image of water and its themes. Therefore, this thesis tries to explore the underlying archetypal elements so as to reveal the work's literary treasures by resorting to the theory of archetypal criticism.

  19. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  20. Elimination and adsorptive transfer techniques in an oligonucleotide analysis

    Czech Academy of Sciences Publication Activity Database

    Jelen, František; Trnková, L.; Kouřilová, Alena; Kejnovská, Iva; Vorlíčková, Michaela

Xi'an, 2009. P12. [International Symposium on Frontiers of Electrochemical Science and Technology. 12.08.2009-15.08.2009, Xi'an] R&D Projects: GA AV ČR(CZ) IAA400040804; GA AV ČR(CZ) IAA100040701; GA AV ČR(CZ) KAN200040651; GA MŠk(CZ) LC06035 Institutional research plan: CEZ:AV0Z50040507; CEZ:AV0Z50040702 Keywords: elimination voltammetry * transfer techniques * analysis of oligonucleotides Subject RIV: BO - Biophysics

  1. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  2. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light of a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide until the point where the amplitude of surface waves is large enough to scatter out the light from the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and also statistical analysis of the jet disruption is possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  3. Acceleration of multivariate analysis techniques in TMVA using GPUs

    CERN Document Server

    Hoecker, A; Therhaag, J; Washbrook, A

    2012-01-01

    A feasibility study into the acceleration of multivariate analysis techniques using Graphics Processing Units (GPUs) will be presented. The MLP-based Artificial Neural Network method contained in the TMVA framework has been chosen as a focus for investigation. It was found that the network training time on a GPU was lower than for CPU execution as the complexity of the network was increased. In addition, multiple neural networks can be trained simultaneously on a GPU within the same time taken for single network training on a CPU. This could be potentially leveraged to provide a qualitative performance gain in data classification.

  4. Multi-element study in aluminium by activation analysis technique

    International Nuclear Information System (INIS)

Instrumental activation analysis is a relatively quick technique for determining the elemental composition of materials. It is used mainly for trace element determination, but in the case of major elements some considerations are necessary, since different nuclear reactions occur because the neutron flux is a mixture of thermal and fast neutrons. This can lead to the apparent presence and/or erroneous quantification of some elements. This work describes the analysis of a container piece of approximately 85% aluminium. The elements Zn, Mn, Sb, Ga, Cu, Cl and Sm were determined. (Author)

  5. Techniques for Improving Filters in Power Grid Contingency Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Adolf, Robert D.; Haglin, David J.; Halappanavar, Mahantesh; Chen, Yousu; Huang, Zhenyu

    2011-12-31

    In large-scale power transmission systems, predicting faults and preemptively taking corrective action to avoid them is essential to preventing rolling blackouts. The computational study of the constantly-shifting state of the power grid and its weaknesses is called contingency analysis. Multiple-contingency planning in the electrical grid is one example of a complex monitoring system where a full computational solution is operationally infeasible. We present a general framework for building and evaluating resource-aware models of filtering techniques for this type of monitoring.

  6. Meta-analysis in Stata: history, progress and prospects

    OpenAIRE

    Jonathan Sterne

    2004-01-01

    Systematic reviews of randomised trials are now widely recognised to be the best way to summarise the evidence on the effects of medical interventions. A systematic review may (though it need not) contain a meta-analysis, `a statistical analysis which combines the results of several independent studies considered by the analyst to be "combinable" '. The first researcher to do a meta-analysis was probably Karl Pearson, in 1904. Sadly, Stata was not available at this time. The first Stata comma...

  7. Experimental and Automated Analysis Techniques for High-resolution Electrical Mapping of Small Intestine Slow Wave Activity

    OpenAIRE

    Angeli, Timothy R.; O'Grady, Gregory; Paskaranandavadivel, Niranchan; Jonathan C Erickson; Du, Peng; Pullan, Andrew J; Bissett, Ian P.; Cheng, Leo K

    2013-01-01

    Background/Aims Small intestine motility is governed by an electrical slow wave activity, and abnormal slow wave events have been associated with intestinal dysmotility. High-resolution (HR) techniques are necessary to analyze slow wave propagation, but progress has been limited by few available electrode options and laborious manual analysis. This study presents novel methods for in vivo HR mapping of small intestine slow wave activity. Methods Recordings were obtained from along the porcine...

  8. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, on both the individual and the organizational level, is always accompanied by a search for others' opinions. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve as a boon to markets if its semantic orientations are deliberated. Opinion mining and sentiment analysis are the formalizations for studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for the use of the huge volume of opinionated data recorded. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.
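As a minimal illustration of the lexicon-based end of the techniques reviewed here, the following sketch scores text by counting words from small positive and negative word lists (the lexicon and scoring rule are invented for illustration, not taken from the paper):

```python
# Minimal lexicon-based sentiment scorer (illustrative; real systems use
# much larger lexicons or supervised classifiers).
POSITIVE = {"good", "great", "excellent", "love", "like"}
NEGATIVE = {"bad", "poor", "terrible", "hate", "dislike"}

def sentiment_score(text):
    tokens = text.lower().split()
    # Net count of positive minus negative tokens decides the label.
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment_score("I love this great phone"))  # positive
```

Practical systems extend this with negation handling, weighted lexicons, and classifiers trained on labeled reviews.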

  9. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
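The distinction the abstract draws between traditional methods and global methods that capture input interactions can be illustrated with a toy model containing an interaction term; a one-at-a-time (OAT) perturbation gives answers that depend on where you perturb (the model and numbers are invented for illustration):

```python
# Toy model with an interaction term: y = x1 + 2*x2 + 3*x1*x2
def model(x1, x2):
    return x1 + 2.0 * x2 + 3.0 * x1 * x2

# One-at-a-time (OAT) local sensitivities at a base point: forward
# finite-difference derivatives, each input perturbed with the other fixed.
def oat_sensitivities(base=(0.5, 0.5), delta=1e-6):
    x1, x2 = base
    y0 = model(x1, x2)
    s1 = (model(x1 + delta, x2) - y0) / delta
    s2 = (model(x1, x2 + delta) - y0) / delta
    return s1, s2

# At (0.5, 0.5) the OAT result is about (2.5, 3.5); at (0.0, 0.0) it is
# about (1.0, 2.0). The interaction term makes local sensitivities
# base-point dependent, which is exactly what global methods expose.
```

This is why, as the abstract notes, global methods that account for interactions between inputs can generate insight that local, one-factor-at-a-time methods miss.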

  10. Validation of Design and Analysis Techniques of Tailored Composite Structures

    Science.gov (United States)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment was performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed-form analysis based on a theoretical model of a single-cell tailored box beam, and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed-form code consistently predicts the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed-form analysis. Differences between the closed-form representation and the wing box specimen caused large errors in the twist prediction, so the closed-form prediction of twist could not be validated by this test.

  11. Envelopment technique and topographic overlays in bite mark analysis

    Science.gov (United States)

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    Aims and Objectives: The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Materials and Methods: Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. Results: The scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare the four sequential overlays and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Conclusion: Through our study, we conclude that the third and fourth cuts were the best among the four cuts, and that the inter- and intraoperator reliability were statistically significant at the 5% level, that is, with 95% confidence (P < 0.05). PMID:26816458
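The inter-operator reliability analysis above uses Spearman's rank correlation; a from-scratch sketch of that statistic is shown below (assuming no tied scores, with invented example data, not values from the study):

```python
# Spearman rank correlation, computed from the classic formula
# rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), valid when there are no ties.
def spearman_rho(x, y):
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0.0] * len(values)
        for rank_pos, idx in enumerate(order, start=1):
            r[idx] = float(rank_pos)
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# Two examiners scoring the same five overlays (hypothetical scores):
print(spearman_rho([1, 2, 3, 4, 5], [2, 1, 4, 3, 5]))  # 0.8
```

Tied scores would require average ranks, which statistical packages handle automatically.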

  12. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Directory of Open Access Journals (Sweden)

    Amin Torabipour

    2014-11-01

    Full Text Available This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS 18 and DEAP 2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity, and the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996 respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. The productivity rate of the hospitals generally had an increasing trend; however, the total average productivity decreased. Among the several components of total productivity, variation in technological efficiency had the highest impact on the reduction of the total average productivity.
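For the degenerate single-input, single-output case, DEA efficiency reduces to each unit's output/input ratio scaled by the best observed ratio; the toy sketch below illustrates that idea (numbers invented; the study's multi-input analysis requires solving a linear program per hospital, as the DEAP software does):

```python
# Toy DEA-style efficiency for one input and one output: each unit's
# output/input ratio, normalized by the best ratio in the sample.
def dea_efficiency(inputs, outputs):
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Three hypothetical hospitals: (beds, treated patients in thousands)
print(dea_efficiency([10, 20, 30], [5, 8, 18]))
```

The unit with the best ratio scores 1.0 (efficient); the others score their fraction of that frontier. Real DEA generalizes this frontier to many inputs and outputs.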

  13. Homogenization techniques for the analysis of porous SMA

    Science.gov (United States)

    Sepe, V.; Auricchio, F.; Marfia, S.; Sacco, E.

    2016-05-01

    In this paper the mechanical response of porous Shape Memory Alloy (SMA) is modeled. The porous SMA is considered as a composite medium made of a dense SMA matrix with voids treated as inclusions. The overall response of this very special composite is deduced by performing a micromechanical and homogenization analysis. In particular, the incremental Mori-Tanaka averaging scheme is provided; then, the Transformation Field Analysis procedure in its uniform and nonuniform approaches, UTFA and NUTFA respectively, is presented. The extension of the NUTFA technique proposed by Sepe et al. (Int J Solids Struct 50:725-742, 2013) is presented to investigate the response of porous SMA characterized by closed and open porosity. A detailed comparison between the outcomes provided by the Mori-Tanaka, the UTFA and the proposed NUTFA procedures for porous SMA is presented, through numerical examples for two- and three-dimensional problems. In particular, several values of porosity and different loading conditions, inducing pseudoelastic effect in the SMA matrix, are investigated. The predictions assessed by the Mori-Tanaka, the UTFA and the NUTFA techniques are compared with the results obtained by nonlinear finite element analyses. A comparison with experimental data available in literature is also presented.

  14. Use of statistical techniques in analysis of biological data

    Directory of Open Access Journals (Sweden)

    Farzana Perveen

    2012-07-01

    Full Text Available From the ancient age to modern times, not a single area can be found where statistics is not playing a vital role. Statistics has now been recognized and universally accepted as an essential component of research in every branch of science. From agriculture, biology, education, economics, business, management, medicine, engineering and psychology to environment and space, statistics plays a significant role. Statistics is extensively used in the biological sciences. Specifically, biostatistics is the branch of applied statistics that concerns the application of statistical methods to medical, genetic and biological problems. In this context, one important step is the appropriate and careful analysis of statistical data to get precise results. It is pertinent to mention that the majority of statistical tests and techniques are applied under certain mathematical assumptions; therefore, it is necessary to realize the importance of the relevant assumptions. Among these, the assumptions of normality (normal distribution of the population(s)) and variance homogeneity are the most important. If these assumptions are not satisfied, the results may be potentially misleading. It is, therefore, suggested to check the relevant assumption(s) about the data before applying statistical test(s) to get valid results. In this study, a few techniques/tests are described for checking the normality of a given set of data. Since analysis of variance (ANOVA) models are extensively used in biological research, the assumptions underlying ANOVA are also discussed. Non-parametric statistics are also described to some extent.
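A quick numerical check of the normality assumption discussed above is to compute sample skewness and excess kurtosis, both of which are near zero for normally distributed data (an illustrative diagnostic only; formal tests such as Shapiro-Wilk live in statistical packages):

```python
import statistics

# Sample skewness and excess kurtosis using population-style moments.
# Skewness ~ 0 and excess kurtosis ~ 0 are consistent with normality.
def skew_kurtosis(data):
    n = len(data)
    m = statistics.fmean(data)
    s = statistics.pstdev(data)
    skew = sum((x - m) ** 3 for x in data) / (n * s ** 3)
    kurt = sum((x - m) ** 4 for x in data) / (n * s ** 4) - 3.0
    return skew, kurt

# Symmetric data has zero skewness (kurtosis reflects the flat shape):
print(skew_kurtosis([1, 2, 3, 4, 5]))
```

Large absolute values of either statistic suggest switching to a transformation or to the non-parametric methods the abstract mentions.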

  15. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest-neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give a satisfactory result under controlled lighting conditions, but their performance under large illumination variation is not satisfactory, so the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that the proposed approach of applying PCA and LDA to the normalized dataset improves the performance significantly for face images with large illumination variations.
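The DCT-based compensation described above can be sketched in one dimension: take the log of the signal, zero a few low-frequency DCT coefficients (which carry the slowly varying illumination), and invert. The sketch below writes out the orthonormal DCT-II/DCT-III pair explicitly; the signal values are invented:

```python
import math

def dct(x):
    # Orthonormal DCT-II
    N = len(x)
    def a(k):
        return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
    return [a(k) * sum(x[n] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                       for n in range(N))
            for k in range(N)]

def idct(X):
    # Inverse transform (DCT-III), exact for the orthonormal pair above
    N = len(X)
    def a(k):
        return math.sqrt(1.0 / N) if k == 0 else math.sqrt(2.0 / N)
    return [sum(a(k) * X[k] * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for k in range(N))
            for n in range(N)]

def illumination_normalize(pixels, truncate=1):
    # Work in the log domain; discard the lowest-frequency coefficients,
    # which mostly encode the illumination component.
    logged = [math.log(p) for p in pixels]
    coeffs = dct(logged)
    for k in range(min(truncate, len(coeffs))):
        coeffs[k] = 0.0
    return [math.exp(v) for v in idct(coeffs)]
```

With `truncate=1` only the DC term is removed, which normalizes the signal to unit geometric mean; larger truncation also removes smooth illumination gradients, at the cost of some facial detail.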

  16. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  17. Temporal analysis of elite men’s discus throwing technique.

    Directory of Open Access Journals (Sweden)

    Vassilios Panoutsakopoulos

    2013-02-01

    Full Text Available The purpose of this study was to investigate the relationship between the duration of the throw and the official throwing distance in a group of elite male discus throwers. The time analysis of the technique phases (i.e. preparation, entry, flight, transition, delivery, release) of the participants in a top international athletics competition was used to conduct the study. Data were retrieved by recording seven right-handed throwers (age: 28.8 ± 4.1 years, body height: 1.94 ± 0.09 m, body mass: 119.4 ± 11.6 kg) with a Casio EX-FX1 (Casio Computer Co. Ltd.) digital video camera (sampling frequency: 300 fps) and analyzing the captured throws with the V1 Home 2.02.54 software (Interactive Frontiers Inc.). The relationships among the durations of the technique phases of the throw and the official throwing distance were examined with Pearson correlation analysis using the SPSS 10.0.1 software (SPSS Inc.). Results revealed that no significant correlation (p > 0.05) existed between the average official throwing distance (63.04 ± 6.09 m) and the duration of the discus throw or the duration of each technique phase. The temporal and correlation analyses were in agreement with previous studies. The dominant style of release was the release with no support on the ground. The majority of the throwers spent a larger percentage of the delivery turn (transition, delivery and release phases) in single than in double support. It was noted that a short duration of the transition phase, combined with lower values of the ratio of the time spent on the starting turn to the time spent on the delivery turn, might be favorable for achieving a larger throwing distance.
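The correlation analysis reported above is a plain Pearson correlation between each phase duration and the throwing distance; computed from scratch it looks like this (the data values are invented to demonstrate the computation only, not to reproduce the study's finding of no significant correlation):

```python
import statistics

# Pearson product-moment correlation coefficient, written out explicitly.
def pearson_r(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) *
           sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical delivery-phase durations (s) vs. throw distances (m):
durations = [0.21, 0.24, 0.20, 0.26, 0.23, 0.25, 0.22]
distances = [64.1, 60.2, 66.0, 57.9, 62.5, 59.3, 63.0]
print(pearson_r(durations, distances))
```

With only seven throwers, significance testing of r matters as much as its magnitude, which is why the study reports p-values alongside the correlations.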

  18. Analysis for the Progressive Failure Response of Textile Composite Fuselage Frames

    Science.gov (United States)

    Johnson, Eric R.; Boitnott, Richard L. (Technical Monitor)

    2002-01-01

    A part of aviation accident mitigation is a crashworthy airframe structure, and an important measure of merit for a crashworthy structure is the amount of kinetic energy that can be absorbed in the crush of the structure. Prediction of the energy absorbed from finite element analyses requires modeling the progressive failure sequence. Progressive failure modes may include material degradation, fracture and crack growth, and buckling and collapse. The design of crashworthy airframe components will benefit from progressive failure analyses that have been validated by tests. The subject of this research is the development of a progressive failure analysis for a textile composite, circumferential fuselage frame subjected to a quasi-static, crash-type load. The test data for the frame are reported, and these data are used to develop and to validate methods for the progressive failure response.

  19. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C. [and others

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  20. Comparison of gas chromatographic hyphenated techniques for mercury speciation analysis.

    Science.gov (United States)

    Nevado, J J Berzas; Martín-Doimeadios, R C Rodríguez; Krupp, E M; Bernardo, F J Guzmán; Fariñas, N Rodríguez; Moreno, M Jiménez; Wallace, D; Ropero, M J Patiño

    2011-07-15

    In this study, we evaluate advantages and disadvantages of three hyphenated techniques for mercury speciation analysis in different sample matrices using gas chromatography (GC) with mass spectrometry (GC-MS), inductively coupled plasma mass spectrometry (GC-ICP-MS) and pyrolysis atomic fluorescence (GC-pyro-AFS) detection. Aqueous ethylation with NaBEt(4) was required in all cases. All systems were validated with respect to precision, with repeatability and reproducibility TMAH). No statistically significant differences were found to the certified values (p=0.05). The suitability for water samples analysis with different organic matter and chloride contents was evaluated by recovery experiments in synthetic spiked waters. Absolute detection and quantification limits were in the range of 2-6 pg for GC-pyro-AFS, 1-4 pg for GC-MS, with 0.05-0.21 pg for GC-ICP-MS showing the best limits of detection for the three systems employed. However, all systems are sufficiently sensitive for mercury speciation in environmental samples, with GC-MS and GC-ICP-MS offering isotope analysis capabilities for the use of species-specific isotope dilution analysis, and GC-pyro-AFS being the most cost effective alternative. PMID:21641604

  1. Progress and challenges in the development and qualification of multi-level multi-physics coupled methodologies for reactor analysis

    International Nuclear Information System (INIS)

    Current trends in nuclear power generation and regulation, as well as the design of next-generation reactor concepts and continuing progress in computer technology, stimulate the development, qualification and application of multi-physics multi-scale coupled code systems. Efforts have been focused on extending analysis capabilities by coupling models that simulate different phenomena or system components, as well as on refining the scale and level of detail of the coupling. This paper reviews the progress made in this area and outlines the remaining challenges. The discussion is illustrated with examples based on neutronics/thermohydraulics coupling in reactor core modeling. In both fields, recent advances and developments are towards more physics-based high-fidelity simulations, which require the implementation of improved and flexible coupling methodologies. First, progress in coupling different physics codes is discussed, along with advances in multi-level techniques for coupled code simulations. Second, the issues related to the consistent qualification of coupled multi-physics and multi-scale code systems for design and safety evaluation are presented. The increased importance of uncertainty and sensitivity analysis is discussed, along with approaches to propagate uncertainty quantification between the codes. The upcoming OECD LWR Uncertainty Analysis in Modeling (UAM) benchmark, the first international activity to address this issue, is described in the paper. Finally, the remaining challenges with multi-physics coupling are outlined. (authors)
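The basic operator-splitting pattern behind neutronics/thermal-hydraulics coupling can be sketched as a Picard (fixed-point) iteration between two single-variable toy "solvers"; both feedback laws below are invented for illustration, whereas real coupled codes exchange full 3-D power and temperature fields:

```python
def neutronics(temperature):
    # Toy Doppler-like feedback: power decreases as fuel temperature rises.
    return 100.0 - 0.1 * temperature

def thermal_hydraulics(power):
    # Toy heat balance: temperature rises linearly with power.
    return 300.0 + 2.0 * power

def couple(t_init=300.0, tol=1e-10, max_iter=100):
    # Picard iteration: pass the latest field from one solver to the
    # other until the temperature update falls below the tolerance.
    t = t_init
    for _ in range(max_iter):
        p = neutronics(t)
        t_new = thermal_hydraulics(p)
        if abs(t_new - t) < tol:
            return p, t_new
        t = t_new
    raise RuntimeError("coupling did not converge")

power, temp = couple()
```

This converges here because the composed update is a contraction; stiffer feedback requires relaxation or Newton-based coupling, one of the methodological issues the paper discusses.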

  2. High-level power analysis and optimization techniques

    Science.gov (United States)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching
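The first-order model underlying most of the power estimation discussed in this thesis abstract is dynamic switching power, P = alpha * C * V^2 * f; a sketch with invented component values:

```python
# Dynamic switching power P = alpha * C * V^2 * f (watts), the standard
# first-order model used in RTL power estimation. Values below are
# illustrative, not taken from the thesis.
def switching_power(activity, capacitance, vdd, freq):
    return activity * capacitance * vdd ** 2 * freq

# A node with 1 pF switched capacitance at 1.8 V, 100 MHz, activity 0.5:
p = switching_power(0.5, 1e-12, 1.8, 1e8)  # ~1.62e-4 W
```

Resource sharing changes the switching `activity` seen at shared modules, which is why the sharing algorithms described above must estimate power during allocation rather than after it.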

  3. Assessment of progressive deformation on the basis of elastic analysis

    International Nuclear Information System (INIS)

    The behaviour of structures subjected to cyclic loading is complex. The structure may be in an elastic or plastic shakedown state or exhibit the ratchetting phenomenon. For reasons related to operation (functional play), geometric instability (buckling) and damage, it is important to estimate the maximum deformation reached by the structure when it stabilizes. A proposed solution to this problem is offered by the rule of the efficiency diagram, based on a set of experimental results, but in certain cases this method is impossible or difficult to apply. In this paper, we propose a general theoretical approach to the efficiency diagram, which allows us to extend its field of application to structures subjected to null primary loading. For this purpose, we demonstrate that, in certain cases, there is a coupling between primary and secondary loading. A new definition of primary stress, which reduces to the former definition in simple cases, is proposed. Finally, we apply this method to bi-tube structures and shells with a free level, under thermomechanical loading that generates secondary stresses liable to drive the progressive deformation mode.

  4. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator", developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program for analyzing nanoparticles and, at the same time, compares it to more conventional nanoparticle analysis techniques. The techniques we concentrate on here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information on the particles than the manual technique. However, particle shapes that are very different from spherical proved to be problematic for the novel program as well. When compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the averaged data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample. PMID:27030469
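The core step of automated particle counting and size analysis in a thresholded TEM image is connected-component labeling; a minimal stdlib sketch is below (illustrative only; tools like the one described must additionally split touching particles, e.g. by watershed segmentation, which is exactly the agglomerate problem the abstract mentions):

```python
from collections import deque

# Label 4-connected foreground components of a binary image and return
# their pixel areas (a proxy for particle size).
def label_particles(img):
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    sizes = []
    for y in range(h):
        for x in range(w):
            if img[y][x] and not seen[y][x]:
                queue, area = deque([(y, x)]), 0
                seen[y][x] = True
                while queue:  # breadth-first flood fill
                    cy, cx = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                sizes.append(area)
    return sizes

img = [[1, 1, 0, 0],
       [1, 0, 0, 1],
       [0, 0, 1, 1]]
print(label_particles(img))  # [3, 3] -> two particles of 3 pixels each
```

Pixel areas convert to equivalent diameters given the image's nm-per-pixel calibration, which is how per-particle size distributions are built from TEM data.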

  5. Demonstration Technology Application and Analysis on the Scientific and Technological Progress

    OpenAIRE

    Qingzhu Qi; Zhixiao Jiang

    2013-01-01

    This paper takes Tianjin as an example and analyzes the development trend of scientific and technological progress in Tianjin. From five aspects, namely ‘environment of scientific and technological progress’, ‘input of scientific and technological activities’, ‘output of scientific and technological activities’, ‘high-tech industrialization’, and ‘science and technology for economic and social development’, the paper analyzes the correlation between GDP and scientific and technological progress. Research...

  6. Analysis of genetic copy number changes in cervical disease progression

    International Nuclear Information System (INIS)

    Cervical dysplasia and tumorigenesis have been linked with numerous chromosomal aberrations. The goal of this study was to evaluate 35 genomic regions associated with cervical disease and to select those found to have the highest frequency of aberration for use as probes in fluorescent in-situ hybridization. The frequencies of gains and losses in these 35 regions were assessed using fluorescence in-situ hybridization on 30 paraffin-embedded cervical biopsy specimens. Based on this assessment, 6 candidate fluorescently labeled probes (8q24, Xp22, 20q13, 3p14, 3q26, CEP15) were selected for additional testing on a set of 106 cervical biopsy specimens diagnosed as Normal, CIN1, CIN2, CIN3, and SCC. The data were analyzed on the basis of signal mean, % change of signal mean between histological categories, and % positivity. The study revealed that the chromosomal regions with the highest frequency of copy number gains and the highest combined sensitivity and specificity in high-grade cervical disease were 8q24 and 3q26. The cytological application of these two probes was then evaluated on 118 ThinPrep™ samples diagnosed as Normal, ASCUS, LSIL, HSIL and Cancer to determine their utility as a tool for less invasive screening. Using gains of either 8q24 or 3q26 as a positivity criterion yielded a specificity (Normal+LSIL+ASCUS) of 81.0% and a sensitivity (HSIL+Cancer) of 92.3% based on a threshold of 4 positive cells. The application of a FISH assay comprising chromosomal probes 8q24 and 3q26 to cervical cytology specimens confirms the positive correlation between increasing dysplasia and copy gains and shows promise as a marker of cervical disease progression.
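The reported sensitivity and specificity follow from the standard confusion-matrix definitions; the sketch below uses invented counts chosen only to land near the abstract's 92.3%/81.0% figures, not the study's actual tallies:

```python
# Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP).
def sensitivity_specificity(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 13 high-grade (HSIL+Cancer) and 21 lower-grade samples
sens, spec = sensitivity_specificity(tp=12, fn=1, tn=17, fp=4)
print(round(sens, 3), round(spec, 3))  # 0.923 0.81
```

Raising the positive-cell threshold trades sensitivity for specificity, which is why the 4-cell criterion is stated alongside the figures.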

  7. Progressive-Ratio Schedules and Applied Behavior Analysis

    Science.gov (United States)

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  8. Methods and Techniques for miRNA Data Analysis.

    Science.gov (United States)

    Cristiano, Francesca; Veltri, Pierangelo

    2016-01-01

    Genomic data analysis consists of techniques to analyze and extract information from genes. In particular, genome sequencing technologies make it possible to characterize genomic profiles and identify biomarkers and mutations that can be relevant for diagnosis and the design of clinical therapies. Studies often concern the identification of genes related to inherited disorders, but recently mutations and phenotypes have been considered both in disease studies and in drug design, as well as in biomarker identification for early detection. Gene mutations are studied by comparing fold changes across highly redundant numeric and string representations of the analyzed genes, starting from macromolecules. This often involves studying thousands of repetitions of gene representations and signatures identified by the available biological instruments, which, starting from biological samples, generate arrays of data representing nucleotide sequences of known genes in an often not well-known order. High-performance platforms and optimized algorithms are required to manipulate the gigabytes of raw data generated by such instruments, for example NGS (Next-Generation Sequencing) and microarray platforms. Data analysis also requires the use of several tools and databases that store gene targets as well as gene ontologies and gene-disease associations. In this chapter we present an overview of available software platforms for genomic data analysis, as well as available databases with their query engines. PMID:26069024
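The fold-change comparison mentioned above is usually expressed on a log2 scale, so that up- and down-regulation are symmetric around zero; a minimal sketch (expression values invented):

```python
import math

# Log2 fold change between case and control expression values; a small
# pseudocount can guard against zero denominators.
def log2_fold_change(expr_case, expr_control, pseudocount=0.0):
    return math.log2((expr_case + pseudocount) / (expr_control + pseudocount))

print(log2_fold_change(8.0, 2.0))  # 2.0: a four-fold up-regulation
```

In practice this per-gene statistic is combined with a significance test across replicates before a gene or miRNA is called differentially expressed.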

  9. Analysis of Protein in Soybean by Neutron Activation Technique

    International Nuclear Information System (INIS)

    Nitrogen content in soybean was studied using the Neutron Activation Analysis technique with fast neutrons at a flux of 2.5 × 10^11 n/cm²·s in the CA-3 out-core irradiation tube of the Thai Research Reactor-1/Modification 1 (TRR-1/M1, Triga Mark 3 type). The 511 keV annihilation gamma rays from the positrons emitted by 13N, produced through the 14N(n, 2n)13N reaction, were measured with a semiconductor detector (HPGe) connected to a multi-channel analyzer (MCA) and a monitor to display the spectrum. NH4NO3 was used as the standard for the analysis. Interference from other radioisotopes, i.e. potassium and phosphorus, and from recoil-proton scattering in the soybean was corrected for. The 27 samples analyzed by neutron activation showed no significant difference in nitrogen content. The average nitrogen content of all the soybean samples is 7.02%, equivalent to a protein content of 43.88%
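The reported figures are mutually consistent under the generic Kjeldahl nitrogen-to-protein conversion factor of 6.25; the abstract does not state the factor it used, so this is an inference:

```python
# Nitrogen-to-protein conversion consistent with the figures in the abstract.
# The 6.25 factor is the conventional Kjeldahl value; assumed, not stated above.
nitrogen_pct = 7.02
conversion_factor = 6.25
protein_pct = nitrogen_pct * conversion_factor
print(f"{nitrogen_pct}% N -> {protein_pct:.2f}% protein")  # matches the reported 43.88%
```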

  10. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments

  11. Nuclear fuel lattice performance analysis by data mining techniques

    International Nuclear Information System (INIS)

    Highlights: • This paper shows a data mining application to analyse nuclear fuel lattice designs. • Data mining methods were used to predict whether fuel lattices could operate adequately in the BWR reactor core. • Data mining methods learned from fuel lattice datasets simulated with SIMULATE-3. • Results show high recognition percentages of adequate or inadequate fuel lattice performance. - Abstract: In this paper a data mining analysis of BWR nuclear fuel lattice performance is shown. In a typical three-dimensional simulation of reactor operation, the simulator gives the core performance for a fuel lattice configuration, measured by thermal limits, shutdown margin and produced energy. Based on these results we can determine the number of fulfilled parameters for a fuel lattice configuration. It is interesting to establish a relationship between the fuel lattice properties and the number of fulfilled core parameters in steady-state reactor operation, and data mining techniques were used for this purpose. Results indicate that these techniques are able to predict with sufficient accuracy (greater than 75%) whether a given fuel lattice configuration will have either “good” or “bad” performance according to the reactor core simulation. In this way, they could be coupled with an optimization process to discard fuel lattice configurations with poor performance and thus accelerate the optimization process. Data mining techniques apply filter methods to discard those variables with lower influence on the number of fulfilled core parameters. From this analysis, it was also possible to identify a set of variables to be used in new optimization codes with objective functions different from those normally used
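As a toy illustration of the kind of good/bad classifier such a data mining analysis might produce (the feature set, values and labels below are invented, and the paper does not specify its algorithm), a nearest-neighbour sketch:

```python
import numpy as np

# Hypothetical training set: each row is a fuel-lattice feature vector
# (average enrichment %, number of gadolinia rods, local peaking factor).
X = np.array([
    [4.2, 12, 1.25],
    [3.8, 10, 1.30],
    [4.9,  6, 1.55],
    [4.7,  8, 1.48],
])
y = np.array([1, 1, 0, 0])  # 1 = "good" performance in core simulation, 0 = "bad"

def predict_1nn(x):
    """Classify a lattice by its nearest neighbour in feature space."""
    d = np.linalg.norm(X - x, axis=1)
    return y[np.argmin(d)]

candidate = np.array([4.0, 11, 1.28])
print("good" if predict_1nn(candidate) else "bad")
```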

  12. Techniques of production and analysis of polarized synchrotron radiation

    International Nuclear Information System (INIS)

    The use of the unique polarization properties of synchrotron radiation in the hard x-ray spectral region (E > 3 keV) is becoming increasingly important to many synchrotron radiation researchers. The radiation emitted from bending magnets and conventional (planar) insertion devices (IDs) is highly linearly polarized in the plane of the particle's orbit. Elliptically polarized x-rays can also be obtained by going off axis on a bending magnet source, albeit with considerable loss of flux. The polarization properties of synchrotron radiation can be further tailored to the researcher's specific needs through the use of specialized insertion devices such as helical and crossed undulators and asymmetrical wigglers. Even with the possibility of producing a specific polarization, there is still a need to develop x-ray optical components that can manipulate the polarization for both analysis and further modification of the polarization state. A survey of techniques for producing and analyzing both linearly and circularly polarized x-rays is presented, with emphasis on those techniques which rely on single-crystal optical components

  13. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high-temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds, from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high-temperature systems, where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture, even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but above that size, uncertainties in rate-constant and thermodynamic data do not allow products of mixed hydrocarbon pyrolyses to be predicted a priori with a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions, with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  14. Image analysis technique applied to lock-exchange gravity currents

    Science.gov (United States)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
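The per-pixel calibration described above can be sketched as follows, with invented greyscale values and concentrations; a linear greyscale-to-concentration model per pixel is assumed here, whereas the actual calibration curve may be nonlinear:

```python
import numpy as np

# Per-pixel calibration sketch (hypothetical numbers): for each pixel, fit a
# linear map from greyscale value to dye concentration using images of the
# tank filled with known, uniform concentrations.
known_conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])  # g/l, uniform in the tank
# Calibration stack: one 2x2 greyscale image per known concentration.
stack = np.array([[[200, 198], [201, 199]],
                  [[170, 169], [171, 168]],
                  [[140, 141], [139, 142]],
                  [[110, 112], [111, 109]],
                  [[ 80,  82], [ 81,  79]]], dtype=float)

# Fit conc = a*grey + b independently for every pixel.
ny, nx = stack.shape[1:]
a = np.empty((ny, nx))
b = np.empty((ny, nx))
for i in range(ny):
    for j in range(nx):
        a[i, j], b[i, j] = np.polyfit(stack[:, i, j], known_conc, 1)

frame = np.array([[155, 154], [156, 153]], dtype=float)  # image during the run
density = a * frame + b                                  # instantaneous density field
print(density.round(2))
```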

  15. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0nbb-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  16. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.
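The sequence of cuts listed in both records above can be illustrated as boolean masks over a toy event list; all distributions and thresholds below are invented and are not the DEMONSTRATOR's actual cut values:

```python
import numpy as np

# Toy event list; fields mimic the cut categories named in the abstract.
# All numbers are invented for illustration only.
rng = np.random.default_rng(0)
n = 1000
energy = rng.uniform(1900, 2150, n)          # keV
a_over_e = rng.normal(1.0, 0.1, n)           # pulse-shape parameter (A/E-like)
multiplicity = rng.integers(1, 3, n)         # detectors hit (coincidence cut)

events = np.ones(n, dtype=bool)
events &= (a_over_e > 0.8)                   # pulse-shape cut: keep single-site events
events &= (multiplicity == 1)                # reject multi-detector coincidences
roi = events & (np.abs(energy - 2039) < 2)   # 4 keV region of interest at the Q-value
print(f"{events.sum()} events survive cuts; {roi.sum()} in ROI")
```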

  17. Expert rowers’ motion analysis for synthesis and technique digitalization

    Directory of Open Access Journals (Sweden)

    Filippeschi Alessandro

    2011-12-01

    Full Text Available Four expert rowers’ gestures were gathered on the SPRINT rowing platform with the aid of an optical motion tracking system. The data were analyzed to obtain a digital representation of the features involved in rowing; they also provide a dataset for developing digital models for rowing motion synthesis. Rowers were modeled as kinematic chains, and the data were processed to obtain the position and orientation of the upper-body limbs. This representation was combined with SPRINT data to evaluate features found in the literature, to find new ones, and to build models for the generation of rowing motion. The analysis shows the effectiveness of the motion reconstruction and two examples of technique features: stroke timing and upper-limb orientation during the finish phase.

  18. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers seeking to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this openness increases security risks: a user may download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze existing threats and security weaknesses. We then identify various exploit mitigation techniques for known vulnerabilities. This detailed analysis helps identify existing loopholes and gives strategic direction for making the Android operating system more secure.

  19. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched for at once and the algorithm is fast. This is achieved by using compressed sensing techniques, modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect, but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analyses, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
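The periodogram-style search that the proposed tool resembles can be sketched as a least-squares sinusoid fit at each trial period; this is a plain periodogram on synthetic data, not the compressed-sensing algorithm of the paper:

```python
import numpy as np

# Least-squares periodogram on synthetic, unevenly sampled "radial velocity"
# data containing one sinusoidal (single-planet) signal plus noise.
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 200, 80))                    # days, uneven sampling
true_period = 17.0
rv = 12.0 * np.sin(2 * np.pi * t / true_period) + rng.normal(0, 1.0, 80)

periods = np.linspace(5, 50, 2000)
power = np.empty_like(periods)
for k, p in enumerate(periods):
    # Fit rv ≈ a*sin(wt) + b*cos(wt); explained variance is the power at p.
    w = 2 * np.pi / p
    A = np.column_stack([np.sin(w * t), np.cos(w * t)])
    coef, *_ = np.linalg.lstsq(A, rv, rcond=None)
    power[k] = np.sum((A @ coef) ** 2)

best = periods[np.argmax(power)]
print(f"best period: {best:.2f} d")
```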

  20. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  1. PROGRESS IN SIFT-MS: BREATH ANALYSIS AND OTHER APPLICATIONS

    Czech Academy of Sciences Publication Activity Database

    Španěl, Patrik; Smith, D.

    2011-01-01

    Roč. 30, č. 2 (2011), s. 236-267. ISSN 0277-7037 R&D Projects: GA MPO FT-TA4/124; GA ČR GA202/09/0800; GA ČR GA203/09/0256 Institutional research plan: CEZ:AV0Z40400503 Keywords : SIFT-MS * breath analysis * ion flow tube Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 10.461, year: 2011

  2. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. 
If inferences are to be made concerning food texture from acoustical measures of mastication

  3. 1985. Annual progress report

    International Nuclear Information System (INIS)

    This annual progress report of the CEA Protection and Nuclear Safety Institut describes the progress made in each section of the Institut. Research activities of the different departments include: reactor safety analysis; fuel cycle facilities analysis and associated safety research programs (criticality, sites, transport ...); radioecology and environmental radioprotection techniques; data acquisition on radioactive waste storage sites; radiation effects on man; studies on radioprotection techniques; nuclear material security, including security of facilities, security of nuclear material transport, and monitoring of nuclear material management; nuclear facility decommissioning; and finally public information

  4. Impact of advanced microstructural characterization techniques on modeling and analysis of radiation damage

    International Nuclear Information System (INIS)

    The evolution of radiation-induced alterations of dimensional and mechanical properties has been shown to be a direct and often predictable consequence of radiation-induced microstructural changes. Recent advances in understanding of the nature and role of each microstructural component in determining the property of interest has led to a reappraisal of the type and priority of data needed for further model development. This paper presents an overview of the types of modeling and analysis activities in progress, the insights that prompted these activities, and specific examples of successful and ongoing efforts. A review is presented of some problem areas that in the authors' opinion are not yet receiving sufficient attention and which may benefit from the application of advanced techniques of microstructural characterization. Guidelines based on experience gained in previous studies are also provided for acquisition of data in a form most applicable to modeling needs

  5. Progress in identifying a human ionizing-radiation repair gene using DNA-mediated gene transfer techniques

    International Nuclear Information System (INIS)

    The authors employed DNA-mediated gene transfer techniques to introduce human DNA into a DNA double-strand break (DSB) repair-deficient Chinese hamster ovary (CHO) cell mutant (xrs-6), which is hypersensitive to both X-rays (D0 = 0.39 Gy) and the antibiotic bleomycin (D0 = 0.01 μg/ml). High-molecular-weight DNA isolated from cultured human skin fibroblasts was partially digested with the restriction enzyme Sau 3A to average sizes of 20 or 40 kb, ligated with plasmid pSV2-gpt DNA, and transfected into xrs-6 cells. Colonies that developed under a bleomycin and MAX (mycophenolic acid/adenine/xanthine) double-selection procedure were isolated and further tested for X-ray sensitivity and DSB rejoining capacity. To date, a total of six X-ray- or bleomycin-resistant transformants have been isolated. All express rejoining capacity for X-ray-induced DSBs, similar to the rate observed for DSB repair in CHO wild-type cells. DNA isolated from these primary transformants contains various copy numbers of pSV2-gpt DNA and also contains human DNA sequences, as determined by Southern blot hybridization. Recently, a secondary transformant was isolated using DNA from one of the primary transformants; its cellular and molecular characterization is in progress. DNA from a genuine secondary transformant will be used in the construction of a DNA library to isolate human genomic DNA encoding this radiation repair gene

  6. ENEA initiatives in Southern Italy: Progress report, analysis, prospects

    International Nuclear Information System (INIS)

    In the past, technological development in Italy was concentrated in the country's heavily industrialized northern regions. The motive for this choice was the conception that to be successful in a highly competitive market, research investment had necessarily to favour those developed areas with an already proven capacity for guaranteed fast and high returns. Unfortunately this policy has created a technologically and economically depressed area, known as Mezzogiorno, in southern Italy. Within the framework of new national energy and economic policies calling for balanced economic and technological development, ENEA (Italian Commission for New Technologies, Energy and the Environment) has been entrusted with the planning and managing of research, commercialization and technology transfer programs designed to stimulate high-technology industrial activity in Italy's southern regions so as to allow them to become more competitive in the upcoming European free trade market. Small business concerns shall be favoured in this new development scheme, which shall respect the existing local socio-economic framework. Emphasis shall be placed on privileging such elements as quality, flexibility and versatility, as opposed to low-cost mass production. Priority is to be given to the development of renewable energy sources, energy conservation techniques and environmentally compatible technologies

  7. Progress on Radiochemical Analysis for Nuclear Waste Management in Decommissioning

    DEFF Research Database (Denmark)

    Hou, Xiaolin; Qiao, Jixin; Shi, Keliang;

    With the increased number of nuclear facilities that have been closed and are being, or are going to be, decommissioned, the nuclear waste produced must be characterised for its treatment by identifying the radionuclides and determining them quantitatively. Of the radionuclides related...... separation of radionuclides. In order to improve and maintain the Nordic competence in the analysis of radionuclides in waste samples, an NKS-B project on this topic was launched in 2009. During the first phase of the NKS-B RadWaste project (2009-2010), a good achievement has been reached on establishment

  8. Progress Toward Efficient Laminar Flow Analysis and Design

    Science.gov (United States)

    Campbell, Richard L.; Campbell, Matthew L.; Streit, Thomas

    2011-01-01

    A multi-fidelity system of computer codes for the analysis and design of vehicles having extensive areas of laminar flow is under development at the NASA Langley Research Center. The overall approach consists of the loose coupling of a flow solver, a transition prediction method and a design module using shell scripts, along with interface modules to prepare the input for each method. This approach allows the user to select the flow solver and transition prediction module, as well as run mode for each code, based on the fidelity most compatible with the problem and available resources. The design module can be any method that designs to a specified target pressure distribution. In addition to the interface modules, two new components have been developed: 1) an efficient, empirical transition prediction module (MATTC) that provides n-factor growth distributions without requiring boundary layer information; and 2) an automated target pressure generation code (ATPG) that develops a target pressure distribution that meets a variety of flow and geometry constraints. The ATPG code also includes empirical estimates of several drag components to allow the optimization of the target pressure distribution. The current system has been developed for the design of subsonic and transonic airfoils and wings, but may be extendable to other speed ranges and components. Several analysis and design examples are included to demonstrate the current capabilities of the system.

  9. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique applicable to deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, cokriging, which is an extension of kriging, is proposed to calculate structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
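A minimal kriging-style interpolation, of the kind cokriging extends, can be sketched with a Gaussian correlation model; the response function, design points and correlation parameter below are all invented, and a full cokriging system would additionally include gradient observations in the same linear system:

```python
import numpy as np

# Simple-kriging sketch with a Gaussian correlation model (zero-mean form).
def corr(x1, x2, theta=0.5):
    return np.exp(-theta * (x1[:, None] - x2[None, :]) ** 2)

def g(x):
    return x * np.sin(x)                      # stand-in "limit state" response

xs = np.array([0.0, 1.5, 3.0, 4.5, 6.0])      # design points
ys = g(xs)

R = corr(xs, xs) + 1e-10 * np.eye(len(xs))    # jitter for numerical stability
weights = np.linalg.solve(R, ys)              # kriging weights

def predict(x):
    """Interpolate the response at new points; exact at the design points."""
    return corr(np.atleast_1d(np.asarray(x, dtype=float)), xs) @ weights

print(f"g(2.0) = {g(2.0):.3f}, kriging estimate = {predict(2.0)[0]:.3f}")
```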

  10. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Preserving the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining-lifetime management from the very beginning of plant operation. The methodology used in plant remaining-lifetime management is essentially based on standard inspections, surveillance and monitoring programs, and calculations such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized so that existing defects can be detected and dimensioned with the highest possible degree of accuracy. The information obtained during inspection is combined with the historical data of the components (design, quality, operation, maintenance and transients) and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest possible degree of accuracy. This methodology allows component repairs and replacements to be reduced or avoided and increases the safety and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period

  11. New laser spectroscopic technique for stable-isotope ratio analysis

    International Nuclear Information System (INIS)

    Reliable and safe application of isotopes as tracers is important in many areas, including biomedical, environmental and geochronological sciences. A new approach to stable-isotope ratio analysis based on atomic hyperfine structure is demonstrated. This laser spectroscopic scheme is virtually interference-free because of the highly selective and specific nature of hyperfine structures. Hence, a minor constituent in a complex matrix can be selectively analyzed without extensive sample preparation. A single-frequency tunable cw ring dye laser is used as the excitation source and a specially designed and constructed demountable cathode discharge is used as the atomizer and detector. Samples are electrodeposited on the demountable cathode and hyperfine profiles are collected by optogalvanic detection. By spectral deconvolution, the relative abundances of all isotopes present can be determined with good accuracy and precision. The technique is demonstrated for copper concentrations as low as 1.6 ppm, using the atomic hyperfine structure of CuI 578.2 nm non-resonance transition. It is also successfully tested for analysis of copper isotopes in human blood

  12. Automated target recognition technique for image segmentation and scene analysis

    Science.gov (United States)

    Baumgart, Chris W.; Ciarcia, Christopher A.

    1994-03-01

    Automated target recognition (ATR) software has been designed to perform image segmentation and scene analysis. Specifically, this software was developed as a package for the Army's Minefield and Reconnaissance and Detector (MIRADOR) program. MIRADOR is an on/off-road, remote-control, multisensor system designed to detect buried and surface-emplaced metallic and nonmetallic antitank mines. The basic requirements for this ATR software were the following: (1) an ability to separate target objects from the background in low signal-to-noise conditions; (2) an ability to handle a relatively high dynamic range in imaging light levels; (3) the ability to compensate for or remove light-source effects such as shadows; and (4) the ability to identify target objects as mines. The image segmentation and target evaluation were performed using an integrated and parallel processing approach. Three basic techniques (texture analysis, edge enhancement, and contrast enhancement) were used collectively to extract all potential mine target shapes from the basic image. Target evaluation was then performed using a combination of size, geometrical, and fractal characteristics, which resulted in a calculated probability for each target shape. Overall results with this algorithm were quite good, though there is a tradeoff between detection confidence and the number of false alarms. This technology also has applications in the areas of hazardous waste site remediation, archaeology, and law enforcement.
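The edge-enhancement step, one of the three techniques named in the abstract, can be sketched with a Sobel operator on a toy image; the algorithmic details of the actual ATR package are not given in the abstract, so this is a generic illustration:

```python
import numpy as np

# Sobel edge enhancement on a toy 8x8 image containing a bright square "target".
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0

kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
ky = kx.T

def conv2(im, k):
    """Same-size 2-D correlation with zero padding."""
    out = np.zeros_like(im)
    pad = np.pad(im, 1)
    for i in range(im.shape[0]):
        for j in range(im.shape[1]):
            out[i, j] = np.sum(pad[i:i + 3, j:j + 3] * k)
    return out

# Gradient magnitude is large on the square's boundary, zero in flat regions.
edges = np.hypot(conv2(img, kx), conv2(img, ky))
print((edges > 1).astype(int))
```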

  13. Quick analysis techniques for assay of radioactivity in urine

    International Nuclear Information System (INIS)

    The need for bioassay has recently increased in radiation protection management at nuclear power plants. In the practical application of bioassay, it is desirable to simplify pre-treatment procedures and to eliminate chemical separation steps before radiation measurement with a low-background liquid scintillation system. This paper presents results on the accumulation of background data on radioactivity in urine and an assessment of the availability of a quick analysis method for urine bioassay. The major results obtained are as follows: (1) The background concentration of 3H in human urine, which varies both in time and between persons, is 50 - 240 pCi/l and equivalent to that in natural water. (2) A small quantity (2 ml) of untreated urine is sufficient to determine whether the 3H concentration exceeds the screening level in monitoring of internal radiation exposure. (3) The average concentrations of 40K and 137Cs in human urine measured with a Ge detector are 2500 and 8 pCi/l, respectively, and the ratio of 137Cs to K is 3.14 pCi/g, which is applicable to the determination of abnormal intake of 137Cs. (4) A simplified bioassay method using the quick analysis technique for untreated urine is proposed for monitoring internal radiation exposure at nuclear power plants. (author)

  14. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.
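    The train-on-generic-features idea can be sketched in a few lines; the synthetic "microscopy" images, the three features and the nearest-centroid classifier below are simplified stand-ins for the much richer feature banks and classifiers used by real pattern recognition tools:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_image(cls):
    """Synthetic stand-ins: class 0 = smooth gradient, class 1 = speckled."""
    base = np.linspace(0, 1, 32)[None, :] * np.ones((32, 32))
    return base + rng.normal(0, 0.4 if cls else 0.02, (32, 32))

def features(im):
    """Generic, task-agnostic image features (no hand-tuned segmentation)."""
    gy, gx = np.gradient(im)
    return np.array([im.std(),
                     np.hypot(gx, gy).mean(),
                     np.abs(np.diff(im, axis=0)).mean()])

# Train: z-score the features and compute one centroid per class
X = np.array([features(make_image(c)) for c in [0, 1] * 20])
y = np.array([0, 1] * 20)
mu, sd = X.mean(0), X.std(0)
Z = (X - mu) / sd
centroids = np.array([Z[y == c].mean(0) for c in (0, 1)])

# Classify unseen images by nearest centroid in feature space
test = np.array([features(make_image(c)) for c in [0, 1] * 10])
zt = (test - mu) / sd
pred = np.argmin(((zt[:, None, :] - centroids[None]) ** 2).sum(-1), axis=1)
acc = (pred == np.array([0, 1] * 10)).mean()
print(acc)
```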

  15. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    Science.gov (United States)

    Lungaroni, M.; Murari, A.; Peluso, E.; Gelfusa, M.; Malizia, A.; Vega, J.; Talebzadeh, S.; Gaudio, P.

    2016-04-01

    In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Basic data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression (SVR) is a method devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a filtering method is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.
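    A minimal example of SVR used as a filter, in the spirit of the comparison described; the test signal, noise level and hyperparameters are illustrative, not those of the paper:

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 200)
clean = np.sin(x)
noisy = clean + rng.normal(0, 0.2, x.size)

# SVR as a non-parametric filter: fit the noisy samples, predict a smooth curve.
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=0.5)
svr.fit(x[:, None], noisy)
filtered = svr.predict(x[:, None])

rmse_raw = np.sqrt(np.mean((noisy - clean) ** 2))
rmse_svr = np.sqrt(np.mean((filtered - clean) ** 2))
print(rmse_raw, rmse_svr)  # the SVR estimate should track the clean signal better
```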

  16. Mass spectrometry based imaging techniques for spatially resolved analysis of molecules

    Directory of Open Access Journals (Sweden)

    Andrea Matros

    2013-04-01

    Full Text Available Higher plants are composed of a multitude of tissues with specific functions, reflected by distinct profiles for transcripts, proteins and metabolites. Comprehensive analysis of metabolites and proteins has advanced tremendously in recent years, and this progress has been driven by the rapid development of sophisticated mass spectrometric techniques. In most current omics studies, analysis is performed on whole-organ or whole-plant extracts, leading to the loss of spatial information. Mass spectrometry based imaging (MSI) techniques have opened a new avenue to obtain information on the spatial distribution of metabolites and proteins. Pioneered in the field of medicine, the approaches are now applied to study the spatial profiles of molecules in plant systems. A range of different plant organs and tissues have been successfully analyzed by MSI, and patterns of various classes of metabolites from primary and secondary metabolism could be obtained. It can be envisaged that MSI approaches will substantially contribute to building spatially resolved biochemical networks.

  17. Evaluation of Progressive Failure Analysis and Modeling of Impact Damage in Composite Pressure Vessels

    Science.gov (United States)

    Sanchez, Christopher M.

    2011-01-01

    NASA White Sands Test Facility (WSTF) is leading an evaluation effort in advanced destructive and nondestructive testing of composite pressure vessels and structures. WSTF is using progressive finite element analysis methods for test design and for confirmation of composite pressure vessel performance. Using composite finite element analysis models and failure theories tested in the World-Wide Failure Exercise, WSTF is able to estimate the static strength of composite pressure vessels. Additionally, test and evaluation of impact-damaged composites is in progress so that models can be developed to estimate damage tolerance and the degradation in static strength.

  18. A dynamic mechanical analysis technique for porous media

    Science.gov (United States)

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied, and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate it by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 to 14 Hz, with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case and nearly the same at frequency, with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high-quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
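    The quantities a DMA measurement targets can be illustrated with a toy viscoelastic example: demodulating synthetic stress and strain records at the drive frequency yields the complex modulus magnitude and the loss angle. This is only the classical analytical picture that the paper moves beyond, not its FE poroelastic inversion; all amplitudes and the loss angle are invented:

```python
import numpy as np

# Synthetic DMA record: sinusoidal strain and a phase-lagged stress response.
f, fs = 5.0, 1000.0                  # drive frequency (Hz), sampling rate (Hz)
t = np.arange(2000) / fs             # exactly 10 drive cycles
e0, s0, delta = 0.01, 50.0, 0.3      # strain amp, stress amp (kPa), loss angle (rad)
strain = e0 * np.sin(2 * np.pi * f * t)
stress = s0 * np.sin(2 * np.pi * f * t + delta)

# Complex demodulation at the drive frequency recovers amplitude and phase
ref = np.exp(-2j * np.pi * f * t)
S, E = (stress * ref).sum(), (strain * ref).sum()
Estar = abs(S) / abs(E)              # |E*| = stress/strain amplitude ratio
d_est = np.angle(S / E)              # loss angle between stress and strain
E_store = Estar * np.cos(d_est)      # storage modulus
E_loss = Estar * np.sin(d_est)       # loss modulus
print(Estar, d_est)                  # recovers 5000.0 (kPa) and 0.3 exactly
```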

  19. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    A simple approach is described for incorporating the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e. changing the distribution type of the input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used to convert cost inputs into the statistical parameters of assumed probability distributions. It was found that Monte Carlo simulation by a Latin Hypercube Sampling technique and subsequent sensitivity analyses were useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. Changing the distribution types of the input parameters showed that the values calculated by the deterministic method lay around the 40th-50th percentile of the output distribution function calculated by probabilistic simulation. Assuming lognormal distributions of the inputs, however, the values calculated by the deterministic method lay around the 85th percentile of the output distribution function. The sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. The discount rate also contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in application to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty
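    The core of the probabilistic approach, three-estimate inputs mapped to triangular distributions and propagated by Latin Hypercube sampling, can be sketched as follows. The cost figures are invented for illustration; with right-skewed inputs the deterministic (best-estimate) total lands below the output median, echoing the percentile effect reported:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 2000

def lhs(n):
    """One-dimensional Latin Hypercube sample of U(0,1): one point per stratum."""
    return rng.permutation((rng.random(n) + np.arange(n)) / n)

def tri_ppf(u, lo, mode, hi):
    """Inverse CDF of a triangular distribution built from a three-estimate input."""
    fc = (mode - lo) / (hi - lo)
    return np.where(u < fc,
                    lo + np.sqrt(u * (hi - lo) * (mode - lo)),
                    hi - np.sqrt((1 - u) * (hi - lo) * (hi - mode)))

# Illustrative (not sourced) three-estimate cost inputs: (low, best, high)
components = {"U purchase": (1.6, 2.0, 3.0),
              "conversion": (0.16, 0.2, 0.3),
              "enrichment": (0.8, 1.0, 1.5),
              "fabrication": (0.24, 0.3, 0.45)}

total = sum(tri_ppf(lhs(N), *est) for est in components.values())
deterministic = sum(best for _, best, _ in components.values())
percentile = (total < deterministic).mean() * 100
print(deterministic, round(percentile, 1))
```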

  20. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (~0.1 ns), we must use the white-light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.
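    The basic fringe-to-velocity conversion behind any VISAR map is simple arithmetic; a sketch with assumed values (probe wavelength 532 nm, interferometer delay 0.1 ns as quoted in the text, etalon dispersion correction neglected) might be:

```python
import numpy as np

# Illustrative constants, not the experiment's actual values
lam = 532e-9           # probe wavelength (m)
tau = 0.1e-9           # interferometer delay (s), ~0.1 ns as in the abstract
vpf = lam / (2 * tau)  # velocity per fringe, neglecting the (1 + delta) correction

# Synthetic 2d fringe-shift map (in fringes): a smooth velocity bump
y, x = np.mgrid[-1:1:64j, -1:1:64j]
fringe_shift = 1.5 * np.exp(-(x**2 + y**2) / 0.2)

velocity_map = vpf * fringe_shift   # m/s at every pixel of the snapshot
print(vpf)                          # 2660.0 m/s per fringe
```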

  1. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: A technique that helps management to reduce costs and improve quality is ‘lean supply chain management’, which focuses on the elimination of all wastes in every stage of the supply chain and is derived from ‘agile production’. This research aims to assess and rank the suppliers in an auto industry, based upon the concept of ‘production leanness’. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We have examined the literature about leanness, and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA technique.

    AFRIKAANSE OPSOMMING: Lean supply chain management is a technique that enables management to reduce costs and improve quality. It focuses on reducing waste at every stage of the supply chain and is derived from agile production. This research aims to assess suppliers in an auto industry on the basis of the concept of production leanness. The research focuses on suppliers of a company called Touse-Omron Naein. A literature study of leanness led to the classification of its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the PCA technique.
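    A compact sketch of PCA-based ranking, using invented questionnaire scores rather than the paper's ten-dimension, 76-factor data: suppliers are scored on a few criteria, the data are standardized, and the first principal component provides a composite leanness score to rank by.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative leanness scores: 6 suppliers x 5 criteria (questionnaire means).
# Rows are constructed so supplier 5 is leanest and supplier 0 the least lean.
X = np.clip(np.arange(6)[:, None] * 0.8 + rng.normal(0, 0.2, (6, 5)) + 1.0, 0, None)

# Standardize, then take the first principal component
Z = (X - X.mean(0)) / X.std(0)
eigval, eigvec = np.linalg.eigh(np.cov(Z.T))
pc1 = eigvec[:, -1]                      # eigenvector of the largest eigenvalue
scores = Z @ pc1
if np.corrcoef(scores, Z.sum(1))[0, 1] < 0:
    scores = -scores                     # fix the arbitrary sign of the component
ranking = np.argsort(-scores)            # leanest supplier first
print(ranking)
```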

  2. Accident progression event tree analysis for postulated severe accidents at N Reactor

    International Nuclear Information System (INIS)

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be considered explicitly. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied

  3. Progression Analysis and Stage Discovery in Continuous Physiological Processes Using Image Computing

    Directory of Open Access Journals (Sweden)

    Ferrucci Luigi

    2010-01-01

    Full Text Available We propose an image computing-based method for quantitative analysis of continuous physiological processes that can be sensed by medical imaging and demonstrate its application to the analysis of morphological alterations of the bone structure, which correlate with the progression of osteoarthritis (OA. The purpose of the analysis is to quantitatively estimate OA progression in a fashion that can assist in understanding the pathophysiology of the disease. Ultimately, the texture analysis will be able to provide an alternative OA scoring method, which can potentially reflect the progression of the disease in a more direct fashion compared to the existing clinically utilized classification schemes based on radiology. This method can be useful not just for studying the nature of OA, but also for developing and testing the effect of drugs and treatments. While in this paper we demonstrate the application of the method to osteoarthritis, its generality makes it suitable for the analysis of other progressive clinical conditions that can be diagnosed and prognosed by using medical imaging.

  4. Glaucoma Progression Detection by Retinal Nerve Fiber Layer Measurement Using Scanning Laser Polarimetry: Event and Trend Analysis

    OpenAIRE

    Moon, Byung Gil; Sung, Kyung Rim; Cho, Jung Woo; Kang, Sung Yong; Yun, Sung-Cheol; Na, Jung Hwa; Lee, Youngrok; Kook, Michael S.

    2012-01-01

    Purpose: To evaluate the use of scanning laser polarimetry (SLP, GDx VCC) to measure the retinal nerve fiber layer (RNFL) thickness in order to evaluate the progression of glaucoma. Methods: Test-retest measurement variability was determined in 47 glaucomatous eyes. One eye each from 152 glaucomatous patients with at least 4 years of follow-up was enrolled. Visual field (VF) loss progression was determined by both event analysis (EA, Humphrey guided progression analysis) and trend analysis (TA,...

  5. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    International Nuclear Information System (INIS)

    This final report summarizes the accomplishments of a two-year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants". The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed

  6. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants". The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  7. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants". The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  8. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium-level preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  9. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium-level preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two large earrings

  10. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente;

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10^8 Pa using particle diameters of 1.7 μm. This increases the efficiency, the resolution and the speed of the separation. Four aqueous selenium standards were separated within 1.2 min on a 1.00 id x 50 mm reversed phase column in an ion-pair chromatographic system using a flow rate of 200 μL min(-1). Hence, analysis times could be reduced to 1/10 compared with ordinary HPLC for aqueous standards. The precision and detection limits were comparable to values obtained by HPLC. Detection limits were better than 0.4 μg Se L(-1). A urine sample was analysed on a 1.0 id x 100 mm column within 5 min using a flow rate of 100 μL min(-1). The improved separation efficiency, owing to the use of 1.7 μm column particles, allowed the

  11. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Two brazing alloy samples (C P2 and C P3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyperpure germanium detection system was used for gamma-ray acquisition. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values.
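    The relative (comparator) flavor of NAA reduces to simple decay-corrected activity ratios, since the sample and a standard of known content are irradiated together and the flux, cross-section and detector-efficiency terms cancel. All peak areas, delays and masses below are hypothetical, though the Cu-66 half-life is real:

```python
import math

half_life_s = 5.12 * 60                  # Cu-66 half-life, ~5.12 min
lam = math.log(2) / half_life_s

def decay_corrected(peak_area, t_decay_s):
    """Correct a gamma-peak area back to the end of irradiation."""
    return peak_area / math.exp(-lam * t_decay_s)

A_sample = decay_corrected(12500, 120)   # hypothetical peak area, 120 s delay
A_standard = decay_corrected(20400, 60)  # hypothetical peak area, 60 s delay

w_standard_ug = 50.0                     # micrograms of Cu in the standard
m_sample_g = 0.210                       # mass of the brazing-alloy sample

w_sample_ug = w_standard_ug * A_sample / A_standard
conc_ppm = w_sample_ug / m_sample_g      # 1 ug/g = 1 ppm by weight
print(round(conc_ppm, 1))
```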

  12. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag+, Ba2+, and Cd2+ in the concentration range from 10 ng/g to 1 μg/g; for Cu2+ and Pb2+ from 10 ng/g to 5 μg/g; and for Hg2+ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag+, Ba2+, Cd2+, Cu2+, and Pb2+ share common binding sites, with binding efficiencies varying in the sequence Pb2+>Cu2+>Ag+>Cd2+>Ba2+. The binding of Hg2+ involved a different binding site, with an increase in binding efficiency in the presence of Ag+. (orig.)

  13. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the CDFM and FA procedures was performed
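    The margin arithmetic behind such reviews can be illustrated with the standard HCLPF (High Confidence of Low Probability of Failure) expression used in fragility analysis; the median capacity, beta values and review-level earthquake below are assumed for illustration, not taken from the study:

```python
import math

# Fragility-analysis convention: capacity is lognormal with median Am and
# logarithmic standard deviations beta_R (randomness) and beta_U (uncertainty).
Am = 1.2                  # median ground-acceleration capacity, in g (assumed)
beta_R, beta_U = 0.25, 0.35

# HCLPF capacity, commonly Am * exp(-1.65 * (beta_R + beta_U))
hclpf = Am * math.exp(-1.65 * (beta_R + beta_U))

# Seismic margin relative to an assumed 0.3 g review-level earthquake
margin = hclpf / 0.3
print(round(hclpf, 3), round(margin, 2))   # 0.446 g capacity, margin ~1.49
```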

  14. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, Seismic Probabilistic Risk Assessment (SPRA) and Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the CDFM and FA procedures was performed.

  15. Comparative Analysis of Different LIDAR System Calibration Techniques

    Science.gov (United States)

    Miller, M.; Habib, A.

    2016-06-01

    With light detection and ranging (LiDAR) now being a crucial tool for engineering products and on-the-fly spatial analysis, it is necessary for the user community to have standardized calibration methods. The three methods in this study were developed and proven by the Digital Photogrammetry Research Group (DPRG) for airborne LiDAR systems and are as follows: Simplified, Quasi-Rigorous, and Rigorous. In lieu of using expensive control surfaces for calibration, these methods compare overlapping LiDAR strips to estimate the systematic errors. The systematic errors quantified by these methods include the lever arm biases, boresight biases, range bias and scan angle scale bias. These three methods comprehensively represent all of the possible flight configurations and data availability, and this paper tests the limits of the method with the most assumptions, the simplified calibration, by using data that violates the assumptions its math model is based on and comparing the results to the quasi-rigorous and rigorous techniques. The overarching goal is to provide a LiDAR system calibration that does not require raw measurements and can be carried out with minimal control and flight lines to reduce costs. This testing is unique because the terrain used for calibration does not contain gable roofs; all other LiDAR system calibration testing and development has been done with terrain containing features of high geometric integrity, such as gable roofs.

  16. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: (1) under normal printing conditions and (2) on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency in the thickness direction as well. The correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. On smash, however, the recovery of a blanket and its consistency was a function of manufacturing and torque level. This study attempts to provide new metrics for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  17. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: (1) under normal printing conditions and (2) on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency in the thickness direction as well. The correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. On smash, however, the recovery of a blanket and its consistency was a function of manufacturing and torque level. This study attempts to provide new metrics for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.

  18. An analysis of spectral transformation techniques on graphs

    Science.gov (United States)

    Djurović, Igor; Sejdić, Ervin; Bulatović, Nikola; Simeunović, Marko

    2015-05-01

    Emerging methods for the spectral analysis of graphs are analyzed in this paper, as graphs are currently used to study interactions in many fields, from neuroscience to social networks. There are two main approaches to the spectral transformation of graphs. The first approach is based on the Laplacian matrix. The graph Fourier transform is defined as an expansion of a graph signal in terms of eigenfunctions of the graph Laplacian, and the calculated eigenvalues carry the notion of frequency of graph signals. The second approach is based on the graph's weighted adjacency matrix, as it expands the graph signal into a basis of eigenvectors of the adjacency matrix instead of the graph Laplacian. Here, the notion of frequency is obtained from the eigenvalues of the adjacency matrix or its Jordan decomposition. In this paper, the advantages and drawbacks of both approaches are examined. Potential challenges and improvements to graph spectral processing methods are considered, as well as the generalization of graph processing techniques in the spectral domain, its extension to the time-frequency domain, and other potential extensions of classical signal processing concepts to graph datasets. Lastly, an overview of compressive sensing on graphs is given.
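The Laplacian-based graph Fourier transform described in this record can be sketched in a few lines of NumPy. The 4-node path graph and the signal values below are illustrative choices, not data from the paper:

```python
import numpy as np

# Adjacency matrix of a 4-node path graph (an illustrative example).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

D = np.diag(A.sum(axis=1))   # degree matrix
L = D - A                    # combinatorial graph Laplacian

# Eigendecomposition: the eigenvalues act as graph "frequencies" and the
# eigenvectors form the graph Fourier basis.
eigvals, U = np.linalg.eigh(L)

signal = np.array([1.0, 2.0, 3.0, 4.0])   # a signal on the 4 nodes
spectrum = U.T @ signal                    # graph Fourier transform
reconstructed = U @ spectrum               # inverse transform

print(np.allclose(reconstructed, signal))  # True: the basis is orthonormal
```

Because `L` is symmetric, `eigh` returns an orthonormal basis, so the forward and inverse transforms are simply multiplication by the transposed and untransposed eigenvector matrix.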

  19. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result compared with conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
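The variance-reduction idea behind stratified source-sampling can be illustrated with a toy one-dimensional estimator. This is not the Argonne implementation; the integrand, seed, and stratum count are arbitrary choices made only to show why forcing a fixed number of samples per stratum tightens the estimate:

```python
import random
random.seed(0)

# Toy estimate of the mean of f over [0, 1): plain vs. stratified sampling.
# Stratification forces an equal number of samples into each sub-interval
# (stratum), mirroring how stratified source-sampling fixes the number of
# fission-source particles drawn from each part of the configuration.
def f(x):
    return x * x

n, strata = 1000, 10

plain = sum(f(random.random()) for _ in range(n)) / n

per_stratum = n // strata
stratified = sum(
    f((k + random.random()) / strata)   # uniform draw inside stratum k
    for k in range(strata)
    for _ in range(per_stratum)
) / n

exact = 1.0 / 3.0
print(abs(plain - exact), abs(stratified - exact))
```

The stratified estimate's error is driven only by the variation of `f` *within* each narrow stratum, so it is typically much smaller than the plain Monte Carlo error for the same sample count.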

  20. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    Energy Technology Data Exchange (ETDEWEB)

    Amato, G. [ISTI-CNR, Area della Ricerca, Via Moruzzi 1, 56124, Pisa (Italy); Cristoforetti, G.; Legnaioli, S.; Lorenzetti, G.; Palleschi, V. [IPCF-CNR, Area della Ricerca, Via Moruzzi 1, 56124, Pisa (Italy); Sorrentino, F., E-mail: sorrentino@fi.infn.i [Dipartimento di Fisica e astronomia, Universita di Firenze, Polo Scientifico, via Sansone 1, 50019 Sesto Fiorentino (Italy); Istituto di Cibernetica CNR, via Campi Flegrei 34, 80078 Pozzuoli (Italy); Marwan Technology, c/o Dipartimento di Fisica 'E. Fermi', Largo Pontecorvo 3, 56127 Pisa (Italy); Tognoni, E. [INO-CNR, Area della Ricerca, Via Moruzzi 1, 56124 Pisa (Italy)

    2010-08-15

    In this communication, we will illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We assume a database containing the peaks of all elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.
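The ranking scheme this record describes can be sketched as cosine similarity between vectors of weighted peaks. The element "database" and sample intensities below are invented for illustration (not real emission lines), and the inverse-peak-density factor in the weighting is omitted for brevity:

```python
import math

# Hypothetical mini "database": for each element, emission peaks as a
# wavelength -> relative intensity map (values are illustrative only).
elements = {
    "Fe": {248.3: 1.0, 371.9: 0.6, 438.3: 0.4},
    "Cu": {324.7: 1.0, 327.4: 0.9, 521.8: 0.3},
    "Ni": {341.5: 1.0, 352.4: 0.7, 361.9: 0.5},
}

def cosine(u, v):
    """Cosine similarity between two sparse peak vectors."""
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in set(u) | set(v))
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# Observed sample spectrum (also invented): strong Fe lines, weak Cu lines.
sample = {248.3: 0.9, 371.9: 0.5, 324.7: 0.3, 327.4: 0.25}

ranking = sorted(elements, key=lambda e: cosine(elements[e], sample),
                 reverse=True)
print(ranking)  # ['Fe', 'Cu', 'Ni']
```

Ranking by similarity (rather than a hard threshold) is exactly the document-ranking step of the text-retrieval analogy: the sample plays the role of the query, the elements play the role of documents.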

  1. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    Science.gov (United States)

    Amato, G.; Cristoforetti, G.; Legnaioli, S.; Lorenzetti, G.; Palleschi, V.; Sorrentino, F.; Tognoni, E.

    2010-08-01

    In this communication, we will illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We assume a database containing the peaks of all elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.

  2. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    International Nuclear Information System (INIS)

    In this communication, we will illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We assume a database containing the peaks of all elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.

  3. Analysis of Jugular Foramen Exposure in the Fallopian Bridge Technique

    OpenAIRE

    Satar, Bulent; Yazar, Fatih; Aykut CEYHAN; Arslan, Hasan Huseyin; Aydin, Sedat

    2009-01-01

    Objective: To analyze the exposure of the jugular foramen afforded by the fallopian bridge technique. Method: The jugular foramen exposure was obtained using the jugular foramen approach combined with the fallopian bridge technique. We applied this technique using 10 temporal bone specimens at a tertiary referral center. The exposure was assessed by means of depth of the dissection field and two separate dissection spaces that were created anteriorly and posteriorly to the facial nerve. Anter...

  4. Conformational Analysis of Misfolded Protein Aggregation by FRET and Live-Cell Imaging Techniques

    Directory of Open Access Journals (Sweden)

    Akira Kitamura

    2015-03-01

    Full Text Available Cellular homeostasis is maintained by several types of protein machinery, including molecular chaperones and proteolysis systems. Dysregulation of the proteome disrupts homeostasis in cells, tissues, and the organism as a whole, and has been hypothesized to cause neurodegenerative disorders, including amyotrophic lateral sclerosis (ALS) and Huntington’s disease (HD). A hallmark of neurodegenerative disorders is the formation of ubiquitin-positive inclusion bodies in neurons, suggesting that the aggregation process of misfolded proteins changes during disease progression. Hence, high-throughput determination of soluble oligomers during the aggregation process, as well as of the conformation of sequestered proteins in inclusion bodies, is essential for elucidation of physiological regulation mechanisms and drug discovery in this field. To elucidate the interaction, accumulation, and conformation of aggregation-prone proteins, in situ spectroscopic imaging techniques such as Förster/fluorescence resonance energy transfer (FRET), fluorescence correlation spectroscopy (FCS), and bimolecular fluorescence complementation (BiFC) have been employed. Here, we summarize recent reports in which these techniques were applied to the analysis of aggregation-prone proteins (in particular their dimerization, interactions, and conformational changes), and describe several fluorescent indicators used for real-time observation of physiological states related to proteostasis.

  5. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available Information provided by any application system in an organization is vital in order to reach a decision. For this reason, the quality of the data provided by a Data Warehouse (DW) is really important for an organization seeking the best solutions to move forward. DWs are complex systems that have to deliver highly-aggregated, high-quality data from heterogeneous sources to decision makers, and they involve a great deal of source-system integration to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques compare target values with current values gained from the systems. A prototype using PHP was developed to support the Base Analysis Technique. A sample schema from an Oracle database was then used to study the differences between applying the framework and not applying it. The prototype was demonstrated to the selected organizations to identify whether it helps to reduce DQ problems, and questionnaires were given to respondents. Results: The results show users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: The framework suggested here needs to be implemented in a real situation to obtain more accurate results.

  6. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  7. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Nuclear techniques are well suited to continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse circumstances of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries.

  8. Analysis of a proposed Compton backscatter imaging technique

    Science.gov (United States)

    Hall, James M.; Jacoby, Barry A.

    1994-03-01

    One-sided imaging techniques are currently being used in nondestructive evaluation of surfaces and shallow subsurface structures. In this work we present both analytical calculations and detailed Monte Carlo simulations aimed at assessing the capability of a proposed Compton backscattering imaging technique designed to detect and characterize voids located several centimeters below the surface of a solid.

  9. A Technique for the Analysis of Auto Exhaust.

    Science.gov (United States)

    Sothern, Ray D.; And Others

    Developed for presentation at the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971, this outline explains a technique for separating the complex mixture of hydrocarbons contained in automotive exhausts. A Golay column and subambient temperature programming technique are…

  10. Cepstrum Analysis: An Advanced Technique in Vibration Analysis of Defects in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    M. Satyam

    1994-01-01

    Full Text Available Conventional frequency analysis of machinery vibration is not adequate to accurately identify defects in gears, bearings, and blades where sidebands and harmonics are present; such an approach is also dependent on the transmission path. Cepstrum analysis, on the other hand, accurately identifies harmonic and sideband families and is the better technique available for fault diagnosis in gears, bearings, and turbine blades of ships and submarines. The cepstrum represents the global power content of a whole family of harmonics and sidebands, even when more than one family of sidebands is present at the same time. It is also insensitive to transmission path effects, since source and transmission path effects are additive and can be separated in the cepstrum. The concept, the underlying theory, and the measurement and analysis involved in using the technique are briefly outlined. Two cases are presented to demonstrate the advantage of the cepstrum technique over spectrum analysis. An LP compressor was chosen to study the transmission path effects, and a marine gearbox having two sets of sideband families was studied to diagnose the problematic sideband and its severity.
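The power cepstrum the record describes is simply the inverse FFT of the log power spectrum. A minimal NumPy sketch on a synthetic harmonic signal (the sampling rate, fundamental, and harmonic amplitudes are arbitrary choices) shows a whole harmonic family collapsing into one cepstral peak:

```python
import numpy as np

fs = 1000                          # sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
f0 = 50                            # fundamental; harmonics at 50, 100, 150 Hz
x = sum(np.sin(2 * np.pi * f0 * k * t) / k for k in (1, 2, 3))

# Power cepstrum: inverse FFT of the log power spectrum. A family of
# harmonics spaced f0 apart collapses into a single peak ("rahmonic") at
# quefrency 1/f0, which is what makes the cepstrum convenient for
# diagnosing gear and bearing sideband families.
spectrum = np.abs(np.fft.rfft(x)) ** 2
cepstrum = np.fft.irfft(np.log(spectrum + 1e-12))
quefrency = np.arange(len(cepstrum)) / fs

# Skip the low-quefrency region (spectral envelope) and search up to 30 ms.
peak = 10 + np.argmax(cepstrum[10:31])
print(quefrency[peak])  # 0.02 s, i.e. 1/f0
```

The small constant added before the logarithm guards against `log(0)` in bins with negligible power; real implementations often smooth or lifter the spectrum instead.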

  11. Analysis Of Machine Learning Techniques By Using Blogger Data

    OpenAIRE

    Gowsalya.R,; S. Veni

    2014-01-01

    Blogs are a recent, fast-progressing medium that depends on information systems and technological advancement. In developing countries the mass media is not well developed and government schemes are framed in governmental terms, so blogs provide a channel for sharing knowledge and ideas. This article has highlighted and performed simulations on the obtained information, 100 instances of bloggers, using the Weka 3.6 tool, and by applying many machine...

  12. Rates of progression in diabetic retinopathy during different time periods: a systematic review and meta-analysis

    DEFF Research Database (Denmark)

    Wong, Tien Y; Mwamburi, Mkaya; Klein, Ronald;

    2009-01-01

    This meta-analysis reviews rates of progression of diabetic retinopathy to proliferative diabetic retinopathy (PDR) and/or severe visual loss (SVL) and temporal trends.

  13. Surveillance of the nuclear instrumentation by a noise analysis technique

    International Nuclear Information System (INIS)

    The nuclear sensors used in the protection channels of a nuclear reactor have to be tested periodically. A method has been developed to estimate the state of this kind of sensor; the method proposed applies to boron ionization chambers. The principle of this technique is based on calculating a specific parameter, named a ''descriptor'', using a simple signal-processing technique. A modification of this parameter indicates a degradation of the static and dynamic performances of the sensor. Different applications of the technique in a nuclear power plant are given.

  14. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  15. Multidimensional scaling technique for analysis of magnetic storms at Indian observatories

    Indian Academy of Sciences (India)

    M Sridharan; A M S Ramasamy

    2002-12-01

    Multidimensional scaling is a powerful technique for the analysis of data. The latitudinal dependence of the geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
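Classical (Torgerson) multidimensional scaling, one common variant of the technique, recovers coordinates from pairwise dissimilarities by double-centering and eigendecomposition. The distance matrix below is an invented example (four collinear points), not observatory data:

```python
import numpy as np

# Pairwise distances between four points that happen to be collinear
# (e.g. dissimilarities between profiles measured at four stations).
D = np.array([[0.0, 1.0, 2.0, 3.0],
              [1.0, 0.0, 1.0, 2.0],
              [2.0, 1.0, 0.0, 1.0],
              [3.0, 2.0, 1.0, 0.0]])

n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n     # centering matrix
B = -0.5 * J @ (D ** 2) @ J             # double-centered squared distances

eigvals, eigvecs = np.linalg.eigh(B)
order = np.argsort(eigvals)[::-1]       # largest eigenvalues first
coords = eigvecs[:, order] * np.sqrt(np.maximum(eigvals[order], 0.0))

# The points are collinear, so the first dimension captures everything:
recovered = coords[:, 0]
print(np.round(recovered, 3))           # +/-[-1.5, -0.5, 0.5, 1.5]
```

The recovered coordinates are determined only up to sign and rotation, which is why the test below compares absolute values.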

  16. Recent Progresses in Analysis of Tongue Manifestation for Traditional Chinese Medicine

    Institute of Scientific and Technical Information of China (English)

    WEI Bao-guo; CAI Yi-heng; ZHANG Xin-feng; SHEN Lan-sun

    2005-01-01

    Tongue diagnosis is one of the most precious and widely used diagnostic methods in Traditional Chinese Medicine (TCM). However, due to its subjective, qualitative, and experience-dependent nature, studies on objective tongue characterization have been widely emphasized. This paper surveys recent progress in the analysis of tongue manifestation. These new developments include cross-network and cross-media color reproduction of tongue images, knowledge-based automatic segmentation of the tongue body, automatic analysis of curdiness and griminess of the tongue fur, and automatic analysis of plumpness, wryness, and dot-thorn of the tongue body. Clinical experiments verify the validity of these new methods.

  17. Application of ultrasonic pulse velocity technique and image analysis in monitoring of the sintering process

    Directory of Open Access Journals (Sweden)

    Terzić A.

    2011-01-01

    Full Text Available Concrete which undergoes thermal treatment before and during its service life can be applied in plants operating at high temperature and as thermal insulation. Sintering occurs within a concrete structure under such conditions. The progression of the sintering process can be monitored through changes in porosity parameters determined with a nondestructive test method, ultrasonic pulse velocity, together with a computer program for image analysis. The experiment was performed on samples of corundum and bauxite concrete composites. The apparent porosity of the samples thermally treated at 110, 800, 1000, 1300 and 1500°C was first investigated with a standard laboratory procedure. Sintering parameters were calculated from the creep testing. Loss of strength and material degradation occurred in the concrete when it was subjected to increased temperature and a compressive load. Mechanical properties indicate and monitor changes within the microstructure. The level of surface deterioration after the thermal treatment was determined using the Image Pro Plus program, and mechanical strength was estimated using ultrasonic pulse velocity testing. Nondestructive ultrasonic measurement was used as a qualitative description of the porosity change in specimens resulting from the sintering process. The ultrasonic pulse velocity technique and image analysis proved to be reliable methods for monitoring microstructural change during the thermal treatment and service life of refractory concrete.

  18. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and suitability.

  19. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for the identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of the obtained peptide-spectrum matches (PSMs) also needs to be evaluated by algorithms, as manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem in which sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance of retrieving more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches.
Conclusion Our approach not only enhances the computational performance, and
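The target-decoy evaluation that MUDE and MUMAL build on can be sketched in a few lines. The scores and decoy labels below are made up, and MUMAL itself replaces this single-score cutoff with a multivariate nonlinear decision boundary; this sketch only shows the standard FDR estimate the abstract takes as its baseline:

```python
# Hypothetical (score, is_decoy) pairs for ten peptide-spectrum matches.
psms = [
    (9.1, False), (8.7, False), (8.2, False), (7.9, True),
    (7.5, False), (7.1, False), (6.8, True), (6.4, False),
    (6.0, True), (5.5, False),
]

def fdr_at_threshold(psms, threshold):
    # Target-decoy FDR estimate: decoys above the threshold approximate
    # the number of false targets above it.
    targets = sum(1 for s, d in psms if s >= threshold and not d)
    decoys = sum(1 for s, d in psms if s >= threshold and d)
    return decoys / targets if targets else 0.0

# Choose the loosest score threshold keeping the estimated FDR <= 20%.
scores = sorted({s for s, _ in psms})            # ascending unique scores
chosen = next(s for s in scores if fdr_at_threshold(psms, s) <= 0.2)
print(chosen, fdr_at_threshold(psms, chosen))
```

Maximizing sensitivity, as MUDE and MUMAL do, amounts to accepting as many PSMs as possible while this estimated error rate stays within the chosen bound.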

  20. Data Mining Techniques: A Source for Consumer Behavior Analysis

    OpenAIRE

    Abhijit Raorane; R.V. Kulkarni

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used on real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, the data mining method has disadvantages as well as advantages; therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior, the consumer's psychological condition at the time of purchase, and how suitable data mining methods apply...

  1. Assessing Progress towards Public Health, Human Rights, and International Development Goals Using Frontier Analysis.

    Science.gov (United States)

    Luh, Jeanne; Cronk, Ryan; Bartram, Jamie

    2016-01-01

    Indicators to measure progress towards achieving public health, human rights, and international development targets, such as 100% access to improved drinking water or zero maternal mortality ratio, generally focus on status (i.e., level of attainment or coverage) or trends in status (i.e., rates of change). However, these indicators do not account for different levels of development that countries experience, thus making it difficult to compare progress between countries. We describe a recently developed new use of frontier analysis and apply this method to calculate country performance indices in three areas: maternal mortality ratio, poverty headcount ratio, and primary school completion rate. Frontier analysis is used to identify the maximum achievable rates of change, defined by the historically best-performing countries, as a function of coverage level. Performance indices are calculated by comparing a country's rate of change against the maximum achievable rate at the same coverage level. A country's performance can be positive or negative, corresponding to progression or regression, respectively. The calculated performance indices allow countries to be compared against each other regardless of whether they have only begun to make progress or whether they have almost achieved the target. This paper is the first to use frontier analysis to determine the maximum achievable rates as a function of coverage level and to calculate performance indices for public health, human rights, and international development indicators. The method can be applied to multiple fields and settings, for example health targets such as cessation in smoking or specific vaccine immunizations, and offers both a new approach to analyze existing data and a new data source for consideration when assessing progress achieved. PMID:26812524
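The performance-index construction this record describes can be sketched with invented (coverage, annual rate of change) observations. The coverage bands, band width, and values below are arbitrary assumptions for illustration, not the paper's actual frontier fit:

```python
# Hypothetical observations: (coverage %, annual change in percentage points)
observations = [
    (20, 2.0), (25, 3.5), (40, 2.8), (45, 3.0),
    (70, 1.5), (75, 2.0), (90, 0.8), (95, 1.0),
]

def band(coverage, width=25):
    # Group coverage levels into bands so rates are compared between
    # countries at similar stages of development.
    return int(coverage // width)

# Frontier: the best observed rate of change within each coverage band.
frontier = {}
for cov, rate in observations:
    b = band(cov)
    frontier[b] = max(frontier.get(b, float("-inf")), rate)

def performance_index(coverage, rate):
    # Ratio of a country's rate to the frontier rate at a comparable
    # coverage level: 1.0 matches the historically best performer,
    # a negative value indicates regression.
    return rate / frontier[band(coverage)]

print(performance_index(40, 2.8))   # 2.8 relative to the band's best (3.5)
```

Normalizing by the frontier at the *same* coverage level is what lets a country near 95% coverage, where fast gains are inherently hard, be compared fairly with one near 20%.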

  2. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Starting from 1989, a new technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. The WT offers certain advantages over the Fourier transform for the analysis of signals. A review of the use of this technique across different fields of elemental analysis is presented.
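As a concrete instance of the wavelet machinery the review surveys, here is one level of the Haar transform (the simplest wavelet) in NumPy; the input values are arbitrary. The averages capture the coarse shape of a spectrum while the differences localize sharp features such as peaks, which is the advantage over the Fourier transform noted above:

```python
import numpy as np

x = np.array([4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0])

# One level of the orthonormal Haar discrete wavelet transform.
approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass (scaling) coefficients
detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass (wavelet) coefficients

# Perfect reconstruction from the two coefficient bands:
rec = np.empty_like(x)
rec[0::2] = (approx + detail) / np.sqrt(2)
rec[1::2] = (approx - detail) / np.sqrt(2)
print(np.allclose(rec, x))  # True
```

The 1/sqrt(2) normalization makes the transform orthonormal, so signal energy is preserved exactly between the original samples and the two coefficient bands.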

  3. Research Progress on Chemical Sterility Technique for Pest Control

    Institute of Scientific and Technical Information of China (English)

    王建斌

    2011-01-01

    The study summarizes the chemical sterility technique: its sterility principle and advantages, the main kinds of sterilants, the application of the chemical sterility technique to pest control, and research progress at home and abroad.

  4. Modern Computational Techniques for the HMMER Sequence Analysis

    OpenAIRE

    Xiandong Meng; Yanqing Ji

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools recently developed in the bioinformatics community on various computing platforms. The characteristics of the sequence analysis, such as data and c...

  5. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  6. Analysis Of Machine Learning Techniques By Using Blogger Data

    Directory of Open Access Journals (Sweden)

    Gowsalya.R,

    2014-04-01

    Full Text Available Blogs are a fast-growing recent medium that depends on information systems and technological advancement. In developing countries, where the mass media are less developed and government schemes are communicated mainly through official channels, blogs provide a means of sharing knowledge and ideas. This article performs simulations on a dataset of 100 blogger instances using the Weka 3.6 tool, applying several machine learning algorithms and analyzing accuracy, precision, recall and F-measure in order to anticipate users' future tendency toward blogging and its use in strategic areas.
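
The evaluation measures named above can be computed from a binary confusion matrix as follows. The counts are hypothetical, since the study itself relied on Weka's built-in evaluation.

```python
# Accuracy, precision, recall and F-measure from a binary confusion matrix
# (illustrative sketch; the tp/fp/fn/tn counts below are invented).

def metrics(tp, fp, fn, tn):
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f_measure

# Hypothetical counts for a "professional blogger" classifier on 100 instances
acc, p, r, f = metrics(tp=40, fp=10, fn=5, tn=45)
print(f"accuracy={acc:.2f} precision={p:.2f} recall={r:.3f} F={f:.3f}")
```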

  7. Annual progress report 1981

    International Nuclear Information System (INIS)

    This annual progress report of the CEA Protection and Nuclear Safety Institute gives a brief description of the progress made in each section of the Institute. Research activities of the Protection department include radiation effects on man, radioecology and environment, and radioprotection techniques. Research activities of the Nuclear Safety department include reactor safety analysis, fuel cycle facilities safety analysis, and safety research programs. The third section deals with nuclear material security, including security of facilities, security of nuclear material transport and monitoring of nuclear material management

  8. Experimental Analysis of Small Scale PCB Manufacturing Techniques for Fablabs

    Directory of Open Access Journals (Sweden)

    Yannick Verbelen

    2013-04-01

    Full Text Available In this paper we present a complete modular PCB manufacturing process on fablab scale that is compliant with current PCB manufacturing standards. This includes, but is not limited to, a minimum track width of 8 mil, a minimum clearance of 6 mil, plated and non-plated holes, a solder resist, surface finish and component overlay. We modularize industrial manufacturing processes and discuss advantages and disadvantages of production techniques for every phase. We then proceed to discuss the relevance and added value of every phase in the manufacturing process and their usefulness in a fablab context. Production techniques are evaluated regarding complexity, overhead, safety, required time, and environmental concerns. To ensure practical feasibility of the presented techniques, the manufacturing process was benchmarked in FablabXL; the paper aims to be a practical reference for implementing or extending PCB manufacturing activities in fablabs.

  9. Radon remedial techniques in buildings - analysis of French actual cases

    International Nuclear Information System (INIS)

    The IRSN has compiled a collection of solutions from data provided by the various decentralised government services in 31 French departments. Contributors were asked to provide a description of the building, as well as details of measured radon levels, the type of reduction technique adopted and the cost. Illustrative layouts, technical drawings and photographs were also requested, when available. Of the cases recorded, 85% are establishments open to the public (schools (70%), city halls (4%) and combined city halls and school houses (26%)), 11% are houses and 4% industrial buildings. IRSN obtained 27 real cases of remedial techniques used. The data were presented in the form of fact sheets. The primary aim of this exercise was to illustrate each of the radon reduction techniques that can be used in the different building types (with basement, ground bearing slab, crawl space). This investigation not only enabled us to show that combining passive and active techniques reduces the operating cost of the installation, but above all that it considerably improves the efficiency. The passive technique reduces the amount of radon in the building and thus reduces the necessary ventilation rate, which directly affects the cost of operating the installation. For the 27 cases recorded, we noted:(a) the application of 7 passive techniques: sealing of floors and semi-buried walls, together with improved aeration by installing ventilation openings or ventilation strips in the windows. Radon concentrations were reduced on average by a factor of 4.7. No measurement in excess of 400 Bq.m-3 (the limit recommended by the French public authorities) was obtained following completion of the works; (b) the application of 15 active techniques: depressurization of the underlying ground, crawl space or basement and/or pressurization of the building. Radon concentrations were reduced on average by a factor of 13.8. Radon concentrations of over 400 Bq.m-3 were measured in only 4 cases

  10. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, data mining methods have disadvantages as well as advantages, so it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to show how suitable data mining methods can improve on conventional methods. Moreover, in an experiment, association rules are employed to mine rules for trusted customers using sales data from the supermarket industry.
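
The support/confidence style of association-rule mining mentioned in the abstract can be sketched as follows. The transactions and thresholds are invented for illustration; a real analysis would use Apriori or FP-growth on the full sales database.

```python
# Toy association-rule mining over market-basket data (illustrative sketch;
# transactions, min_support and min_confidence are hypothetical).

from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def rules(min_support=0.4, min_confidence=0.7):
    """All single-item rules lhs -> rhs meeting the thresholds."""
    items = sorted(set().union(*transactions))
    found = []
    for a, b in combinations(items, 2):
        for lhs, rhs in ((a, b), (b, a)):
            s = support({lhs, rhs})
            if s >= min_support:
                conf = s / support({lhs})
                if conf >= min_confidence:
                    found.append((lhs, rhs, s, conf))
    return found

for lhs, rhs, s, conf in rules():
    print(f"{lhs} -> {rhs} (support={s:.1f}, confidence={conf:.2f})")
```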

  11. DATA MINING TECHNIQUES: A SOURCE FOR CONSUMER BEHAVIOR ANALYSIS

    Directory of Open Access Journals (Sweden)

    Abhijit Raorane

    2011-09-01

    Full Text Available Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, data mining methods have disadvantages as well as advantages, so it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to show how suitable data mining methods can improve on conventional methods. Moreover, in an experiment, association rules are employed to mine rules for trusted customers using sales data from the supermarket industry.

  12. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. Using winter data from Shanghai, we show how to use the technique to analyze the fire risk.
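
The information-distribution step, which turns crisp observations into fuzzy memberships on a grid of controlled points, can be sketched with the commonly used linear (triangular) distribution; the grid and observations below are hypothetical, not the Shanghai data.

```python
# 1-D linear information distribution (illustrative sketch): each crisp
# observation spreads its unit of "information" to the neighboring grid
# points, giving a smooth frequency estimate from a small sample.

def distribute(observations, grid):
    """Distribute each observation linearly onto an evenly spaced grid."""
    h = grid[1] - grid[0]          # grid step = diffusion width
    weights = [0.0] * len(grid)
    for x in observations:
        for i, u in enumerate(grid):
            q = 1 - abs(x - u) / h  # share of information given to point u
            if q > 0:
                weights[i] += q
    return weights

grid = [0.0, 1.0, 2.0, 3.0, 4.0]
obs = [1.2, 1.9, 2.4]              # a small sample of, e.g., fire counts
w = distribute(obs, grid)
print(w)
print(sum(w))  # total information ~ sample size (3.0)
```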

  13. [An analysis of key points for root canal therapy technique].

    Science.gov (United States)

    Fan, M W

    2016-08-01

    The success rate of root canal therapy (RCT) has improved continuously along with advances in RCT techniques over the past several decades. If the standard procedures of modern RCT techniques are strictly followed, the success rate of RCT may exceed 90%. The success of RCT is mainly affected by such factors as a clear understanding of root canal anatomy, proper mechanical and chemical preparation, and complete filling of the root canal system. If these factors are sufficiently attended to, success is easy to achieve. Even if the primary RCT fails, retreatment can be conducted to save the diseased teeth. PMID:27511032

  14. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987 in order to assess the present technical status of nuclear borehole logging techniques, to find out the well established applications and the development trends. It contains a summary report giving a comprehensive overview of the techniques and applications and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  15. A quantitative analysis of rotary, ultrasonic and manual techniques to treat proximally flattened root canals

    Directory of Open Access Journals (Sweden)

    Fabiana Soares Grecca

    2007-04-01

    Full Text Available OBJECTIVE: The efficiency of rotary, manual and ultrasonic root canal instrumentation techniques was investigated in proximally flattened root canals. MATERIAL AND METHODS: Forty human mandibular left and right central incisors, lateral incisors and premolars were used. The pulp tissue was removed and the root canals were filled with red dye. Teeth were instrumented using three techniques: (i) K3 and ProTaper rotary systems; (ii) ultrasonic crown-down technique; and (iii) progressive manual technique. Roots were bisected longitudinally in a buccolingual direction. The instrumented canal walls were digitally captured and the images obtained were analyzed using the Sigma Scan software. Canal walls were evaluated for total canal wall area versus non-instrumented area on which dye remained. RESULTS: No statistically significant difference was found between the instrumentation techniques studied (p<0.05). CONCLUSION: The findings of this study showed that no instrumentation technique was 100% efficient in removing the dye.

  16. New techniques for positron emission tomography in the study of human neurological disorders. Progress report, June 1990--June 1993

    Energy Technology Data Exchange (ETDEWEB)

    Kuhl, D.E.

    1993-06-01

    This progress report describes accomplishments of four programs. The four programs are entitled (1) Faster, simpler processing of positron-computing precursors: New physicochemical approaches, (2) Novel solid phase reagents and methods to improve radiosynthesis and isotope production, (3) Quantitative evaluation of the extraction of information from PET images, and (4) Optimization of tracer kinetic methods for radioligand studies in PET.

  17. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
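
Of the techniques listed, the CUSUM statistic is the simplest to sketch: a one-sided tabular CUSUM accumulates inventory differences above a reference value k and signals when the sum exceeds a decision threshold h. The values of k and h below are hypothetical; in practice they would be derived from the inventory-difference error structure.

```python
# One-sided tabular CUSUM on a sequence of inventory differences (IDs),
# treating positive IDs as possible material loss (illustrative sketch;
# k, h and the data are invented).

def cusum(inventory_diffs, k=0.5, h=4.0):
    """Return the running CUSUM statistic and the index of the first alarm
    (None if the statistic never exceeds h)."""
    s, path, alarm = 0.0, [], None
    for i, x in enumerate(inventory_diffs):
        s = max(0.0, s + x - k)      # accumulate only excess above k
        path.append(s)
        if alarm is None and s > h:
            alarm = i
    return path, alarm

# A small persistent loss (mean ~1 per period) buried in measurement noise:
ids = [0.2, 1.5, 0.8, 1.9, 1.2, 0.7, 1.6, 1.1]
path, alarm = cusum(ids)
print([round(v, 2) for v in path])
print("first alarm at period:", alarm)  # period 6
```

Unlike a Shewhart chart, which looks at each inventory difference in isolation, the CUSUM accumulates small persistent shifts and so detects protracted losses sooner.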

  18. Analysis on Poe's Unique Techniques to Achieve Aestheticism

    Institute of Scientific and Technical Information of China (English)

    孔佳鸣

    2008-01-01

    Edgar Allan Poe was one of the most important poets in American poetic history for his unremitting pursuit of 'ideal beauty'. This essay proves, through various examples chosen from his poems, that his aestheticism was evident in his versification techniques. His poetic theory and practice set an immortal example for the development of English poetry.

  19. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    The paper deals with the infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge-detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique and the IR Contrast feature imaging application, which are based on the models provided in this paper.
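
A simplified version of the normalized-contrast extraction might look like the following. The normalization by the reference temperature rise is an assumption made for illustration, and the IR Contrast tool's exact definition may differ; the temperature data are invented.

```python
# Sketch: contrast evolution between a defect region and a defect-free
# reference region in flash-thermography frames, normalized by the
# reference rise above the pre-flash (ambient) level. Hypothetical data.

def normalized_contrast(defect, reference, ambient):
    """Per-frame contrast normalized by the reference temperature rise."""
    return [(d - r) / (r - ambient) for d, r in zip(defect, reference)]

ambient = 20.0
reference = [30.0, 27.0, 25.0, 24.0]   # defect-free cool-down, deg C
defect    = [30.0, 28.0, 26.5, 25.0]   # defect pixels cool more slowly

c = normalized_contrast(defect, reference, ambient)
print([round(v, 2) for v in c])  # contrast grows, peaks, then decays
```

Features of this curve (peak contrast, peak time) are the kind of thermal measurement features that can be matched against flat-bottom-hole calibrations to estimate anomaly depth.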

  20. Sixth Australian conference on nuclear techniques of analysis: proceedings

    International Nuclear Information System (INIS)

    These proceedings contain the abstracts of 77 lectures. The topics focus on instrumentation, nuclear techniques and their applications for material science, surfaces, archaeometry, art, geological, environmental and biomedical studies. An outline of the Australian facilities available for research purposes is also provided. Separate abstracts were prepared for the individual papers in this volume

  1. Optical methods for ultrasensitive detection and analysis: Techniques and applications

    International Nuclear Information System (INIS)

    This conference is organized under the following sessions: Surface sensitive/ionization techniques; Advanced concepts in laser applications; Subwavelength spatial resolution spectroscopy; High-sensitivity spectroscopies using photothermal and polarization effects; Sensitive biomolecular detection; Novel optical spectroscopies and methods in condensed phase detection; Ultrasensitive detector methods

  2. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flowers in order to amend existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) with notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. Six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. Total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g with methylene chloride, considering that in extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by extracting six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated by the procedure of EP7, 2.8.12: weight of the herbal drug sample 200 g, distillation rate 2.5-3.5 ml/min, volume of distillation liquid (water) 500 ml, volume of xylene in the graduated tube 0.50 ml. Total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of flavonoid aglycones with ethyl acetate and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  3. Progress and Application of DNA Barcoding Techniques in Plants

    Institute of Scientific and Technical Information of China (English)

    刘宇婧; 刘越; 黄耀江; 龙春林

    2011-01-01

    Based on a summary and analysis of the development of DNA barcoding, this paper comprehensively reviews the research progress of DNA barcoding techniques in plants, their workflow and analysis methods, the factors influencing identification accuracy, and their current applications in plant taxonomy together with the disputes surrounding them, and outlines future development trends and application prospects. Research examples indicate that combining plant DNA barcoding with traditional botanical knowledge can serve as one approach to ethnobotanical study. Commonly used plant DNA barcodes fall into two modes, single-fragment and multi-fragment combinations, each with its own advantages and disadvantages. Frequently used DNA sequences include matK, trnH-psbA, rbcL and ITS, but all have certain limitations, so different barcoding standards should be selected for different application aims. Factors influencing identification accuracy include the type and number of species, the method used to construct the phylogenetic tree, hybridization and gene introgression, and variation in species origin times and molecular evolution rates. The current focus of plant DNA barcoding is how to select suitable DNA fragments and evaluate their value.

  4. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

    Full Text Available Sentiment analysis is concerned with the analysis of emotions and opinions expressed in text; it is also referred to as opinion mining. Sentiment analysis identifies and justifies the sentiment of a person with respect to a given source of content. Social media contain huge amounts of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this large volume of generated data is very useful for capturing the opinion of the masses. Twitter sentiment analysis is harder than general sentiment analysis because of slang words, misspellings and repeated characters, and because each tweet is limited to 140 characters, so it is very important to identify the correct sentiment of each word. In this project we propose a highly accurate model for sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of a feature vector and classifiers such as support vector machines and Naïve Bayes, we classify these tweets as positive, negative or neutral to give the sentiment of each tweet.
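
The classification step can be illustrated with a toy multinomial Naive Bayes model, one of the two classifiers the abstract names. The training tweets are invented, and a real system would add the feature engineering needed for slang, misspellings, and repeated characters.

```python
# Toy multinomial Naive Bayes sentiment classifier with Laplace smoothing
# (illustrative sketch; training data are hypothetical).

from collections import Counter
import math

train = [
    ("great movie loved it", "pos"),
    ("amazing acting great story", "pos"),
    ("terrible plot waste of time", "neg"),
    ("boring and terrible", "neg"),
]

classes = {"pos", "neg"}
word_counts = {c: Counter() for c in classes}
doc_counts = Counter()
for text, label in train:
    doc_counts[label] += 1
    word_counts[label].update(text.split())

vocab = {w for c in classes for w in word_counts[c]}

def classify(text):
    best, best_lp = None, -math.inf
    for c in classes:
        lp = math.log(doc_counts[c] / len(train))  # class prior
        total = sum(word_counts[c].values())
        for w in text.split():
            # Laplace-smoothed word likelihood
            lp += math.log((word_counts[c][w] + 1) / (total + len(vocab)))
        if lp > best_lp:
            best, best_lp = c, lp
    return best

print(classify("loved the great story"))  # pos
print(classify("what a boring waste"))    # neg
```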

  5. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... of the offeror's cost trends, on the basis of current and historical cost or pricing data; (C... required. (2) Price analysis shall be used when certified cost or pricing data are not required (see paragraph (b) of this subsection and 15.404-3). (3) Cost analysis shall be used to evaluate...

  6. Reduced-Order Blade Mistuning Analysis Techniques Developed for the Robust Design of Engine Rotors

    Science.gov (United States)

    Min, James B.

    2004-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using eigenfrequency curve veerings to identify "danger zones" in the operating conditions--ranges of rotational speeds and engine orders in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued. Several methods will be investigated, including the use of intentional mistuning patterns to mitigate the harmful effects of random mistuning, and the modification of disk stiffness to avoid reaching critical values of interblade coupling in the desired operating range. Recent research progress is summarized in the following paragraphs. First, significant progress was made in the development of the component mode mistuning (CMM) and static mode compensation (SMC) methods for reduced-order modeling of mistuned bladed disks (see the following figure). The CMM method has been formalized and extended to allow a general treatment of mistuning. In addition, CMM allows individual mode

  7. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...

  8. Air pollution studies in Tianjing city using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    Airborne samples from industrial and residential areas of Tianjing city were collected at two sites during February and June using a PM-10 sampler and analyzed by NAA techniques. A comparison of air pollution between urban and rural areas of Tianjing city was made using neutron activation analysis and other data analysis techniques. (author)

  9. Empirical Analysis of Data Mining Techniques for Social Network Websites

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2014-02-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the data content available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  10. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software is comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we review the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. Finally, the study highlights the importance of security requirements: although they are part of the non-functional requirements, they are fundamental to secure software development.

  11. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2015-11-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the data content available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  12. HPAT: A nondestructive analysis technique for plutonium and uranium solutions

    International Nuclear Information System (INIS)

    Two experimental approaches for the nondestructive characterization of mixed solutions of plutonium and uranium, developed at ENEA - C.R.E. Casaccia with the goal of measuring low plutonium concentrations (<50 g/l) even in the presence of high uranium content, are described in the following. Both methods are referred to as HPAT (Hybrid Passive-Active Technique) since they rely on the measurement of plutonium spontaneous emission in the LX-ray energy region as well as the transmission of KX photons from the fluorescence induced by a radioisotopic source on a suitable target. Experimental campaigns for the characterization of both techniques have been carried out at the EUREX Plant Laboratories (C.R.E. Saluggia) and at the Plutonium Plant Laboratories (C.R.E. Casaccia). Experimental results and theoretical values of the errors are reported. (author)

  13. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent, present naturally in groundwater due to some minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may occur from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water using high-tech instruments such as an atomic absorption spectrometer (hydride generation). Because arsenic concentrations as low as 1 ppb cannot easily be determined with the simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to arsenic concentrations of 1 ppb.

  14. A Review on Clustering and Outlier Analysis Techniques in Datamining

    OpenAIRE

    S. Koteeswaran; P. Visu; J. Janet

    2012-01-01

    Problem statement: The modern world relies on using physical, biological and social systems more effectively through advanced computerized techniques. A great amount of data is being generated by such systems; this leads to a paradigm shift from classical modeling and analyses based on basic principles to developing models and the corresponding analyses directly from data. The ability to extract useful hidden knowledge in these data and to act on that knowledge is becoming increasingly important i...

  15. Analysis of kidney stones by PIXE and RBS techniques

    International Nuclear Information System (INIS)

    Human kidney stones were analyzed by PIXE and RBS techniques using a 2 MeV He++ beam. The stones were found to contain the elements C, N, O, F, Na, Mg, Si, P, S, Cl, K, Ca, Fe and Br. Results obtained by PIXE agree with the results obtained by RBS within experimental errors. A mechanism for the formation of the kidney stones is suggested. 3 figs., 1 tab

  16. Document analysis by means of data mining techniques

    OpenAIRE

    Jabeen, Saima

    2014-01-01

    The huge amount of textual data produced every day by scientists, journalists and Web users allows investigating many different aspects of the information stored in published documents. Data mining and information retrieval techniques are exploited to manage and extract information from huge amounts of unstructured textual data. Text mining, also known as text data mining, is the process of extracting high quality information (focusing on relevance, novelty and interestingness) from text by iden...

  17. Analysis of deployment techniques for webbased applications in SMEs

    OpenAIRE

    Browne, Cathal

    2011-01-01

    The Internet is no longer just a source for accessing information; it has become a valuable medium for social networking and software services. Web browsers can now access entire software systems available online to provide the user with a range of services. The concept of software as a service (SaaS) was born out of this. The number of development techniques and frameworks for such web applications has grown rapidly, and much research and development has been carried out on adva...

  18. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    OpenAIRE

    S.G.S. Fernando; S.N. Perera

    2015-01-01

    Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. The social networks contain millions of unprocessed raw data records. By analyzing this data, new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. With...

  19. Empirical Analysis of Data Mining Techniques for Social Network Websites

    OpenAIRE

    S.G.S. Fernando; Md Gapar Md Johar; S.N. Perera

    2014-01-01

    Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. The social networks contain millions of unprocessed raw data records. By analyzing this data, new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast amount of ap...

  20. Behavior Change Techniques in Popular Alcohol Reduction Apps: Content Analysis

    OpenAIRE

    Crane, David; Garnett, Claire; Brown, Jamie; West, Robert; Michie, Susan

    2015-01-01

    Background: Mobile phone apps have the potential to reduce excessive alcohol consumption cost-effectively. Although hundreds of alcohol-related apps are available, there is little information about the behavior change techniques (BCTs) they contain, the extent to which they are based on evidence or theory, and how this relates to their popularity and user ratings. Objective: Our aim was to assess the proportion of popular alcohol-related apps available in the United Kingdom that focus on alco...

  1. Tape Stripping Technique for Stratum Corneum Protein Analysis

    OpenAIRE

    Maja-Lisa Clausen; H.-C. Slotved; Krogfelt, Karen A.; Tove Agner

    2016-01-01

    The aim of this study was to investigate the amount of protein in the stratum corneum in atopic dermatitis (AD) patients and healthy controls using the tape stripping technique, and to compare two different methods for protein assessment. Tape stripping was performed in AD patients and healthy controls to collect stratum corneum samples, which were subsequently analysed with two different methods: Squame Scan, which gives an estimate of total protein (soluble and insoluble), and the Micro BCA protein de...

  2. Comparative Analysis of Techniques to Purify Plasma Membrane Proteins

    OpenAIRE

    Weekes, Michael P.; Antrobus, Robin; Lill, Jennie R.; Duncan, Lidia M; Hör, Simon; Lehner, Paul J.

    2010-01-01

    The aim of this project was to identify the best method for the enrichment of plasma membrane (PM) proteins for proteomics experiments. Following tryptic digestion and extended liquid chromatography-tandem mass spectrometry acquisitions, data were processed using MaxQuant and Gene Ontology (GO) terms used to determine protein subcellular localization. The following techniques were examined for the total number and percentage purity of PM proteins identified: (a) whole cell lysate (total numbe...

  3. Infrared Spectroscopy of Explosives Residues: Measurement Techniques and Spectral Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Mark C.; Bernacki, Bruce E.

    2015-03-11

    Infrared laser spectroscopy of explosives is a promising technique for standoff and non-contact detection applications. However, the interpretation of spectra obtained in typical standoff measurement configurations presents numerous challenges. Understanding the variability in observed spectra from explosives residues and particles is crucial for design and implementation of detection algorithms with high detection confidence and low false alarm probability. We discuss a series of infrared spectroscopic techniques applied toward measuring and interpreting the reflectance spectra obtained from explosives particles and residues. These techniques utilize the high spectral radiance, broad tuning range, rapid wavelength tuning, high scan reproducibility, and low noise of an external cavity quantum cascade laser (ECQCL) system developed at Pacific Northwest National Laboratory. The ECQCL source permits measurements in configurations which would be either impractical or overly time-consuming with broadband, incoherent infrared sources, and enables a combination of rapid measurement speed and high detection sensitivity. The spectroscopic methods employed include standoff hyperspectral reflectance imaging, quantitative measurements of diffuse reflectance spectra, reflection-absorption infrared spectroscopy, microscopic imaging and spectroscopy, and nano-scale imaging and spectroscopy. Measurements of explosives particles and residues reveal important factors affecting observed reflectance spectra, including measurement geometry, substrate on which the explosives are deposited, and morphological effects such as particle shape, size, orientation, and crystal structure.

  4. An ASIC Low Power Primer: Analysis, Techniques and Specification

    CERN Document Server

    Chadha, Rakesh

    2013-01-01

    This book provides an invaluable primer on the techniques utilized in the design of low power digital semiconductor devices. Readers will benefit from the hands-on approach which starts from the ground up, explaining with basic examples what power is, how it is measured and how it impacts on the design process of application-specific integrated circuits (ASICs). The authors use both the Unified Power Format (UPF) and Common Power Format (CPF) to describe in detail the power intent for an ASIC and then guide readers through a variety of architectural and implementation techniques that will help meet the power intent. From analyzing system power consumption, to techniques that can be employed in a low power design, to a detailed description of two alternate standards for capturing the power directives at various phases of the design, this book is filled with information that will give ASIC designers a competitive edge in low-power design. Starts from the ground up and explains what power is, how it is measur...

  5. Analysis of Acoustic Emission Signals using Wavelet Transformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Full Text Available Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to find the occurrence of any crack growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to the above phenomena. Through various filtering/thresholding techniques, it was found that the original signals were getting filtered out along with the noise. The wavelet transformation technique is found to be more appropriate for analysing the AE signals under such situations. Wavelet transformation is used to de-noise the AE data, and the de-noised signal is classified to identify a signature based on the type of phenomena. Defence Science Journal, 2008, 58(4), pp. 559-564, DOI: http://dx.doi.org/10.14429/dsj.58.1677
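
The de-noising idea can be sketched with a single-level Haar transform and soft thresholding of the detail coefficients; the transform depth, threshold, and test signal below are illustrative assumptions (practical AE work uses deeper decompositions and data-driven thresholds):

```python
import numpy as np

# Minimal wavelet de-noising sketch: one-level Haar decomposition,
# soft-threshold the detail (high-pass) band, then reconstruct.
def haar_denoise(x, thresh):
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass: slow trend
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass: noise + bursts
    detail = np.sign(detail) * np.maximum(np.abs(detail) - thresh, 0.0)
    y = np.empty_like(x)                         # inverse Haar transform
    y[0::2] = (approx + detail) / np.sqrt(2)
    y[1::2] = (approx - detail) / np.sqrt(2)
    return y

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 4.0, 0.0, -3.0], 64)     # synthetic burst-like signal
noisy = clean + 0.5 * rng.standard_normal(clean.size)
denoised = haar_denoise(noisy, thresh=0.7)
```

With a zero threshold the transform reconstructs the input exactly; with a threshold near the noise level, the mean-squared error against the clean signal drops.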

  6. Qualitative analysis of the elliptical centric technique and the TRICKS technique

    Science.gov (United States)

    Dong, Kyung-Rae; Goo, Eun-Hoe; Lee, Jae-Seung; Chung, Woon-Kwan

    2013-02-01

    This study evaluated the usefulness of time resolved imaging of contrast kinetics (TRICKS) magnetic resonance angiography (MRA) and elliptical centric MRA according to the type of cerebral disease. From February 2010 to January 2012, elliptical centric MRA and TRICKS MRA images were acquired from 50 normal individuals and 50 patients with cerebral diseases by using 3.0-Tesla magnetic resonance imaging (MRI) equipment. The images were analyzed qualitatively by examining areas such as the presence or absence of artifacts on the images, the distinctness of boundaries of blood vessels, accurate representation of the lesions, and the subtraction level. In addition, the sensitivity, specificity, positive prediction rate, negative prediction rate and accuracy were assessed by comparing the diagnostic efficacy of the two techniques. The results revealed TRICKS MRA to have superior image quality to elliptical centric MRA. Regarding each disease, TRICKS MRA showed higher diagnostic efficacy for arteriovenous malformation (AVM) and middle cerebral artery (MCA) bypass patients, whereas elliptical centric MRA was more suitable for patients with brain tumors, cerebral infarction, cerebral stenosis or sinus mass.

  7. IAEA progress report I - Use of handheld XRF analyzer to complement PIXE technique in archeology and environment

    International Nuclear Information System (INIS)

    This is the first IAEA progress report for the period 2011-2012 (CRP number G42004). As an established accelerator laboratory performing ion beam analysis (IBA) applications in different fields, we have recently acquired a handheld XRF device for in situ analytical measurements. PIXE is widely used at the accelerator laboratory as part of the analytical tools employed at the Lebanese Atomic Energy Commission (LAEC). (author)

  8. Statistical analysis of heartbeat data with wavelet techniques

    Science.gov (United States)

    Pazsit, Imre

    2004-05-01

    The purpose of this paper is to demonstrate the use of some methods of signal analysis, performed on ECG and in some cases blood pressure signals, for the classification of the health status of the heart of mice and rats. Spectral and wavelet analysis were performed on the raw signals. FFT-based coherence and phase were also calculated between blood pressure and raw ECG signals. Finally, RR-intervals were deduced from the ECG signals and an analysis of the fractal dimensions was performed. A correlation was found between the health status of the mice and rats and some of the statistical descriptors, most notably the phase of the cross-spectra between ECG and blood pressure, and the fractal properties and dimensions of the interbeat series (RR-interval fluctuations).
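
The FFT-based coherence-and-phase descriptor can be sketched on synthetic data; the 1.2 Hz tone, noise level, and quarter-cycle lag below are invented stand-ins for the ECG and blood-pressure channels:

```python
import numpy as np
from scipy import signal

fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
# Two channels sharing a 1.2 Hz component, the second lagging by pi/4.
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
bp = np.sin(2 * np.pi * 1.2 * t - np.pi / 4) + 0.3 * rng.standard_normal(t.size)

f, coh = signal.coherence(ecg, bp, fs=fs, nperseg=1024)  # Welch-averaged coherence
_, pxy = signal.csd(ecg, bp, fs=fs, nperseg=1024)        # cross power spectral density
k = np.argmin(np.abs(f - 1.2))   # frequency bin nearest the shared tone
phase = np.angle(pxy[k])         # ~ -pi/4: scipy's csd conjugates the first input,
                                 # so a lag of bp appears as a negative phase
```

At the shared frequency the coherence is close to 1 and the cross-spectral phase recovers the imposed lag.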

  9. Statistical Performance Analysis and Modeling Techniques for Nanometer VLSI Designs

    CERN Document Server

    Shen, Ruijing; Yu, Hao

    2012-01-01

    Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level has become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits. Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and ...

  10. General Approach To Materials Classification Using Neutron Analysis Techniques

    International Nuclear Information System (INIS)

    The 'neutron in, gamma out' method of elemental analysis has been known and used in many applications as an elemental analysis tool. This method is non-intrusive, non-destructive, fast and precise. This set of advantages makes neutron analysis attractive for an even wider variety of uses beyond simple elemental analysis. The question addressed within this study is under what conditions neutron analysis can be used to differentiate materials of interest from a group or class of materials, in the face of knowing that what is truly of interest is the molecular content of any sample under interrogation. The purpose of the study was to develop a neutron-based scanner for rapid differentiation of classes of materials sealed in small bottles. The developed scanner employs a D-T neutron generator as a neutron source and HPGe gamma detectors. Materials can be placed into classes by many different properties; however, the neutron analysis method can exploit only a few of them, such as elemental content, stoichiometric ratios and density of the scanned material. The set of parameters obtainable through neutron analysis serves as a basis for a hyperspace, where each point corresponds to a certain scanned material. Sub-volumes of the hyperspace correspond to different classes of materials. Among the most important properties of the materials are the stoichiometric ratios of the elements comprising them. Constructing an algorithm for converting the observed gamma-ray counts into quantities of the elements in the scanned sample is a crucial part of the analysis. Gamma rays produced in both fast inelastic scatterings and neutron captures are considered. The presence of certain elements, such as hydrogen and chlorine, can significantly change neutron dynamics within the sample and, in turn, the development of the characteristic gamma lines. These effects have been studied and corresponding algorithms have been developed to account for them
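
The "hyperspace" classification step can be illustrated with a nearest-centroid sketch; the class names and stoichiometric-ratio values below are invented placeholders, not measured gamma-ray data:

```python
import numpy as np

# Each class of materials occupies a sub-volume of a feature space of
# stoichiometric ratios; here each class is reduced to a single centroid
# of hypothetical (H/C, N/C, O/C) ratios, and a scanned sample is
# assigned to the nearest centroid in Euclidean distance.
centroids = {
    "explosive-like": np.array([0.08, 0.45, 0.45]),
    "narcotic-like":  np.array([0.12, 0.08, 0.15]),
    "benign-organic": np.array([0.15, 0.02, 0.30]),
}

def classify(ratios):
    ratios = np.asarray(ratios, dtype=float)
    return min(centroids, key=lambda c: np.linalg.norm(ratios - centroids[c]))

label = classify([0.09, 0.40, 0.42])   # close to the first centroid
```

A real system would replace the centroids with calibrated sub-volumes derived from the gamma-count-to-element-quantity algorithm the abstract describes.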

  11. Improved X-ray diagnosis of stomach by progress in the development of contrast media and examination techniques

    International Nuclear Information System (INIS)

    Three factors have been responsible for the advances during the past few years in X-ray examination of the stomach: improvement of the contrast media used; introduction of the rare-earth foils; and examination techniques imaging all sections of the stomach and of the duodenal bulb under hypotension in double-contrast technique, in complete filling, and imaging the accessible sections by means of proper compression. An interesting technique employs a combination of two different barium sulphate suspensions used at the same time, e.g. Bubbly Barium or some other barium sulphate preparation with a small amount of High-Density Barium, yielding an excellent image of the gastric mucosa (two-contrast-media technique). (orig.)

  12. Spatial Cluster Analysis by the Bin-Packing Problem and DNA Computing Technique

    OpenAIRE

    Xiyu Liu; Jie Xue

    2013-01-01

    Spatial cluster analysis is an important data mining task. Typical techniques include CLARANS, density- and gravity-based clustering, and other algorithms based on traditional von Neumann's computing architecture. The purpose of this paper is to propose a technique for spatial cluster analysis based on sticker systems of DNA computing. We will adopt the Bin-Packing Problem idea and then design algorithms of sticker programming. The proposed technique has a better time complexity. In the case ...

  13. Pion-nucleon partial wave analysis and study of baryon structure. Progress report, June 1, 1979-May 31, 1981

    International Nuclear Information System (INIS)

    This report details progress toward completion of a long-term pion-nucleon partial wave analysis, summarizing results and conclusions to date. The report also discusses progress in using partial wave and resonance parameter results to test dynamical models of the baryon and in better understanding interquark forces within baryons

  14. Analysis of the changes in keratoplasty indications and preferred techniques.

    Directory of Open Access Journals (Sweden)

    Stefan J Lang

    Full Text Available Recently, novel techniques introduced to the field of corneal surgery, e.g. Descemet membrane endothelial keratoplasty (DMEK) and corneal crosslinking, extended the therapeutic options. Additionally, contact lens fitting has developed new alternatives. We herein investigated whether these techniques have affected the volume and spectrum of indications of keratoplasties, in both a center more specialized in treating Fuchs' dystrophy (center 1) and a second center that is more specialized in treating keratoconus (center 2). We retrospectively reviewed the waiting lists for indication, transplantation technique and the patients' travel distances to the hospital at both centers. We reviewed a total of 3778 procedures. Fuchs' dystrophy increased at center 1 from 17% (42) to 44% (150) and from 13% (27) to 23% (62) at center 2. In center 1, DMEK increased from zero percent in 2010 to 51% in 2013. In center 2, DMEK was not performed until 2013. The percentage of patients with keratoconus slightly decreased from 15% (36) in 2009 to 12% (40) in 2013 in center 1. The respective percentages in center 2 were 28% (57) and 19% (51). In both centers, the patients' travel distances increased. The results from center 1 suggest that DMEK might increase the total number of keratoplasties. The increase in travel distance suggests that this cannot be fully attributed to recruiting the less advanced patients from the hospital proximity; the increase is rather due to more referrals from other regions. The decrease of keratoconus patients in both centers is surprising and may be attributed to optimized contact lens fitting or even to the effect of the corneal crosslinking procedure.

  15. Categorical and Nonparametric Data Analysis: Choosing the Best Statistical Technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  16. Dynamic human operator modelling by the ESCS analysis technique

    International Nuclear Information System (INIS)

    The actions of reactor operators are determined by complex interactions of perception and diagnosis mental processes with the time-evolving aspects of physical signals. Therefore an adequate modeling of operator intervention, including recovery possibility, requires dynamic techniques such as ESCS (Event Sequence and Consequence Spectrum). As an application example, the paper presents a case study related to an interfacing-system LOCA in a PWR. Identification of critical timings for operator interventions and recovery, the possibility of probabilistic quantification of incident developments and consequences, and optimization of procedures and man-machine interfaces can be the major results of such analyses

  17. Inexpensive rf modeling and analysis techniques as applied to cyclotrons

    International Nuclear Information System (INIS)

    A review and expansion of the circuit analogy method of modeling and analysing multiconductor TEM mode rf resonators is described. This method was used to predict the performance of the NSCL K500 and K1200 cyclotron resonators and the results compared well to the measured performance. The method is currently being applied as the initial stage of the design process to optimize the performance of the rf resonators for a proposed K250 cyclotron for medical applications. Although this technique requires an experienced rf modeller, the input files tend to be simple and small, the software is very inexpensive or free, and the computer runtimes are nearly instantaneous

  18. Analysis and RHBD technique of single event transients in PLLs

    International Nuclear Information System (INIS)

    Single-event transient susceptibility of phase-locked loops has been investigated. The charge pump is the most sensitive component of the PLL to SET, and it is hard to mitigate this effect at the transistor level. A test circuit was designed on a 65 nm process using a new system-level radiation-hardening-by-design technique. Heavy-ion testing was used to evaluate the radiation hardness. Analyses and discussion of the feasibility of this method are also presented. (paper)

  19. Analysis of non-linearity in differential wavefront sensing technique.

    Science.gov (United States)

    Duan, Hui-Zong; Liang, Yu-Rong; Yeh, Hsien-Chi

    2016-03-01

    An analytical model of the differential wavefront sensing (DWS) technique based on Gaussian beam propagation has been derived. The analytical model has been verified by comparison with the interference signals detected by a quadrant photodiode, calculated using a numerical method. Both the analytical model and the numerical simulation show a milliradian-level non-linearity effect in DWS detection. In addition, beam clipping has a strong influence on the non-linearity of DWS: the larger the beam clipping, the smaller the non-linearity. However, the beam-walking effect hardly influences DWS and thus can be ignored in laser interferometers. PMID:26974079

  20. MARE: 187Re beta spectrum analysis with bolometric techniques

    International Nuclear Information System (INIS)

    A large worldwide collaboration is growing around the project of Microcalorimeter Arrays for a Rhenium Experiment (MARE) for a direct calorimetric measurement of the neutrino mass with a sensitivity of about 0.2 eV. Many groups are joining their experience and technical expertise in a common effort towards this challenging experiment, which will use the most recent and advanced developments of the thermal detection technique. The expected impact of MARE as a complement of the KATRIN experiment will also be discussed

  1. Progressive failure analysis of slope with strain-softening behaviour based on strength reduction method

    Institute of Scientific and Technical Information of China (English)

    Ke ZHANG; Ping CAO; Rui BAO

    2013-01-01

    Based on the strength reduction method and a strain-softening model, a method for progressive failure analysis of strain-softening slopes was presented in this paper. The mutation is more pronounced in strain-softening analysis, and the mutation of displacement at the slope crest was taken as the critical failure criterion. An engineering example was provided to demonstrate the validity of the present method. This method was applied to a cut slope in an industrial site. The results are as follows: (1) The factor of safety and the critical slip surface obtained by the present method are between those by peak and residual strength. The analysis with peak strength would lead to non-conservative results, but that with residual strength tends to be overly conservative. (2) The thickness of the shear zone considering strain-softening behaviour is narrower than that with non-softening analysis. (3) The failure of a slope is the process of the initiation, propagation and connection of the potential failure surface. The strength parameters are mobilized to a non-uniform degree while progressive failure occurs in the slope. (4) The factor of safety increases with the increase of the residual shear strain threshold and elastic modulus. The failure mode of the slope changes from shallow slip to deep slip. Poisson's ratio and dilation angle have little effect on the results.
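
The strength reduction principle itself can be sketched on an analytically solvable infinite-slope model; all parameter values below are illustrative, and the paper's actual analysis of course uses a strain-softening finite element model rather than this closed-form check:

```python
import numpy as np

# Strength reduction sketch: divide cohesion c and tan(phi) by a trial
# factor F until the slope just reaches limit equilibrium; that F is the
# factor of safety. Infinite-slope parameters (illustrative): cohesion,
# friction angle, unit weight, depth, slope angle.
c, phi, gamma, h, beta = 10e3, np.radians(30), 18e3, 5.0, np.radians(35)

def stable(F):
    c_r, tanphi_r = c / F, np.tan(phi) / F      # reduced strength parameters
    fs = c_r / (gamma * h * np.sin(beta) * np.cos(beta)) + tanphi_r / np.tan(beta)
    return fs >= 1.0                             # limit-equilibrium check

lo, hi = 0.1, 10.0
for _ in range(60):                              # bisection on the reduction factor
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if stable(mid) else (lo, mid)
F_safety = 0.5 * (lo + hi)
```

For this linear model the bisection result coincides with the classical infinite-slope factor of safety, which makes the sketch self-checking.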

  2. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.;

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...
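
The PCA step of such an analysis can be sketched via SVD of the standardized data matrix; the synthetic matrix below, four strongly correlated "plant performance" variables, is an invented stand-in for real WWTP evaluation data:

```python
import numpy as np

# PCA sketch: standardize each variable, take the SVD, and read the
# share of variance explained per principal component from the
# singular values. Four noisy copies of one latent factor should
# collapse onto a dominant first component.
rng = np.random.default_rng(4)
latent = rng.standard_normal((50, 1))
X = np.hstack([latent + 0.05 * rng.standard_normal((50, 1)) for _ in range(4)])

Z = (X - X.mean(axis=0)) / X.std(axis=0)    # standardize columns
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)         # variance share per component
```

The scores (columns of U scaled by s) would then feed the cluster or discriminant analysis the abstract mentions.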

  3. Cost-variance analysis by DRGs; a technique for clinical budget analysis.

    Science.gov (United States)

    Voss, G B; Limpens, P G; Brans-Brabant, L J; van Ooij, A

    1997-02-01

    In this article it is shown how a cost accounting system based on DRGs can be valuable in determining changes in clinical practice and explaining alterations in expenditure patterns from one period to another. A cost-variance analysis is performed using data from the orthopedic department from the fiscal years 1993 and 1994. Differences between predicted and observed cost for medical care, such as diagnostic procedures, therapeutic procedures and nursing care are analyzed into different components: changes in patient volume, case-mix differences, changes in resource use and variations in cost per procedure. Using a DRG cost accounting system proved to be a useful technique for clinical budget analysis. Results may stimulate discussions between hospital managers and medical professionals to explain cost variations integrating medical and economic aspects of clinical health care. PMID:10165044
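
The decomposition can be sketched with invented numbers for a single hypothetical DRG: the total cost variance splits exactly into a volume component (priced at budgeted unit cost) and a cost-per-case component (weighted by actual volume):

```python
# Toy cost-variance decomposition in the spirit of the DRG budget
# analysis above; all figures are invented.
budget = {"DRG-209": {"cases": 100, "cost_per_case": 5000.0}}
actual = {"DRG-209": {"cases": 110, "cost_per_case": 5200.0}}

def variance_components(drg):
    b, a = budget[drg], actual[drg]
    volume_var = (a["cases"] - b["cases"]) * b["cost_per_case"]       # more cases
    price_var = (a["cost_per_case"] - b["cost_per_case"]) * a["cases"]  # dearer cases
    total = a["cases"] * a["cost_per_case"] - b["cases"] * b["cost_per_case"]
    return volume_var, price_var, total

vol_v, price_v, total_v = variance_components("DRG-209")
```

A full analysis would add case-mix terms across DRGs, but the two-way split already shows how volume and unit-cost effects are separated.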

  4. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282
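
The population-identification task can be sketched with a tiny k-means on synthetic 2-D "scatter/fluorescence" data; the clustering below stands in for the FlowCAP algorithms, and agreement with the known labels plays the role of comparison against expert manual gating (data and dimensions are invented):

```python
import numpy as np

# Two well-separated synthetic cell populations in a 2-D measurement space.
rng = np.random.default_rng(2)
pop_a = rng.normal([2.0, 2.0], 0.3, size=(200, 2))
pop_b = rng.normal([5.0, 5.0], 0.3, size=(200, 2))
data = np.vstack([pop_a, pop_b])
truth = np.array([0] * 200 + [1] * 200)

centers = data[[0, -1]].copy()      # crude initialization: one point per blob
for _ in range(20):                  # Lloyd iterations
    d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    centers = np.array([data[labels == k].mean(axis=0) for k in range(2)])

# Label permutation is arbitrary, so score the better of the two matchings.
agreement = max((labels == truth).mean(), (labels != truth).mean())
```

Real FCM data is higher-dimensional with overlapping, non-Gaussian populations, which is exactly why the FlowCAP challenges compare far more sophisticated algorithms.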

  5. Modern Theory of Gratings Resonant Scattering: Analysis Techniques and Phenomena

    CERN Document Server

    Sirenko, Yuriy K

    2010-01-01

    Diffraction gratings are one of the most popular objects of analysis in electromagnetic theory. The requirements of applied optics and microwave engineering lead to many new problems and challenges for the theory of diffraction gratings, which force us to search for new methods and tools for their resolution. In Modern Theory of Gratings, the authors present results of the electromagnetic theory of diffraction gratings that will constitute the base of further development of this theory, which meet the challenges provided by modern requirements of fundamental and applied science. This volume covers: spectral theory of gratings (Chapter 1) giving reliable grounds for physical analysis of space-frequency and space-time transformations of the electromagnetic field in open periodic resonators and waveguides; authentic analytic regularization procedures (Chapter 2) that, in contradistinction to the traditional frequency-domain approaches, fit perfectly for the analysis of resonant wave scattering processes; paramet...

  6. The application of decision analysis techniques to remedial action planning

    International Nuclear Information System (INIS)

    As an illustration of a simple but powerful quantitative tool in geohydrology and waste management, alternatives for remedial action at a generic radioactive waste disposal facility have been evaluated using decision analysis. In addition to imposing a structured approach to alternative selection, decision analysis provides a rational basis for assessing the real value of further site investigations which might reduce uncertainties in site data, and for assessing the real value of developing and improving the performance of engineering controls (in decision-analysis terminology, clairvoyance and wizardry, respectively). The quantitative interplay between site characterization, engineering, and institutional factors is examined for remediation of the same facility at two sites differing only in geohydrology. Alternatives considered include leaving in place and capping, in situ grouting, engineered barriers and removal. The following variables appear as uncertainties in the study: release, groundwater velocity, soil Kd, mean barrier life, and a wasteform factor
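
The "real value of further site investigations" is, in decision-analysis terms, the expected value of perfect information (clairvoyance); a toy two-state, two-alternative version, with all probabilities and costs invented, looks like this:

```python
# Value-of-clairvoyance sketch: two uncertain site states (low/high
# groundwater velocity) and two remedial alternatives, with expected
# total costs in arbitrary units. All numbers are illustrative.
p_high = 0.3
cost = {
    "cap-in-place": {"low": 10.0, "high": 60.0},
    "removal":      {"low": 40.0, "high": 45.0},
}

def expected(alt):
    return (1 - p_high) * cost[alt]["low"] + p_high * cost[alt]["high"]

best_now = min(expected(a) for a in cost)   # best decision without information
with_info = (1 - p_high) * min(c["low"] for c in cost.values()) \
          + p_high * min(c["high"] for c in cost.values())  # decide after perfect info
evpi = best_now - with_info                 # value of clairvoyance about the site state
```

If a proposed site investigation costs more than evpi, it cannot pay for itself even if it resolved the uncertainty completely.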

  7. Uranium trace analysis in human blood using fission track technique

    International Nuclear Information System (INIS)

    The fission track technique enables measurement of the uranium content of human blood. By choosing solid state nuclear track detectors of high sensitivity, uranium can be determined using the U(n,f) reaction. Blood samples are taken directly from the finger and irradiated in a nuclear reactor to a thermal neutron fluence of 3.5 × 10¹⁶ n/cm². In normal human blood, the uranium content varied from 2.43 to 3.80 × 10⁻¹⁰ g/ml, with an average of (3.06 ± 0.10) × 10⁻¹⁰ g/ml. In the blood of U-exposed workers, it varied from 3.07 to 5.57 × 10⁻¹⁰ g/ml, with an average of (4.53 ± 0.12) × 10⁻¹⁰ g/ml. In the blood of leukemia patients, it varied from 3.90 to 12.07 × 10⁻¹⁰ g/ml, with a mean of (7.74 ± 0.15) × 10⁻¹⁰ g/ml, which is 2.5 times the mean for normal human blood. These results suggest a possible relation between leukemia and the uranium content of the blood. If such a relationship exists, the nuclear fission track technique could become an important diagnostic tool in medicine
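
A quick arithmetic check on the group means quoted in the abstract (in units of 10⁻¹⁰ g/ml): the leukemia-group mean is about 2.5 times the normal mean, and with the quoted standard errors the difference is many standard errors wide:

```python
import math

# Means and standard errors as reported above, in units of 1e-10 g/ml.
normal_mean, normal_se = 3.06, 0.10
leukemia_mean, leukemia_se = 7.74, 0.15

ratio = leukemia_mean / normal_mean                      # ~2.5x elevation
z = (leukemia_mean - normal_mean) / math.sqrt(normal_se**2 + leukemia_se**2)
```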

  8. Nigerian coal analysis by PIXE and HEBS techniques

    International Nuclear Information System (INIS)

    PIXE and HEBS techniques were employed for the measurement of the concentrations of the major, minor and trace elements in Nigerian coal samples from a major deposit. The samples were irradiated with 2.55 MeV protons from the 3 MeV tandem accelerator (NEC 3 UDH) in Lund. The PIXE results are reported and compared with earlier work on Nigerian coal using FNAA and INAA analytical techniques, while the HEBS results are compared with previous ASTM results. The results corroborate the assertion that Nigerian coals are of weak and noncoking grades with low sulphur (0.82-0.99%) and relatively high hydrogen (4.49-5.16%) contents. The motivation for this work is partly the projected usage of coal as metallurgical feedstock and as fuel, and partly the genuine concern about the concomitant environmental effects of the increased burning of coal. Knowledge of the concentrations of all elements is important for the characterization of coal and the determination and control of its products. Economic parameters such as the ash content and calorific value are associated with the concentrations of coal's constituents. (author). 11 refs, 1 fig., 4 tabs

  9. A Comparison of seismic instrument noise coherence analysis techniques

    Science.gov (United States)

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
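
One of the two-sensor incoherent-noise estimators can be sketched on synthetic data (the signal model and noise levels below are illustrative): with a common ground signal plus independent self-noise in each channel, P11 - |P12|²/P22 recovers the summed incoherent noise of the pair, which for two nominally identical instruments is split evenly; this matches the abstract's caveat that incoherent noise and self-noise are only approximately equivalent:

```python
import numpy as np
from scipy import signal

fs = 100.0
rng = np.random.default_rng(3)
n = int(600 * fs)
# Common "seismic" background seen by both sensors (AR(1)-filtered noise),
# plus independent white self-noise in each channel (sigma = 0.2).
common = signal.lfilter([1.0], [1.0, -0.9], rng.standard_normal(n))
x1 = common + 0.2 * rng.standard_normal(n)
x2 = common + 0.2 * rng.standard_normal(n)

f, p11 = signal.welch(x1, fs=fs, nperseg=4096)
_, p22 = signal.welch(x2, fs=fs, nperseg=4096)
_, p12 = signal.csd(x1, x2, fs=fs, nperseg=4096)

incoherent = p11 - np.abs(p12) ** 2 / p22   # summed incoherent noise of the pair
self_noise = incoherent / 2                 # per sensor, assuming identical units

band = (f > 1) & (f < 40)
est = self_noise[band].mean()   # should approach the true white-noise PSD
```

The true one-sided self-noise PSD here is 0.2² × 2 / fs = 8e-4; the band-averaged estimate lands close to it despite the much stronger common signal.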

  10. Measurement techniques for analysis of fission fragment excited gases

    Science.gov (United States)

    Schneider, R. T.; Carroll, E. E.; Davis, J. F.; Davie, R. N.; Maguire, T. C.; Shipman, R. G.

    1976-01-01

    Spectroscopic analyses of fission fragment excited He, Ar, Xe, N2, Ne, Ar-N2, and Ne-N2 have been conducted. Boltzmann plot analysis of He, Ar and Xe has indicated a nonequilibrium, recombining plasma, and population inversions have been found in these gases. The observed radiating species in helium have been adequately described by a simple kinetic model. A more extensive model for argon, nitrogen and Ar-N2 mixtures was developed which adequately describes the energy flow in the system and compares favorably with experimental measurements. The kinetic processes involved in these systems are discussed.
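    A Boltzmann-plot analysis of the kind mentioned above can be sketched as a linear fit; the line data in the test are synthetic, not the paper's measurements:

```python
import numpy as np

K_EV = 8.617e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(intensity, lam_nm, g, A, E_upper_eV):
    # For a thermalized level population, ln(I * lambda / (g * A)) lies
    # on a straight line of slope -1/(k T) versus upper-level energy E.
    # Fit the line and invert the slope to get the excitation temperature.
    y = np.log(intensity * lam_nm / (g * A))
    slope, _ = np.polyfit(E_upper_eV, y, 1)
    return -1.0 / (K_EV * slope)
```

    In a recombining, nonequilibrium plasma the points deviate systematically from a single straight line, which is how such plots reveal departures from equilibrium.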

  11. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F.; Rovani, Jr., Joseph F.

    2011-01-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing dissolving ability. Applications include, but are by no means limited to, estimation of a coking onset and solution (e.g., oil) fractionating.

  12. A technique for the optical analysis of deformed telescope mirrors

    Science.gov (United States)

    Bolton, John F.

    1986-01-01

    The NASTRAN-ACCOS V programs' interface merges structural and optical analysis capabilities in order to characterize the performance of the NASA Goddard Space Flight Center's Solar Optical Telescope primary mirror, which has a large diameter/thickness ratio. The first step in the optical analysis is to use NASTRAN's FEM to model the primary mirror, simulating any distortions due to gravitation, thermal gradients, and coefficient of thermal expansion nonuniformities. NASTRAN outputs are then converted into an ACCOS V-acceptable form; ACCOS V generates the deformed optical surface on the basis of these inputs, and imaging qualities can be determined.

  13. Instrumental Neutron Activation Analysis Technique using Subsecond Radionuclides

    DEFF Research Database (Denmark)

    Nielsen, H.K.; Schmidt, J.O.

    1987-01-01

    The fast irradiation facility Mach-1 installed at the Danish DR 3 reactor has been used in boron determinations by means of Instrumental Neutron Activation Analysis using 12B with 20-ms half-life. The performance characteristics of the system are presented and boron determinations of NBS standard...

  14. Spectroscopic analysis of soluble coffee using nuclear and atomic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Zamboni, Cibele B.; Maihara, Vera A.; Genezini, Frederico A. [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil); D. Junior, Leonardo; Bressiani, Ana H. [PROMON Engenharia SA, Sao Paulo, SP (Brazil); Camargo, Sonia P. [Universidade Santo Amaro, SP (Brazil)

    1999-11-01

    This paper describes the application of two spectroscopic methods, Gamma-Spectroscopy and Energy Dispersive Spectroscopy, employed for the qualitative identification of some commercially available soluble coffees. The concentration of the elements present in the samples, where possible, was calculated using Neutron Activation Analysis. (author) 5 refs., 3 figs., 2 tabs.

  15. Myocardial perfusion scintigraphy using a new technique - the mesh chamber. Progress report, May 1, 1982-November 1, 1983

    International Nuclear Information System (INIS)

    Work since the last progress report has concentrated in three major areas: development and construction of a multimodule slice type detector system and associated electronics at Massachusetts Institute of Technology (MIT); software development for data acquisition and reconstruction algorithms as implemented on the PDP-11/34 system at Brigham and Women's Hospital (BWH); testing of a system of 10 cm x 10 cm area cameras at BWH. The software development includes a method which we believe represents a unique approach to reconstruction of images. We report on this work in more detail in subsequent sections of this report

  16. A study of atriphos (ATP) action on muscular circulation in progressive muscular dystrophy by the radioactive xenon clearance technique

    International Nuclear Information System (INIS)

    The effect of intramuscularly and intravenously administered atriphos on the muscular circulation was studied with radioactive xenon in 12 children with progressive muscular dystrophy. After combined local intramuscular injection of ATP (atriphos) with the radioactive marker, a 12-fold increase of muscular circulation ensues, lasting about 15 minutes. No vasodilating effect on the muscular flow was observed after intravenous injection of 20-40 mg of atriphos. It is believed that intramuscular administration of atriphos produced dilatation of capillaries and of the venous part of the muscular circulation. (author)

  17. Performance Analysis of Pulse Shaping Technique for OFDM PAPR Reduction

    CERN Document Server

    Kamruzzaman, S M

    2010-01-01

    Orthogonal Frequency Division Multiplexing (OFDM) is an attractive modulation and multiple access technique for channels with a nonflat frequency response, as it removes the need for complex equalizers. It can offer high quality performance in terms of bandwidth efficiency, robustness against multipath fading and cost-effective implementation. However, its main disadvantage is the high peak-to-average power ratio (PAPR) of the output signal. As a result, linear behavior of the system over a large dynamic range is needed and therefore the efficiency of the output amplifier is reduced. In this paper, we investigate the effect of some of these sets of time waveforms on the OFDM system performance in terms of Bit Error Rate (BER). We evaluate the system performance in AWGN channels. The obtained results indicate that the reduction in PAPR of the investigated methods is associated with considerable improvement in BER performance of the system, in multipath channels, as compared to conventional OFDM. These promisi...
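    The PAPR that motivates this work can be computed directly from the oversampled IFFT output; this is a generic OFDM sketch, not the paper's specific pulse-shaping waveforms:

```python
import numpy as np

def papr_db(subcarrier_symbols, oversample=4):
    # Build the oversampled OFDM time signal: zero-pad the spectrum in
    # the middle (between the positive- and negative-frequency halves),
    # IFFT, then take peak power over average power in dB.
    n = len(subcarrier_symbols)
    spec = np.concatenate([subcarrier_symbols[:n // 2],
                           np.zeros((oversample - 1) * n, dtype=complex),
                           subcarrier_symbols[n // 2:]])
    x = np.fft.ifft(spec)
    power = np.abs(x) ** 2
    return 10.0 * np.log10(power.max() / power.mean())
```

    With all N subcarriers carrying the same symbol the time signal is impulse-like and the PAPR reaches 10 log10 N, while a single active subcarrier is a pure tone with 0 dB PAPR; random data falls in between, which is what PAPR-reduction schemes target.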

  18. Atmospheric Visibility Monitoring Using Digital Image Analysis Techniques

    Science.gov (United States)

    Liaw, Jiun-Jian; Lian, Ssu-Bin; Huang, Yung-Fa; Chen, Rung-Ching

    Atmospheric visibility is a standard measure of human visual perception of the environment. It is also directly associated with air quality, pollutant species and climate. Urban atmospheric visibility affects not only human health but also traffic safety and quality of life. Visibility is traditionally defined as the maximum distance at which a selected target can be recognized. To replace the traditional measurement of atmospheric visibility, digital image processing schemes provide good visibility data, established by a numerical index. The performance of these techniques is defined by the correlation between the observed visual range and the obtained index. Since performance is affected by non-uniform illumination, this paper proposes a new procedure to estimate the visibility index with a sharpening method. The experimental results show that the proposed procedure obtains a better correlation coefficient than previous schemes.
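    The idea of an image-based visibility index can be sketched with a simple gradient-contrast measure; this is a generic index plus a Koschmieder-style haze model for testing, not the authors' exact procedure:

```python
import numpy as np

def visibility_index(img):
    # Mean absolute finite-difference gradient of a grayscale image in
    # [0, 1]; haze lowers edge contrast and hence lowers the index.
    gx = np.abs(np.diff(img, axis=1)).mean()
    gy = np.abs(np.diff(img, axis=0)).mean()
    return gx + gy

def add_haze(img, transmission=0.4, airlight=0.8):
    # Koschmieder model: attenuated scene radiance plus airlight.
    return transmission * img + (1.0 - transmission) * airlight
```

    Because the haze model is affine in the scene radiance, the gradient-based index scales exactly with the transmission factor, which is the behavior a visibility estimator exploits.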

  19. ANALYSIS OF WATERMARKING TECHNIQUES FOR MEDICAL IMAGES PRESERVING ROI

    Directory of Open Access Journals (Sweden)

    Sonika C. Rathi

    2012-05-01

    Telemedicine is a well-known application in which enormous amounts of medical data need to be securely transferred over the public network and manipulated effectively. Medical image watermarking is an appropriate method for enhancing the security and authentication of medical data, which are crucial and used for further diagnosis and reference. This paper discusses the available medical image watermarking methods for protecting and authenticating medical data. The paper focuses on algorithms for applying the watermarking technique to the Region of Non-Interest (RONI) of the medical image while preserving the Region of Interest (ROI). Medical images can be transferred securely by embedding watermarks in the RONI, allowing verification of legitimate changes at the receiving end without affecting the ROI.
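    The RONI-embedding idea can be sketched with least-significant-bit substitution; LSB embedding is only the simplest stand-in for the watermarking algorithms the paper surveys, and all names below are illustrative:

```python
import numpy as np

def embed_roni_lsb(img, bits, roi_mask):
    # Write watermark bits into the LSBs of pixels OUTSIDE the region
    # of interest, leaving every ROI pixel byte-identical.
    out = img.copy()
    flat = out.ravel()
    roni_idx = np.flatnonzero(~roi_mask.ravel())[:len(bits)]
    flat[roni_idx] = (flat[roni_idx] & ~np.uint8(1)) | np.asarray(bits, dtype=np.uint8)
    return out

def extract_roni_lsb(img, n_bits, roi_mask):
    # Read the watermark back from the same RONI pixel ordering.
    roni_idx = np.flatnonzero(~roi_mask.ravel())[:n_bits]
    return (img.ravel()[roni_idx] & 1).tolist()
```

    The key property, as in the paper, is that diagnostically relevant ROI pixels are never modified, so the watermark cannot interfere with diagnosis.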

  20. Spectral analysis and filtering techniques in digital spatial data processing

    Science.gov (United States)

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. The filter toolbox provides capabilities to compute the power spectrum of given data and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capabilities to perform high-pass, band-pass, low-pass, and wedge filtering techniques. These filters are applied to the analysis of satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
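    A Gaussian-type notch filter of the kind the point and line filters employ can be sketched as follows; frequencies are in cycles per sample and all parameter values are illustrative:

```python
import numpy as np

def gaussian_notch(data, u0, v0, sigma):
    # Suppress frequency (u0, v0) and its conjugate (-u0, -v0) in a 2-D
    # array by multiplying the spectrum with Gaussian-shaped notches.
    u = np.fft.fftfreq(data.shape[0])[:, None]
    v = np.fft.fftfreq(data.shape[1])[None, :]
    H = np.ones(data.shape)
    for uc, vc in ((u0, v0), (-u0, -v0)):
        H *= 1.0 - np.exp(-((u - uc) ** 2 + (v - vc) ** 2) / (2 * sigma ** 2))
    return np.real(np.fft.ifft2(np.fft.fft2(data) * H))
```

    In practice the notch centers would first be located from a peak in the power spectrum, |FFT|^2, which is the "compute the power spectrum, then design the filter" workflow the abstract describes.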

  1. Analysis of ultrasonic techniques for monitoring milk coagulation during cheesemaking

    International Nuclear Information System (INIS)

    Experimental determination of time of flight and attenuation has been proposed in the literature as an alternative for monitoring the evolution of milk coagulation during cheese manufacturing. However, only laboratory-scale procedures have been described. In this work, the use of ultrasonic time of flight and attenuation to determine cutting time, and the feasibility of applying them at industrial scale, were analyzed. Limitations to implementing these techniques at industrial scale are shown experimentally. The main limitation of the use of time of flight is its strong dependence on temperature. Attenuation monitoring is affected by a thin layer of milk skin covering the transducer, which modifies the signal in a non-repetitive way. The results of this work can be used to develop alternative ultrasonic systems suitable for application in the dairy industry.

  2. An Analysis of Performance Enhancement Techniques for Overset Grid Applications

    Science.gov (United States)

    Djomehri, J. J.; Biswas, R.; Potsdam, M.; Strawn, R. C.; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The overset grid methodology has significantly reduced time-to-solution of high-fidelity computational fluid dynamics (CFD) simulations about complex aerospace configurations. The solution process resolves the geometrical complexity of the problem domain by using separately generated but overlapping structured discretization grids that periodically exchange information through interpolation. However, high performance computations of such large-scale realistic applications must be handled efficiently on state-of-the-art parallel supercomputers. This paper analyzes the effects of various performance enhancement techniques on the parallel efficiency of an overset grid Navier-Stokes CFD application running on an SGI Origin2000 machine. Specifically, the role of asynchronous communication, grid splitting, and grid grouping strategies are presented and discussed. Results indicate that performance depends critically on the level of latency hiding and the quality of load balancing across the processors.

  3. Practice patterns in FNA technique: A survey analysis

    Institute of Scientific and Technical Information of China (English)

    Christopher J. DiMaio; Jonathan M. Buscaglia; Seth A. Gross; Harry R. Aslanian; Adam J. Goodman; Sammy Ho; Michelle K. Kim; Shireen Pais; Felice Schnoll-Sussman; Amrita Sethi; Uzma D. Siddiqui; David H. Robbins; Douglas G. Adler; Satish Nagula

    2014-01-01

    AIM: To ascertain fine needle aspiration (FNA) techniques by endosonographers with varying levels of experience and environments. METHODS: A survey study was performed on United States based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers and how this relates to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents identified themselves as high-volume endoscopic ultrasound (EUS) performers (> 150 EUS/year; 77.1%) and high-volume FNA performers (> 75 FNA/year; 73.3%). If final cytology is non-diagnostic, high-volume EUS physicians were more likely than low-volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low-volume physicians were more likely to refer patients for either surgical or percutaneous biopsy (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment.
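    Group comparisons of this kind reduce to a chi-square test on a 2x2 table. The respondent counts below are back-calculated assumptions (162 high-volume vs 48 low-volume respondents, chosen to be consistent with the reported percentages), not figures taken from the paper:

```python
import numpy as np
from scipy.stats import chi2_contingency

# Repeat-FNA-with-core-needle: yes / no, split by EUS volume
# (counts are illustrative assumptions consistent with 60.5% vs 31.2%).
table = np.array([[98, 64],    # high-volume: 98/162 = 60.5%
                  [15, 33]])   # low-volume:  15/48  = 31.2%
chi2, p, dof, expected = chi2_contingency(table, correction=False)
```

    Without the Yates continuity correction this yields a P value of roughly 0.0004, in line with the value the survey reports for this comparison.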

  4. Correlation analysis of energy indicators for sustainable development using multivariate statistical techniques

    International Nuclear Information System (INIS)

    Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at the local, regional and global levels: combustion of fossil fuels causes air pollution; hydropower often causes environmental damage due to the submergence of large areas of land; and global climate change is associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlation analysis among the indicators. The data used in this research work are an excerpt of IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework: Energy Indicators for Sustainable Development. (author)
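    The correlation analysis among indicators can be sketched with a plain correlation matrix; the indicator names and synthetic data below are illustrative stand-ins, not the IBGE census excerpt:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200  # synthetic "regions"

# Assumed indicator set spanning the dimensions the paper names:
energy_per_capita = rng.gamma(3.0, 1.0, n)                          # economic
co2_per_capita = 0.8 * energy_per_capita + rng.normal(0, 0.5, n)    # environmental
electricity_access = rng.uniform(40.0, 100.0, n)                    # social

X = np.column_stack([energy_per_capita, co2_per_capita, electricity_access])
R = np.corrcoef(X, rowvar=False)   # 3x3 correlation matrix of the indicators
```

    Strongly correlated indicator pairs (here energy use and CO2 emissions, by construction) are exactly what multivariate techniques such as principal component analysis then exploit to reduce the indicator set.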

  5. Biomechanical analysis of the technique of choreographic movements (for example, "grand battman jete")

    Directory of Open Access Journals (Sweden)

    Batieieva N.P.

    2015-04-01

    Purpose: biomechanical analysis of the execution of the choreographic movement "grand battman jete". Material: the study involved students (n = 7) of the department of classical choreography, faculty of choreography. Results: biomechanical analysis of the choreographic movement "grand battman jete" (a classic exercise) yielded the kinematic characteristics (path, velocity, acceleration, force) of the center of mass (CM) and of the performer's body segments (foot, shin, thigh). A biokinematic (phase) model was built. The energy characteristics, mechanical work and kinetic energy of the leg segments, were obtained for the performance of the movement. Conclusions: it was found that the ability of an athlete and coach-choreographer to analyze the biomechanics of movement has a positive effect on the improvement of choreographic training of qualified athletes in gymnastics (sport, artistic), figure skating and dance sports.
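    The kinematic characteristics named above (path, velocity, acceleration) follow from finite differences of the sampled CM trajectory; a minimal sketch, with the sampling interval and segment mass as assumed values:

```python
import numpy as np

def kinematics(path_xy, dt):
    # path_xy: (n, 2) positions of a tracked point in metres, sampled
    # every dt seconds; central differences via np.gradient.
    vel = np.gradient(path_xy, dt, axis=0)
    acc = np.gradient(vel, dt, axis=0)
    speed = np.linalg.norm(vel, axis=1)
    return vel, acc, speed

def kinetic_energy(mass_kg, speed):
    # Translational kinetic energy of the segment, 0.5 * m * v^2.
    return 0.5 * mass_kg * speed ** 2
```

    The same differentiation chain applies to each marker (foot, shin, thigh), and mechanical work follows by integrating force along the path.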

  6. Applications of string mining techniques in text analysis

    OpenAIRE

    Horațiu Mocian

    2012-01-01

    The focus of this project is on the algorithms and data structures used in string mining and their applications in bioinformatics, text mining and information retrieval. More specifically, it studies the use of suffix trees and suffix arrays for biological sequence analysis, and the algorithms used for approximate string matching, both general ones and specialized ones used in bioinformatics, like the BLAST algorithm and the PAM substitution matrix. Also, an attempt is made to apply these structures ...
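    The suffix-array structure discussed above can be sketched naively (O(n^2 log n) construction here; production toolkits use linear-time algorithms):

```python
import bisect

def suffix_array(text):
    # Indices of all suffixes of `text`, sorted lexicographically.
    return sorted(range(len(text)), key=lambda i: text[i:])

def count_occurrences(text, pattern, sa):
    # All occurrences of `pattern` form one contiguous block of suffixes
    # in the sorted order; locate the block by binary search.
    prefixes = [text[i:i + len(pattern)] for i in sa]
    lo = bisect.bisect_left(prefixes, pattern)
    hi = bisect.bisect_right(prefixes, pattern)
    return hi - lo
```

    This contiguous-block property is what makes suffix arrays attractive for exact pattern search over large sequence collections.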

  7. Advanced Techniques in Pulmonary Function Test Analysis Interpretation and Diagnosis

    OpenAIRE

    Gildea, T.J.; Bell, C. William

    1980-01-01

    The Pulmonary Functions Analysis and Diagnostic System is an advanced clinical processing system developed for use at the Pulmonary Division, Department of Medicine at the University of Nebraska Medical Center. The system generates comparative results and diagnostic impressions for a variety of routine and specialized pulmonary functions test data. Routine evaluation deals with static lung volumes, breathing mechanics, diffusing capacity, and blood gases while specialized tests include lung c...

  8. Methods and techniques for bio-system's materials behaviour analysis

    OpenAIRE

    MITU, LEONARD GABRIEL

    2014-01-01

    In the context of the rapid development of research in the domain of biosystem structure materials, a representative direction is the analysis of the behavior of these materials. This direction of research requires the use of various means and methods for theoretical and experimental measurement and evaluation. The PhD thesis "Methods and means for analyzing the behavior of biosystems structure materials" takes precedence in this area of research, aimed at studying the behavior of poly...

  9. A predictive cognitive error analysis technique for emergency tasks

    International Nuclear Information System (INIS)

    This paper introduces an analysis framework and procedure to support the cognitive error analysis of emergency tasks in nuclear power plants. The framework provides a new perspective on the utilization of error factors in error prediction. The framework can be characterized by two features. First, error factors that affect the occurrence of human error are classified into three groups, 'task characteristics factors (TCF)', 'situation factors (SF)', and 'performance assisting factors (PAF)', and are utilized in the error prediction. This classification aims to support error prediction from the viewpoint of assessing the adequacy of PAF under given TCF and SF. Second, the assessment of error factors is made from the perspective of the performance of each cognitive function. Through this, error factor assessment is made in an integrative way, not independently. Furthermore, it enables analysts to identify vulnerable cognitive functions and error factors, and to obtain specific error reduction strategies. Finally, the framework and procedure were applied to the error analysis of the 'bleed and feed operation' of emergency tasks

  10. Technique of sample preparation for analysis of gasoline and lubricating oils by X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    The X-ray fluorescence laboratory of the National Institute of Nuclear Research, lacking a technique for the analysis of oils, has intended with this work to develop a sample preparation technique for the analysis of the metals Pb, Cr, Ni, V and Mo in gasolines and oils by means of X-ray fluorescence spectrometry. The results obtained will be of great utility for the mentioned laboratory. (Author)

  11. Progressive Collapse Analysis of Steel Framed Structures with I-Beams and Truss Beams using Linear Static Procedure

    OpenAIRE

    Fadaei, Sepideh

    2012-01-01

    ABSTRACT: Progressive collapse starts with local damage or the loss of some members of a structure, leading to failure of large parts of the structure. Due to recent disastrous events like the World Trade Center collapse in the USA, taking measures to reduce the potential of progressive collapse (PC) during the analysis and design stages is becoming a necessity for structures. A number of computational analysis programs, such as ETABS, SAP2000 and ABAQUS, can be used to simulate the structure...

  12. Development of reconstitution technique of irradiated specimen. Progress report for FY1993 on cooperated research between JAERI and IHI

    International Nuclear Information System (INIS)

    Regulatory codes require surveillance tests to evaluate the irradiation embrittlement of reactor pressure vessel steel during operation. However, it is anticipated that the number of such specimens will be insufficient if plant life is extended. Reconstitution techniques by electron beam welding, laser welding and arc stud welding, as well as surface-activated joining (SAJ), have been investigated for the reuse of undeformed parts from tested Charpy impact specimens. The important requirements for a reconstitution technique are to reduce the width of the heat-affected zone, so as to maximize the material available, and to lower the maximum temperature of the specimen during the joining process, so as to preclude the recovery of radiation damage. SAJ is achieved by removing surface contamination, rotating one side of the specimen in vacuum while applying a modest friction force. The SAJ method is therefore expected to be suitable for specimen reconstitution in view of material heating and melting. This paper describes a preliminary study to develop a Charpy specimen reconstitution technique for reactor pressure vessel steel, A533B-1, by the SAJ method. Test results showed that the SAJ method kept the joining-affected zone to less than 1.5 mm in half-width, and the region heated above reactor operating temperature during joining to less than 3 mm in half-width. It was also found that the transition temperature could be evaluated from reconstituted Charpy specimens. It can be concluded from these results that the SAJ method is an attractive technique for reconstituting irradiated surveillance specimens. (author)

  13. Radiative neutron capture as a counting technique at pulsed spallation neutron sources: a review of current progress.

    Science.gov (United States)

    Schooneveld, E M; Pietropaolo, A; Andreani, C; Perelli Cippo, E; Rhodes, N J; Senesi, R; Tardocchi, M; Gorini, G

    2016-09-01

    Neutron scattering techniques are attracting an increasing interest from scientists in various research fields, ranging from physics and chemistry to biology and archaeometry. The success of these neutron scattering applications is stimulated by the development of higher performance instrumentation. The development of new techniques and concepts, including radiative capture based neutron detection, is therefore a key issue to be addressed. Radiative capture based neutron detectors utilize the emission of prompt gamma rays after neutron absorption in a suitable isotope and the detection of those gammas by a photon counter. They can be used as simple counters in the thermal region and (simultaneously) as energy selector and counters for neutrons in the eV energy region. Several years of extensive development have made eV neutron spectrometers operating in the so-called resonance detector spectrometer (RDS) configuration outperform their conventional counterparts. In fact, the VESUVIO spectrometer, a flagship instrument at ISIS serving a continuous user programme for eV inelastic neutron spectroscopy measurements, is operating in the RDS configuration since 2007. In this review, we discuss the physical mechanism underlying the RDS configuration and the development of associated instrumentation. A few successful neutron scattering experiments that utilize the radiative capture counting techniques will be presented together with the potential of this technique for thermal neutron diffraction measurements. We also outline possible improvements and future perspectives for radiative capture based neutron detectors in neutron scattering application at pulsed neutron sources. PMID:27502571

  15. Sensitivity analysis and performance estimation of refractivity from clutter techniques

    Science.gov (United States)

    Yardim, Caglar; Gerstoft, Peter; Hodgkiss, William S.

    2009-02-01

    Refractivity from clutter (RFC) refers to techniques that estimate the atmospheric refractivity profile from radar clutter returns. An RFC algorithm works by finding the environment whose simulated clutter pattern matches the radar-measured one. This paper introduces a procedure to compute RFC estimator performance. It addresses the major factors that affect estimator performance, such as the radar parameters, the sea surface characteristics, and the environment (region, time of day, season), and formalizes an error metric combining all of these. This is important for applications such as calculating the optimal radar parameters, selecting the best RFC inversion algorithm under a set of conditions, and creating a regional performance map of an RFC system. The performance metric is used to compute the RFC performance of a non-Bayesian evaporation duct estimator. A Bayesian estimator that incorporates meteorological statistics in the inversion is introduced and compared to the non-Bayesian estimator. The performance metric is used to determine the optimal radar parameters of the evaporation duct estimator for six scenarios. An evaporation duct inversion performance map for an S-band radar is created for the larger Mediterranean/Arabian Sea region.
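    The core RFC loop, simulating clutter for candidate environments and keeping the best match, can be sketched with a deliberately toy forward model; the one-parameter decay law below is only a stand-in for a real parabolic-equation propagation code:

```python
import numpy as np

def toy_clutter_db(duct_height_m, ranges_km):
    # TOY forward model: stronger ducting (larger duct height) gives
    # slower clutter decay with range. Illustrative only, not physics.
    return -0.4 * ranges_km / duct_height_m

def rfc_invert(measured_db, ranges_km, candidate_heights):
    # RFC principle: choose the candidate environment whose simulated
    # clutter pattern best matches the measurement (least-squares misfit).
    misfits = [np.sum((toy_clutter_db(h, ranges_km) - measured_db) ** 2)
               for h in candidate_heights]
    return candidate_heights[int(np.argmin(misfits))]
```

    A Bayesian variant, as introduced in the paper, would weight each candidate's misfit by a climatological prior over duct heights instead of taking a plain minimum.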

  16. Chromatographic finger print analysis of Naringi crenulata by HPTLC technique

    Institute of Scientific and Technical Information of China (English)

    Subramanian Sampathkumar; Ramakrishnan N

    2011-01-01

    Objective: To establish the fingerprint profile of Naringi crenulata (N. crenulata) (Roxb.) Nicols. using the high performance thin layer chromatography (HPTLC) technique. Methods: Preliminary phytochemical screening was done and HPTLC studies were carried out. A CAMAG HPTLC system equipped with a Linomat V applicator, TLC scanner 3, Reprostar 3 and WIN CATS-4 software was used. Results: The preliminary phytochemical studies confirmed the presence of protein, lipid, carbohydrate, reducing sugar, phenol, tannin, flavonoid, saponin, triterpenoid, alkaloid, anthraquinone and quinone. HPTLC fingerprinting of the ethanolic extract of stem revealed 10 spots with Rf values in the range of 0.08 to 0.65; bark showed 8 peaks with Rf values in the range of 0.07 to 0.63; and the ethanol extract of leaf revealed 8 peaks with Rf values in the range of 0.09 to 0.49. The purity of the sample was confirmed by comparing the absorption spectra at the start, middle and end positions of the band. Conclusions: HPTLC fingerprinting of N. crenulata may be useful in differentiating the species from adulterants and can act as a biochemical marker for this medicinally important plant in the pharmaceutical industry and in plant systematic studies.

  17. Skills and Vacancy Analysis with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Izabela A. Wowczko

    2015-11-01

    Full Text Available Through recognizing the importance of a qualified workforce, skills research has become one of the focal points in economics, sociology, and education. Great effort is dedicated to analyzing labor demand and supply, and actions are taken at many levels to match one with the other. In this work we concentrate on skills needs, a dynamic variable dependent on many aspects such as geography, time, or the type of industry. Historically, skills in demand were easy to evaluate since transitions in that area were fairly slow, gradual, and easy to adjust to. In contrast, current changes are occurring rapidly and might take an unexpected turn. Therefore, we introduce a relatively simple yet effective method of monitoring skills needs straight from the source—as expressed by potential employers in their job advertisements. We employ open source tools such as RapidMiner and R as well as easily accessible online vacancy data. We demonstrate selected techniques, namely classification with k-NN and information extraction from a textual dataset, to determine effective ways of discovering knowledge from a given collection of vacancies.

  18. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1992-01-01

    This report covers the last quarter of the last year of the three-year grant period. In the final project year, we concentrated on the pyrolysis and oxidative pyrolysis of large hydrocarbons and mixtures of large and small hydrocarbons in order to develop the VUV-MS technique for compounds more representative of those in coal pyrolysis applications. Special focus was directed at the pyrolysis and oxidative pyrolysis of benzene and benzene-acetylene mixtures. The acetylene/benzene mixtures were used to gain a better understanding of the mechanisms of molecular growth in such systems, specifically to look at the kinetics of aryl-aryl reactions as opposed to small-molecule addition to phenyl radicals. Sarofim and coworkers at MIT have recently demonstrated the importance of these reactions in coal processing environments. In the past, the growth mechanism for the formation of midsized PAH has been postulated to involve primarily successive acetylene additions to phenyl-type radicals; our work confirms this as an important mechanism, especially for smaller PAH, but also investigates conditions where biaryl formation can play an important role in higher hydrocarbon formation.

  19. ANALYSIS OF MALIGNANT NEOPLASTIC USING IMAGE PROCESSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    N.R.Raajan

    2013-04-01

    Full Text Available Breast cancer is the second most frequently diagnosed cancer and a leading cause of cancer death worldwide. Early detection of breast cancer saves lives, and mammograms are an important tool for early detection. Cancer that originates from the breast tissue is called breast cancer. Cancer originating from the inner lining of the milk ducts is called ductal carcinoma (70%), while cancer originating from the lobules, the glands that produce milk, is called lobular carcinoma (15%). Breast cancer occurs in humans and other mammals. Every 74 seconds, somewhere in the world, someone dies from breast cancer, and the majority are women. Approximately 425,000 women around the world died from the disease in 2010; at this rate, 10.6 million women will die from breast cancer during the next 25 years. The disease is 100 times more common in women than in men. Mammography is a specific type of imaging that uses a low-dose X-ray system to examine the breast; a mammography exam is called a mammogram. In our proposed project, image processing techniques are used for accurate and timely detection of breast cancer in high-resolution medical images. Collected images from the database are segmented using the Marker-Controlled Watershed segmentation method. The segmented image is then enhanced and the features are extracted using a Gabor filter. Another methodology, the Circular Hough Transform, is used to obtain a 3-dimensional image of the cancer.
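A minimal sketch of the circular Hough transform step, using a synthetic binary edge image with a single circular boundary of known radius (NumPy only; this is an illustrative toy, not the authors' implementation):

```python
import numpy as np

def hough_circle(edge, radius):
    # vote for candidate circle centres: each edge pixel votes along a
    # circle of the given radius around itself
    acc = np.zeros(edge.shape)
    ys, xs = np.nonzero(edge)
    for th in np.linspace(0, 2 * np.pi, 180, endpoint=False):
        cy = np.round(ys - radius * np.sin(th)).astype(int)
        cx = np.round(xs - radius * np.cos(th)).astype(int)
        ok = (cy >= 0) & (cy < edge.shape[0]) & (cx >= 0) & (cx < edge.shape[1])
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return np.unravel_index(acc.argmax(), acc.shape)

# synthetic "lesion boundary": a circle of radius 20 centred at (50, 60)
edge = np.zeros((100, 120), dtype=bool)
t = np.linspace(0, 2 * np.pi, 360)
edge[np.round(50 + 20 * np.sin(t)).astype(int),
     np.round(60 + 20 * np.cos(t)).astype(int)] = True

cy, cx = hough_circle(edge, 20)   # accumulator peak lands near (50, 60)
```

A full detector would sweep over a range of radii and threshold the accumulator; here the radius is assumed known to keep the sketch short.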

  20. Tritium analysis in hydraulic oil waste by oxidation technique

    Energy Technology Data Exchange (ETDEWEB)

    Dianu, Magdalena, E-mail: mdianu@yahoo.com [Institute for Nuclear Research, Pitesti, 1 Campului Str., Mioveni, Arges 115400 (Romania)

    2012-08-15

    The aim of the present study is to investigate a method to evaluate the tritium activity in hydraulic oil waste generated during the operation of the Romanian Cernavoda Nuclear Power Plant. The method is based on a combustion technique using the 307 PerkinElmer® Sample Oxidizer model. The hydraulic oil samples must be processed prior to counting to avoid color quenching (the largest source of inaccuracy) because these samples absorb in the region of 200-500 nm, where scintillation phosphors emit. Prior to combustion of the hydraulic oil waste, the tritium recovery degree and tritium retention degree in the circuits of the combustion system were evaluated as higher than 98% and less than 0.08%, respectively. After combustion, tritium activity was measured by a 2100 Tri-Carb® Packard model liquid scintillation analyzer. The blank counts were 16.25 ± 0.50 counts/min, measured for 60 min. The significant activity level value was 6.53 counts/min, at a preselected confidence level of 95%. The Minimum Detectable Activity of a 0.2 mL hydraulic oil sample was calculated to be 1.09 Bq/mL. Therefore, the developed method is sensitive enough for tritium evaluation in ordinary hydraulic oil waste samples.

  1. Analysis of a digital technique for frequency transposition of speech

    Science.gov (United States)

    Digirolamo, V.

    1985-09-01

    Frequency transposition is the process of raising or lowering the frequency content (pitch) of an audio signal. The hearing impaired community has the greatest interest in the applications of frequency transposition. Though several analog and digital frequency transposing hearing aid systems have been built and tested, this thesis investigates a possible digital processing alternative. Pole shifting, in the z-domain, of an autoregressive (all-pole) model of speech was shown in principle to be viable for changing frequency content. Since linear predictive coding (LPC) techniques are used to code, analyze and synthesize speech, with the resulting LPC coefficients related to the coefficients of an equivalent autoregressive model, a linear relationship between LPC coefficients and frequency transposition is explored. This theoretical relationship is first established using a pure sine wave and then extended to processing speech. The resulting speech synthesis experiments failed to substantiate the conjectures of this thesis. However, future research avenues are suggested that may lead toward a viable approach to transposing speech.
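The pole-shifting idea can be sketched numerically: build an all-pole model with a resonance at a known frequency, rotate its poles in the z-plane, and observe the resonance move. This NumPy sketch illustrates the principle on a single pole pair; it is not the thesis's LPC pipeline, and the sample rate, pole radius, and shift are invented:

```python
import numpy as np

fs = 8000.0  # assumed sample rate (Hz)

def allpole_peak(a):
    # peak frequency of the all-pole magnitude response 1/|A(e^{jw})|
    w = np.linspace(0, np.pi, 4096)
    z = np.exp(1j * w)
    A = np.polyval(a, z) / z ** (len(a) - 1)   # 1 + a1 z^-1 + a2 z^-2 on the unit circle
    return w[np.argmax(1.0 / np.abs(A))] * fs / (2 * np.pi)

# all-pole model with a resonance at 500 Hz (conjugate pole pair near the unit circle)
r, f0 = 0.98, 500.0
p = r * np.exp(1j * 2 * np.pi * f0 / fs)
poles = np.array([p, np.conj(p)])
a = np.poly(poles).real
# allpole_peak(a) is near 500 Hz

# transpose up by 200 Hz: rotate each pole away from the real axis
shift = 2 * np.pi * 200.0 / fs
shifted = np.abs(poles) * np.exp(
    1j * (np.angle(poles) + np.sign(np.angle(poles)) * shift))
a2 = np.poly(shifted).real
# allpole_peak(a2) is near 700 Hz
```

In an LPC framework the denominator coefficients `a` would come from the analysis stage of each speech frame rather than being constructed directly.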

  2. Data mining techniques for performance analysis of onshore wind farms

    International Nuclear Information System (INIS)

    Highlights: • Indicators are formulated for monitoring quality of wind turbines performances. • State dynamics is processed for formulation of two Malfunctioning Indexes. • Power curve analysis is revisited. • A novel definition of polar efficiency is formulated and its consistency is checked. • Mechanical effects of wakes are analyzed as nacelle stationarity and misalignment. - Abstract: Wind turbines are an energy conversion system having a low density on the territory, and therefore needing accurate condition monitoring in the operative phase. Supervisory Control And Data Acquisition (SCADA) control systems have become ubiquitous in wind energy technology and they pose the challenge of extracting from them simple and explanatory information on goodness of operation and performance. In the present work, post processing methods are applied on the SCADA measurements of two onshore wind farms sited in southern Italy. Innovative and meaningful indicators of goodness of performance are formulated. The philosophy is a climax in the granularity of the analysis: first, Malfunctioning Indexes are proposed, which quantify goodness of merely operational behavior of the machine, irrespective of the quality of output. Subsequently the focus is shifted to the analysis of the farms in the productive phase: dependency of farm efficiency on wind direction is investigated through the polar plot, which is revisited in a novel way in order to make it consistent for onshore wind farms. Finally, the inability of the nacelle to optimally follow meandering wind due to wakes is analysed through a Stationarity Index and a Misalignment Index, which are shown to capture the relation between mechanical behavior of the turbine and degradation of the power output
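The polar-plot idea, binning a farm efficiency metric by wind-direction sector, can be sketched as follows (synthetic SCADA-like data; the sector width, wake direction, and efficiency model are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
wind_dir = rng.uniform(0, 360, 5000)            # wind-direction samples (deg)
# synthetic farm efficiency: wake losses depress output around 75 deg
eff = (0.95 - 0.15 * np.exp(-((wind_dir - 75) / 15) ** 2)
       + rng.normal(0, 0.01, 5000))

# polar efficiency: mean efficiency per 30-degree direction sector
edges = np.arange(0, 361, 30)
sector = np.digitize(wind_dir, edges) - 1
polar = np.array([eff[sector == s].mean() for s in range(12)])
# polar.argmin() identifies the waked sector (60-90 deg in this toy setup)
```

On real data the efficiency would be the ratio of measured farm output to the output expected from the free-stream power curve, and the sector width would be chosen against the directional resolution of the nacelle anemometry.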

  3. Technique for particle flow measuring in activation analysis

    International Nuclear Information System (INIS)

    The invention relates to methods of measuring particle flow in nuclear-physical methods of substance composition monitoring. The purpose of the invention is to simplify the process of particle flux measurement and improve the accuracy of analysis. To this end, a ''clean'' foil is placed behind the monitor when irradiating a ''thin'' monitor located in front of the sample, and the induced radioactivity of the radionuclide produced from the monitor's principal element is measured. The particle flow is assessed from the activity of the radionuclide nuclei transferred from the monitor into the foil by the nuclear transformation energy. The monitor thickness should exceed the maximal path of the radionuclide nuclei in the monitor substance. 1 fig

  4. Analysis of archaeological ceramics using x-ray fluorescence technique

    International Nuclear Information System (INIS)

    The radioisotope X-ray fluorescence method was applied to provenance studies of ceramic fragments from the Mar-Takla site in Syria. 35 samples were analyzed; each sample was irradiated for 1000 s with a 109Cd radioisotope source and the elements (As, Ca, Fe, Ga, Nb, Mn, Pb, Rb, Sr, Ti, Y, Zn, and Zr) were determined. The data were subjected to two multivariate statistical methods, cluster analysis and principal component analysis (PCA). The study shows that 94% of the samples can be considered to have been manufactured from two sources of raw materials. (Author)
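The PCA step of such a provenance study can be sketched with NumPy alone; the elemental concentrations below are synthetic stand-ins for two hypothetical clay sources, not the Mar-Takla measurements:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic elemental concentrations (ppm) for sherds from two clay sources
src_a = rng.normal([900, 150, 80], [30, 10, 5], size=(6, 3))   # Fe, Sr, Rb
src_b = rng.normal([600, 300, 40], [30, 10, 5], size=(6, 3))
X = np.vstack([src_a, src_b])

# PCA by SVD of the mean-centred, column-scaled data matrix
Xc = (X - X.mean(0)) / X.std(0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T        # sample coordinates in principal-component space

# the two fabrics separate cleanly along PC1
pc1 = scores[:, 0]
```

Cluster analysis would then be run on the same standardized matrix; with two well-separated raw-material sources, the PC1 scores alone already split the samples into the two groups.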

  5. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is...

  6. Preliminary Analysis of ULPC Light Curves Using Fourier Decomposition Technique

    CERN Document Server

    Ngeow, Chow-Choong; Kanbur, Shashi; Barrett, Brittany; Lin, Bin

    2013-01-01

    Recent work on Ultra Long Period Cepheids (ULPCs) has suggested their usefulness as a distance indicator, but has not commented on their relationship to other types of variable stars. In this work, we use Fourier analysis to quantify the structure of ULPC light curves and compare them to Classical Cepheids and Mira variables. Our preliminary results suggest that the low-order Fourier parameters of ULPCs show a continuous trend defined by Classical Cepheids after the resonance around 10 days. However, their Fourier parameters also overlap with those of Miras, which makes the classification of long-period variable stars based on light-curve information alone difficult.
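The Fourier decomposition used for such comparisons is a linear least-squares fit of a truncated Fourier series to the phased light curve, from which low-order amplitude ratios and phase differences are formed. A sketch on a noise-free synthetic light curve with known structure (the period and coefficients are invented):

```python
import numpy as np

def fourier_fit(t, mag, period, order=4):
    # least-squares fit of mag = A0 + sum_k [a_k cos(k w t) + b_k sin(k w t)]
    w = 2 * np.pi / period
    cols = [np.ones_like(t)]
    for k in range(1, order + 1):
        cols += [np.cos(k * w * t), np.sin(k * w * t)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), mag, rcond=None)
    amp = np.hypot(coef[1::2], coef[2::2])      # A_k = sqrt(a_k^2 + b_k^2)
    phi = np.arctan2(-coef[2::2], coef[1::2])   # convention: A_k cos(k w t + phi_k)
    return amp, phi

# synthetic Cepheid-like light curve with known Fourier structure
period = 12.0
t = np.linspace(0, 60, 400)
w = 2 * np.pi / period
mag = 10 + 0.30 * np.cos(w * t + 0.2) + 0.12 * np.cos(2 * w * t + 1.0)

amp, phi = fourier_fit(t, mag, period)
R21 = amp[1] / amp[0]                  # low-order amplitude ratio
phi21 = (phi[1] - 2 * phi[0]) % (2 * np.pi)
print(round(R21, 2), round(phi21, 2))  # → 0.4 0.6
```

R21 and phi21 (and their higher-order analogues) are the parameters typically plotted against period when comparing Cepheid and Mira light-curve shapes.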

  7. Digital radiographic techniques in the analysis of paintings

    International Nuclear Information System (INIS)

    In this chapter the authors use the term digital radiography to mean any method of radiographic image production in which the silver halide-based film is replaced by an electronic sensor for production of an image. There are essentially three types of digital radiographic systems available at present, but others will be developed. These differ primarily in the method of image production and the rapidity with which images can be produced. The three methods discussed are digital fluoroscopy, scanned projection radiography, and the scanned point source radiography. Each has certain characteristics which, if properly utilized, will allow improved x-ray analysis of paintings

  8. A Generalized Lanczos-QR Technique for Structural Analysis

    DEFF Research Database (Denmark)

    Vissing, S.

    systems with very special properties. Due to the finite discretization the matrices are sparse, and a relatively large number of problems also have real and symmetric matrices. The matrix equation for an undamped vibration contains two matrices describing tangent stiffness and mass distributions. Alternatively, in a stability analysis, tangent stiffness and geometric stiffness matrices are introduced into an eigenvalue problem used to determine possible bifurcation points. The common basis for these types of problems is that the matrix equation describing the problem contains two real, symmetric and...
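The generalized symmetric eigenproblem K v = λ M v underlying both vibration and stability analysis can be reduced to a standard one via a Cholesky factorization of M. A small NumPy sketch on an invented 2-DOF spring-mass example (a Lanczos iteration would replace the dense solve for large sparse systems):

```python
import numpy as np

def generalized_eig(K, M):
    # reduce K v = lam M v to a standard symmetric problem:
    # with M = L L^T and y = L^T v, solve (L^-1 K L^-T) y = lam y
    L = np.linalg.cholesky(M)
    Linv = np.linalg.inv(L)
    C = Linv @ K @ Linv.T
    lam, Y = np.linalg.eigh(C)
    V = Linv.T @ Y            # back-transform the eigenvectors
    return lam, V

# invented 2-DOF chain: tangent stiffness and lumped mass matrices
K = np.array([[ 2.0, -1.0],
              [-1.0,  1.0]])
M = np.diag([2.0, 1.0])
lam, V = generalized_eig(K, M)
freqs = np.sqrt(lam)          # natural angular frequencies
```

For this K and M the characteristic polynomial is 2λ² − 4λ + 1 = 0, so λ = 1 ± √2/2, which the reduction recovers.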

  9. Sample preparation techniques of biological material for isotope analysis

    International Nuclear Information System (INIS)

    Sample preparation is an essential step in all isotope-aided experiments but often it is not given enough attention. The methods of sample preparation are very important to obtain reliable and precise analytical data and for further interpretation of results. The size of a sample required for chemical analysis is usually very small (10 mg-1500 mg). On the other hand, the amount of harvested plant material from plots in a field experiment is often bulky (several kilograms) and the entire sample is too large for processing. In addition, while approaching maturity many crops show not only differences in physical consistency but also a non-uniformity in 15N content among plant parts, requiring a plant fractionation or separation into parts (vegetative and reproductive), e.g. shoots and spikes in case of small grain cereals, shoots and pods in case of grain legumes, and tops and roots or beets (including crown) in case of sugar beet, etc. In any case the ultimate goal of these procedures is to obtain a representative subsample from material harvested in greenhouse or field experiments for chemical analysis. Before harvesting an isotope-aided experiment the method of sampling has to be selected. It should be based on the type of information required in relation to the objectives of the research and the availability of resources (staff, sample preparation equipment, analytical facilities, chemicals and supplies, etc.). 10 refs, 3 figs, 3 tabs

  10. Evaluation of condensation oscillation loads using spectral analysis techniques

    International Nuclear Information System (INIS)

    An experimental test program was started in the United States in 1978 to define and quantify the Condensation Oscillation (CO) phenomena in General Electric Mark I Suppression Chamber Systems. The program was funded by utilities with Mark I containments, and the results were included in a detailed load definition in the Mark I Containment Program Load Definition Report (LDR). The United States Nuclear Regulatory Commission (USNRC) has reviewed and approved the LDR CO load definition. Spectral analysis methods are employed to determine the significant load distributions and applied loading frequencies acting on the vent system during the CO phenomena. The case of a single input/output system is assumed. Test data obtained from full scale testing is utilized in this evaluation and consists of downcomer pressures and vent header hoop stress at the intersection of the downcomers. Spectral quantities investigated include auto-spectrum, cross-spectrum, ordinary coherence, and transfer functions. Transfer functions from the spectral analysis are evaluated against harmonic response functions from a finite element model of the vent system
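The spectral quantities mentioned above can be estimated with standard Welch-based tools. A sketch on a synthetic single-input/single-output system (a pure gain plus measurement noise standing in for the downcomer-pressure/hoop-stress pair; this is not the test data):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 1024.0
x = rng.normal(size=16384)                     # broadband "input" pressure
y = 0.5 * x + 0.01 * rng.normal(size=x.size)   # output: gain-0.5 path + noise

# H1 transfer-function estimate: H(f) = Pxy(f) / Pxx(f)
f, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)     # cross-spectrum
_, Pxx = signal.welch(x, fs=fs, nperseg=1024)      # auto-spectrum of the input
H = Pxy / Pxx
_, coh = signal.coherence(x, y, fs=fs, nperseg=1024)
# |H| is flat near 0.5 and the coherence is near 1 across the band
```

Low ordinary coherence in a band would flag frequencies where the single-input/output assumption (or the signal-to-noise ratio) breaks down, so the coherence is the natural companion check on the transfer-function estimate.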

  11. Cost analysis of soil depressurization techniques for indoor radon reduction

    International Nuclear Information System (INIS)

    A parametric cost analysis was conducted to determine the importance of various system design and operating variables on the installation and operating costs of active soil depressurization (ASD) systems for indoor radon reduction in houses. The objective was to help guide the research and development (R and D) efforts of the U.S. Environmental Protection Agency (EPA) to reduce ASD costs. Annual lung cancer deaths due to radon cannot be reduced by more than about 14% to 22% unless houses having pre-mitigation levels of 148 Bq/m3 and less receive radon reduction systems. Reductions in ASD costs might increase voluntary use of this technology by homeowners at those levels. The analysis showed that various modifications to ASD system designs offer potential for reducing installation costs by up to several hundred dollars, but would not reduce total installed costs much below $800-$1000. Reductions of this magnitude would probably not be sufficient to dramatically increase voluntary use of ASD technology. Thus, some innovative, inexpensive mitigation approach other than ASD would appear to be necessary. Decreased ASD fan capacity and increased sealing might reduce ASD operating costs (for fan electricity and house heating/cooling) by roughly $7.50 per month. It is unlikely that this amount would be a deciding factor for most homeowners. (au)

  12. HPLC-MS technique for radiopharmaceuticals analysis and quality control

    International Nuclear Information System (INIS)

    Potentialities of liquid chromatography with mass spectrometric detector (MSD) were investigated with the objective of quality control of radiopharmaceuticals; 2-deoxy-2-[18F]fluoro-D-glucose (FDG) being an example. Screening of suitable MSD analytical lines is presented. Mass-spectrometric monitoring of acetonitrile-aqueous ammonium formate eluant by negatively charged FDG.HCO2- ions enables isotope analysis (specific activity) of the radiopharmaceutical at m/z 227 and 226. Kryptofix 222 provides an intense MSD signal of the positive ion associated with NH4+ at m/z 394. Expired FDG injection samples contain decomposition products from which at least one labelled by 18F and characterised by signal of negative ions at m/z 207 does not correspond to FDG fragments but to C5 decomposition products. A glucose chromatographic peak, characterised by m/z 225 negative ion is accompanied by a tail of a component giving a signal of m/z 227, which can belong to [18O]glucose; isobaric sorbitol signals were excluded but FDG-glucose association occurs in the co-elution of separation of model mixtures. The latter can actually lead to a convoluted chromatographic peak, but the absence of 18F makes this inconsistent. Quantification and validation of the FDG component analysis is under way. (author)

  13. Study on data analysis techniques in gravitational wave detectors

    International Nuclear Information System (INIS)

    This work initially investigates the possibility of using an innovative time-frequency transform, known as the S transform, for the data analysis of the gravitational wave detector ALLEGRO. It is verified that its utility for this kind of detector is limited due to the detector's narrow bandwidth. However, it is argued that the S transform may be useful for interferometric detectors. Then a robust data analysis method is presented based on a hypothesis test known as the Neyman-Pearson criterion, which allows the determination of candidate burst events. The method consists in the construction of probability distribution functions for the weighted average energy of the data blocks registered by the detector, both in the case of noise alone and in the case of a signal mixed with noise. Based on these distributions it is possible to determine the probability that the data block in which a candidate event is present does not coincide with a noise block. This way of searching for candidate signals immersed in noise agrees with another method in the literature. One concludes that this is a promising method since it does not demand a more refined search for candidate events, thus reducing computational processing time. (author)
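The block-energy test can be sketched by Monte Carlo: build the energy distributions for noise-only and signal-plus-noise blocks, fix the false-alarm probability, and read off the detection probability. All numbers below (block length, signal amplitude, false-alarm rate) are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 128                                    # samples per data block
noise_blocks = rng.normal(size=(4000, n))
burst = 1.0 * np.sin(2 * np.pi * 8 * np.arange(n) / n)   # toy burst waveform
signal_blocks = rng.normal(size=(4000, n)) + burst

# detection statistic: mean energy of the block
E0 = (noise_blocks ** 2).mean(axis=1)      # noise-only distribution
E1 = (signal_blocks ** 2).mean(axis=1)     # signal-plus-noise distribution

# Neyman-Pearson style threshold: fix the false-alarm probability at 1%
threshold = np.quantile(E0, 0.99)
p_fa = (E0 > threshold).mean()             # ≈ 0.01 by construction
p_det = (E1 > threshold).mean()            # detection probability at that threshold
```

In the actual method the two distributions are constructed from the detector data and its noise model rather than simulated, but the threshold logic is the same: the false-alarm rate is set first and candidate blocks are those whose energy exceeds the resulting threshold.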

  14. The composite sequential clustering technique for analysis of multispectral scanner data

    Science.gov (United States)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
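The generalized K-means stage (part 2) can be sketched with a plain NumPy implementation of Lloyd's algorithm on synthetic two-band "pixels"; the initial centres below stand in for the output of the sequential statistical clustering (part 1), and the spectral classes are invented:

```python
import numpy as np

def kmeans(X, init_centers, iters=20):
    # Lloyd's algorithm: assign each pixel to the nearest centre, then
    # recompute each centre as the mean of its assigned pixels
    centers = init_centers.astype(float).copy()
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        lab = d.argmin(1)
        for k in range(len(centers)):
            if np.any(lab == k):
                centers[k] = X[lab == k].mean(0)
    return lab, centers

rng = np.random.default_rng(3)
# two spectral classes in a 2-band feature space (e.g. water vs vegetation)
water = rng.normal([0.1, 0.2], 0.03, size=(50, 2))
veg = rng.normal([0.4, 0.7], 0.03, size=(50, 2))
X = np.vstack([water, veg])

# rough seeds in place of the part-1 sequential-clustering output
lab, centers = kmeans(X, np.array([[0.0, 0.0], [0.5, 0.8]]))
```

Seeding the iteration with the part-1 clusters is the point of the composite scheme: a good initialization lets the K-means refinement converge in few iterations and avoids poor local minima.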

  15. Structural analysis of irradiated crotoxin by spectroscopic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina C. de; Fucase, Tamara M.; Silva, Ed Carlos S. e; Chagas, Bruno B.; Buchi, Alisson T.; Viala, Vincent L.; Spencer, Patrick J.; Nascimento, Nanci do, E-mail: kcorleto@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Biotecnologia

    2013-07-01

    Snake bites are a serious public health problem, especially in subtropical countries. In Brazil, the serum, the only effective treatment in case of snake bites, is produced in horses which, despite their large size, have a reduced lifespan due to the high toxicity of the antigen. Ionizing radiation has been successfully employed to attenuate the biological activity of animal toxins. Crotoxin, the main toxic compound from Crotalus durissus terrificus (Cdt), is a heterodimeric protein composed of two subunits: crotapotin and phospholipase A2. Previous data indicated that this protein, following the irradiation process, undergoes unfolding and/or aggregation, resulting in a much less toxic antigen. The exact mechanisms and structural modifications involved in the aggregation process are not yet clear. This work investigates the effects of ionizing radiation on crotoxin employing Infrared Spectroscopy, Circular Dichroism and Dynamic Light Scattering techniques. The infrared spectrum of lyophilized crotoxin showed peaks corresponding to the vibrational spectra of the secondary structure of crotoxin, including β-sheet, random coil, α-helix and β-turns. We calculated the area of these spectral regions after adjusting for baseline and normalization using the amide I band (1590-1700 cm-1), obtaining the variation of secondary structures of the toxin following irradiation. The Circular Dichroism spectra of native and irradiated crotoxin suggest a conformational change within the molecule after the irradiation process. These data indicate structural changes between the samples, apparently from an ordered conformation towards a random coil. The analyses by light scattering indicated that the irradiated crotoxin formed multimers with an average molecular radius 100-fold higher than that of the native toxin. (author)

  16. Structural analysis of irradiated crotoxin by spectroscopic techniques

    International Nuclear Information System (INIS)

    Snake bites are a serious public health problem, especially in subtropical countries. In Brazil, the serum, the only effective treatment in case of snake bites, is produced in horses which, despite their large size, have a reduced lifespan due to the high toxicity of the antigen. Ionizing radiation has been successfully employed to attenuate the biological activity of animal toxins. Crotoxin, the main toxic compound from Crotalus durissus terrificus (Cdt), is a heterodimeric protein composed of two subunits: crotapotin and phospholipase A2. Previous data indicated that this protein, following the irradiation process, undergoes unfolding and/or aggregation, resulting in a much less toxic antigen. The exact mechanisms and structural modifications involved in the aggregation process are not yet clear. This work investigates the effects of ionizing radiation on crotoxin employing Infrared Spectroscopy, Circular Dichroism and Dynamic Light Scattering techniques. The infrared spectrum of lyophilized crotoxin showed peaks corresponding to the vibrational spectra of the secondary structure of crotoxin, including β-sheet, random coil, α-helix and β-turns. We calculated the area of these spectral regions after adjusting for baseline and normalization using the amide I band (1590-1700 cm-1), obtaining the variation of secondary structures of the toxin following irradiation. The Circular Dichroism spectra of native and irradiated crotoxin suggest a conformational change within the molecule after the irradiation process. These data indicate structural changes between the samples, apparently from an ordered conformation towards a random coil. The analyses by light scattering indicated that the irradiated crotoxin formed multimers with an average molecular radius 100-fold higher than that of the native toxin. (author)

  17. Multivariate analysis of progressive thermal desorption coupled gas chromatography-mass spectrometry.

    Energy Technology Data Exchange (ETDEWEB)

    Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel; Borek, Theodore Thaddeus, III

    2010-09-01

    Thermal decomposition of polydimethylsiloxane compounds, Sylgard® 184 and 186, was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. Thermal desorption coupled with gas chromatography-mass spectrometry (TD/GC-MS) is a powerful analytical technique for analyzing chemical mixtures. It has great potential in numerous analytic areas including materials analysis, sports medicine (the detection of designer drugs), and biological research for metabolomics. Data analysis is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a step-wise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options. It also has demonstrated potential for success in finding and enabling identification of trace compounds. Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief
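Of the multivariate tools named above, multivariate curve resolution can be sketched as an alternating least squares factorization D ≈ C Sᵀ with non-negativity enforced by clipping. This simplified MCR-ALS runs on synthetic rank-two data (invented elution profiles and spectra), not the Sylgard chromatograms:

```python
import numpy as np

def mcr_als(D, n_comp, iters=500):
    # multivariate curve resolution: D ≈ C @ S.T with non-negative factors,
    # alternating least-squares updates with clipping to enforce C, S >= 0
    rng = np.random.default_rng(0)
    C = rng.random((D.shape[0], n_comp))
    for _ in range(iters):
        S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
        C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
    return C, S

# two synthetic components: elution profiles (rows) x mass spectra (columns)
t = np.linspace(0, 1, 60)
C_true = np.column_stack([np.exp(-((t - 0.3) / 0.08) ** 2),
                          np.exp(-((t - 0.6) / 0.08) ** 2)])
S_true = np.array([[1.0, 0.2, 0.0, 0.5],
                   [0.0, 0.7, 1.0, 0.1]]).T
D = C_true @ S_true.T

C, S = mcr_als(D, 2)
err = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)   # relative residual
```

Production MCR-ALS codes add proper non-negative least squares, normalization, and constraints such as unimodality of the elution profiles; the clipping here is only the simplest stand-in.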

  18. New spectroscopic techniques for wine analysis and characterization

    International Nuclear Information System (INIS)

    The objective of the presented thesis was the development of new, rapid tools for wine analysis based on Fourier transform infrared (FTIR) and Ultraviolet/Visible (UV/Vis) - spectroscopy. The results of this thesis are presented in the form of five publications. In publication I a sensor for assessing the main sensory property of red wine polyphenols (tannins), namely astringency, was developed on basis of the underlying chemical reaction between the tannins and the proline-rich proteins in the saliva. The interaction of polyphenols (tannins) with proline rich proteins (gelatin) has been studied using an automated flow injection system with FTIR detection. In Publication II FTIR-spectroscopy of polyphenolic wine extracts combined with multivariate data analysis was applied for the varietal discrimination of Austrian red wines. By hierarchical clustering it could be shown that the mid-infrared spectra of the dry extracts contain information on the varietal origin of wines. The classification of the wines was successfully performed by soft independent modeling of class analogies (SIMCA). Publication III describes the determination of carbohydrates, alcohols and organic acids in red wine by Ion-exchange high performance liquid chromatography hyphenated with FTIR-detection, where a diamond attenuated total reflectance (ATR)-element was employed for the design of a rugged detector. Partly or completely co-eluting peaks were chemometrically resolved by multivariate curve resolution - alternating least squares (MCR-ALS). Publication IV reports the first application of a mid-infrared quantum cascade laser (QCL) for molecular specific laser detection in liquid chromatography. Using a laser wavelength of 9.3721 μm glucose and fructose could be specifically detected and quantified in red wine in spite of the presence of organic acids. Publication V presents the development of an automated method for measuring the primary amino acid concentration in wines and musts by

  19. Air Pollution in Shanghai Studied by Nuclear Analysis Techniques

    International Nuclear Information System (INIS)

    In this paper PIXE, μ-PIXE, XAFS, Moessbauer effect and radioisotope labelling methods are briefly introduced. These methods were used to study the pollution of atmospheric particulate matter (PM) in Shanghai. The speciation of Cr, Mn, Cu, and Zn in PM10 and PM2.5, and the characteristics distinguishing vehicle-exhaust particles from other emission sources, were studied. Source apportionment of atmospheric lead was calculated with a combined method of lead isotope ratios and lead mass balance, along with μ-PIXE analysis of single particles and pattern recognition of the spectra. Fabricated ultrafine particles simulating aerosol particles were used to study translocation from the alveolus into the circulation across the air-blood barrier

  20. Analysis of myocardial infarction signals using optical technique.

    Science.gov (United States)

    Mahri, Nurhafizah; Gan, Kok Beng; Mohd Ali, Mohd Alauddin; Jaafar, Mohd Hasni; Meswari, Rusna

    2016-01-01

    The risk of heart attack or myocardial infarction (MI) may lead to serious consequences in mortality and morbidity. Current MI management in the triage includes non-invasive heart monitoring using an electrocardiogram (ECG) and the cardiac biomarker test. This study is designed to explore the potential of photoplethysmography (PPG) as a simple non-invasive device and an alternative method to screen MI subjects. This study emphasises the usage of second derivative photoplethysmography (SDPPG) intervals as the extracted features to classify the MI subjects. The statistical analysis shows the potential of the "a-c" interval and the corrected "a-cC" interval to classify the subjects. The sensitivity of the predicted model using "a-c" and "a-cC" is 90.6% and 81.2%, and the specificity is 87.5% and 84.4%, respectively. PMID:27010162
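The reported sensitivity and specificity reduce to simple confusion-matrix ratios. The counts below are a hypothetical reconstruction (assuming 32 subjects per group, which the abstract does not state) chosen to reproduce the "a-c" figures:

```python
def sens_spec(tp, fn, tn, fp):
    # sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts consistent with the reported "a-c" interval model
sens, spec = sens_spec(tp=29, fn=3, tn=28, fp=4)
print(round(sens * 100, 1), round(spec * 100, 1))  # → 90.6 87.5
```

Sensitivity measures how many true MI subjects the interval feature flags, while specificity measures how many controls it correctly clears; for a screening tool the first is usually the figure prioritized.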