WorldWideScience

Sample records for analysis techniques progress

  1. Progress of neutron induced prompt gamma analysis technique in 1988~2003

    Institute of Scientific and Technical Information of China (English)

    JING Shi-Wei; LIU Yu-Ren; CHI Yan-Tao; TIAN Yu-Bing; CAO Xi-Zheng; ZHAO Xin-Hui; REN Wan-Bin; LIU Lin-Mao

    2004-01-01

    This paper describes new developments in neutron induced prompt gamma-ray analysis (NIPGA) technology from 1988 to 2003. The pulsed fast-thermal neutron activation analysis method, which exploits the inelastic reaction and the capture reaction jointly, was employed to measure elemental contents more efficiently. The lifetime of the neutron generator exceeded 10000 h, and the performance of the detector and MCA reached a high level. At the same time, the Monte Carlo library least-squares method was used to solve the nonlinearity problem in NIPGA.
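
    A minimal sketch of the library least-squares step at the heart of such methods (an illustrative reconstruction, not code from the paper; the function name and its inputs are hypothetical): the measured spectrum is fitted as a non-negative linear combination of Monte-Carlo-generated elemental library spectra, and iterating the fit with libraries regenerated for the updated composition is what addresses the nonlinearity.

      import numpy as np

      def fit_composition(measured, libraries):
          """measured: (n_channels,) counts; libraries: dict element -> (n_channels,) spectrum."""
          names = sorted(libraries)
          A = np.column_stack([libraries[e] for e in names])  # library design matrix
          w, *_ = np.linalg.lstsq(A, measured, rcond=None)    # least-squares weights
          w = np.clip(w, 0.0, None)                           # enforce physical non-negativity
          return dict(zip(names, w / w.sum()))                # relative elemental contents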

  2. Progress in automation, robotics and measuring techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2015-01-01

    This book presents recent progress in control, automation, robotics, and measuring techniques. It includes contributions from top experts in these fields, focused on both theory and industrial practice. Each chapter presents a deep analysis of a specific technical problem, in general followed by a numerical analysis, simulation, and results of an implementation for the solution of a real-world problem. The theoretical results, practical solutions and guidelines presented will be useful both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  3. Progress in application of CFD techniques

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Computational Fluid Dynamics (CFD) is an important branch of fluid mechanics, and will continue to play a great role in the design of aerospace vehicles, the exploration of new concept vehicles, and new aerodynamic technology. This paper presents the progress of CFD at CARDC in recent years from the point of view of engineering application, including software integration, grid techniques, convergence acceleration and unsteady flow computation, and also gives some engineering application examples of CFD at CARDC.

  4. [Idiopathic Progressive Subglottic Stenosis: Surgical Techniques].

    Science.gov (United States)

    Hoetzenecker, K; Schweiger, T; Klepetko, W

    2016-09-01

    Idiopathic subglottic stenosis is a disease characterized by slow, progressive scarring and constriction of the subglottic airway. It almost always occurs in females between the 3rd and 5th decade of life. Symptoms are frequently misinterpreted as asthma and patients are referred for endoscopic evaluation only when asthma medications fail to alleviate their symptoms. Treatment options can be divided into endoscopic and open surgical techniques. Microlaryngoscopic scar reduction by laser followed by balloon dilation usually delivers good short-term results. However, the majority of patients will experience restenosis within a short period of time. Open surgical correction techniques are based on a complete removal of the affected airway segment. This must be combined with various extended resection techniques in patients with advanced stenosis. Depending on the extent and severity of the stenosis the following surgical techniques are required: standard cricotracheal resection (Grillo's technique), cricoplasty with dorsal and lateral mucosaplasty, or a combination of resection and enlargement techniques using rib cartilage grafts. In experienced centres, success rates of over 95 % are reported with good functional outcome of voice and deglutition.

  5. Progress in application of CFD techniques

    Institute of Scientific and Technical Information of China (English)

    CHEN ZuoBin; JIANG Xiong; ZHOU Zhu; XIAO HanShan; HUANG Yong; MOU Bin; XIAO ZhongYun; LIU Gang; WANG YunTao

    2008-01-01

    Computational Fluid Dynamics (CFD) is an important branch of fluid mechanics, and will continue to play a great role in the design of aerospace vehicles, the exploration of new concept vehicles, and new aerodynamic technology. This paper presents the progress of CFD at CARDC in recent years from the point of view of engineering application, including software integration, grid techniques, convergence acceleration and unsteady flow computation, and also gives some engineering application examples of CFD at CARDC.

  6. [Progress in transgenic fish techniques and application].

    Science.gov (United States)

    Ye, Xing; Tian, Yuan-Yuan; Gao, Feng-Ying

    2011-05-01

    Transgenic techniques provide a new way for fish breeding. Stable lines of growth hormone (GH) gene transgenic carp, salmon and tilapia, as well as fluorescent protein gene transgenic zebrafish and white cloud mountain minnow, have been produced. The fast-growth characteristic of GH transgenic fish will be of great importance for promoting aquaculture production and economic efficiency. This paper summarizes the progress in transgenic fish research and ecological assessments. Microinjection is still the most commonly used method, but it often results in multi-site and multi-copy integration. Co-injection of transposons or meganucleases can greatly improve the efficiency of gene transfer and integration. "All-fish" genes or "auto genes" should be considered for producing transgenic fish, in order to eliminate misgivings about food safety and to benefit expression of the transferred gene. Environmental risk is the biggest obstacle to the commercial application of transgenic fish. Data indicate that transgenic fish have inferior fitness compared with traditional domestic fish. However, because of genotype-by-environment effects, it is difficult to extrapolate from the ecological consequences of transgenic fish determined in the laboratory to the complex ecological interactions that occur in nature. It is critical to establish highly naturalized environments for acquiring reliable data that can be used to evaluate environmental risk. Efficacious physical and biological containment strategies remain crucial approaches to ensure the safe application of transgenic fish technology.

  7. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis, namely the Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  8. Progress involving new techniques for liposome preparation

    Directory of Open Access Journals (Sweden)

    Zhenjun Huang

    2014-08-01

    The article presents a review of new techniques being used for the preparation of liposomes. A total of 28 publications were examined. In addition to the theories, characteristics and problems associated with traditional methods, the advantages and drawbacks of the latest techniques were reviewed. In the light of developments in many relevant areas, a variety of new techniques are being used for liposome preparation, and each of these new techniques has particular advantages over conventional preparation methods. However, there are still some problems associated with these new techniques that could hinder their application, and further improvements are needed. Generally speaking, due to the introduction of these latest techniques, liposome preparation is now an improved procedure. These applications promote not only advances in liposome research but also methods for production on an industrial scale.

  9. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of studying the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  10. The Progress on Laser Surface Modification Techniques of Titanium Alloy

    Institute of Scientific and Technical Information of China (English)

    LIANG Cheng; PAN Lin; Al Ding-fei; TAO Xi-qi; XIA Chun-huai; SONG Yan

    2004-01-01

    Titanium alloy is widely used in aviation, national defence, automobile, medicine and other fields because of its advantages of low density, corrosion resistance, and fatigue resistance. However, as titanium alloy has a high friction coefficient, weak wear resistance, poor high-temperature oxidation resistance and low biocompatibility, its applications are restricted. Laser surface modification techniques can significantly improve the surface properties of titanium alloy. A review of progress on laser surface modification techniques for titanium alloy is given in this paper.

  11. Research Progress on Technique of Frozen Embryo Transfer in Sheep

    Institute of Scientific and Technical Information of China (English)

    SHE Qiu-sheng; HU Jian-ye; LOU Peng-yan; TAO Jing; XIE Zhao-hui

    2011-01-01

    The paper introduces the research progress on the technique of frozen embryo transfer in sheep, covering the selection of donors and recipients, superovulation, synchronization of estrus, embryo cryopreservation and embryo transplantation. Frozen embryo transfer in sheep is another breakthrough in high-quality sheep raising; in China this technique is still in its infancy, but it will be comprehensively popularized in the future.

  12. DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Food scientists use standards and calibrations to relate the concentration of a compound of interest to the instrumental response. The techniques used include classical, single point, and inverse calibrations, as well as standard addition and internal standards. Several fundamental criteria -- sel...
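
    As a hedged illustration of the calibration arithmetic this summary lists (names and data are invented, not from the record), the classical calibration line, its inverse use for prediction, and the standard-addition x-intercept estimate reduce to a few lines:

      import numpy as np

      def classical_calibration(conc, response):
          """Fit response = a*conc + b from measured standards; return (a, b)."""
          a, b = np.polyfit(conc, response, 1)
          return a, b

      def inverse_predict(a, b, y_unknown):
          """Invert the calibration line to get concentration from a response."""
          return (y_unknown - b) / a

      def standard_addition(added, response):
          """Analyte concentration from a standard-addition series (x-intercept)."""
          a, b = np.polyfit(added, response, 1)
          return b / a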

  13. Research Progress of Immunological Techniques in the Analysis of Fingerprint Contents

    Institute of Scientific and Technical Information of China (English)

    ZHANG Ting; YANG Rui-qin

    2015-01-01

    Pores abound in the ridges of finger skin, through which sweat is excreted and deposited on the surface of the skin. Sweat sometimes contains special substances, including drugs and their metabolites; such substances in fingerprints can indicate the possibility of an individual having ingested drugs. In recent years, the analysis of compounds excreted and deposited in fingerprints has drawn increasing attention, because these compounds may provide significant information about an individual and his/her past behavior. The determination of fingerprint residues is helpful for criminal investigation and evidence identification. Many techniques, such as FT-IR spectroscopy, infrared spectral imaging, Raman spectroscopy, Raman spectral imaging, mass spectrometry, gas chromatography-mass spectrometry, high performance liquid chromatography, liquid chromatography-mass spectrometry, and immunological approaches, have been widely used. Among these, the immunological approach can not only visualize latent fingerprints but also deliver more accurate and sensitive biochemical information from them. In this article, the recent progress and application of immunological methods for developing fingerprints are presented, focusing on the determination of amino acids and ingested drugs in fingerprints as well as on aged latent fingerprints.

  14. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  15. Digital Fourier analysis advanced techniques

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to advanced digital Fourier analysis for advanced undergraduate and graduate students. Assuming knowledge of the Fast Fourier Transform, this book covers advanced topics including the Hilbert transform, cepstrum analysis, and the two-dimensional Fourier transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Advanced Techniques" includes practice problems and thorough Appendices. As a central feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. The applet source code in Visual Basic is provided online, enabling advanced students to tweak and change the programs for more sophisticated results. A complete, intuitive guide, "Digital Fourier Analysis - Advanced Techniques" is an essential reference for students in science and engineering.

  16. [Research progress on techniques for artificial propagation of corals].

    Science.gov (United States)

    Wang, Shu-hong; Hong, Wen-ting; Chen, Ji-xin; Chen, Yun; Wang, Yi-lei; Zhang, Zi-ping; Weng, Zhao-hong; Xie, Yang-jie

    2015-09-01

    Natural coral reef resources are degrading rapidly because of climate change, environmental pollution and the exploitation of aquarium species. Artificial propagation is an effective way to reduce wild harvesting, facilitate reef restoration and preserve biodiversity. This paper reviews the techniques and research progress in coral artificial propagation. We compare the advantages and disadvantages of sexual and asexual reproduction, as well as of in situ and ex situ propagation. Moreover, we summarize the important roles of irradiation, flow rate, nutrients, feed and other factors in coral propagation within recirculating aquaculture systems (RAS). Irradiation is the key to successful ex situ coral culture, and different species have different requirements for radiation intensity and light spectrum. Therefore, artificial lighting in RAS, as well as power and maintenance costs, is very important for ex situ coral aquaculture. In addition, corals are very sensitive to NH4+, NO3-, NO2- as well as phosphate in RAS, and many physical, chemical and biological methods are required to maintain low-nutrient conditions. Although RAS have progressed a lot in terms of irradiation, flow rate and nutrient control, future studies should also focus on sexual reproduction, genetic modification and disease control. PMID:26785577

  17. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that govern how features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  18. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
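
    A minimal sketch of the kind of predictive model alluded to, assuming a simple power-law dependence of runtime on configuration variables (all names and numbers are illustrative, not from the LANL report):

      import numpy as np

      # measured runtimes (s) for a few configurations: [MPI ranks, OpenMP threads]
      X = np.array([[16, 1], [32, 1], [32, 4], [64, 4]], dtype=float)
      t = np.array([120.0, 66.0, 40.0, 23.0])

      # power-law model: log t = c0 + c1*log(ranks) + c2*log(threads)
      A = np.column_stack([np.ones(len(X)), np.log(X[:, 0]), np.log(X[:, 1])])
      c, *_ = np.linalg.lstsq(A, np.log(t), rcond=None)

      def predict_runtime(ranks, threads):
          return float(np.exp(c @ [1.0, np.log(ranks), np.log(threads)]))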

  19. Glaucoma Monitoring in a Clinical Setting Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    2009-01-01

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods: Pati

  20. Key Techniques and Application Progress of Molecular Pharmacognosy

    Institute of Scientific and Technical Information of China (English)

    XIAO Xue-feng; HU Jing; XU Hai-yu; GAO Wen-yuan; ZHANG Tie-jun; LIU Chang-xiao

    2011-01-01

    At the boundary between pharmacognosy and molecular biology, molecular pharmacognosy has developed as a new borderline discipline. This paper reviews the methods, applications, and prospects of molecular pharmacognosy. DNA markers are genetic markers, and several molecular marker methods have been successfully used for genetic diversity identification and the development of new medicinal resources. Recombinant DNA technology provides a powerful tool that enables scientists to engineer DNA sequences. Gene chip techniques can be used in the determination of gene expression profiles, analysis of polymorphisms, construction of genomic libraries, mapping analysis, and sequencing by hybridization. Using the methods and theory of molecular biology and pharmacognosy, molecular pharmacognosy represents an extremely promising branch of pharmacognosy and focuses on the study of the systemic growth of medicinal plants, the identification and evaluation of germplasm resources, plant metabolomics, and the production of active compounds. Furthermore, great breakthroughs in molecular pharmacognosy can be anticipated in DNA fingerprint analysis, cultivar improvement, DNA identification, and a global DNA barcoding system in the future.

  1. Hollow Rotor Progressing Cavity Pump Technique for Oil Production

    Institute of Scientific and Technical Information of China (English)

    Cao Gang

    2002-01-01

    Features of the Hollow Rotor Progressing Cavity Pump (HRPCP): (1) It keeps the path open for PCP well flushing, so producing wells can be cleaned quickly without being shut off; heat loss is low while efficiency is high.

  2. Progress in realistic LOCA analysis

    International Nuclear Information System (INIS)

    In 1988 the USNRC revised the ECCS rule contained in Appendix K and Section 50.46 of 10 CFR Part 50, which governs the analysis of the Loss Of Coolant Accident (LOCA). The revised regulation allows the use of realistic computer models to calculate the loss of coolant accident. In addition, the new regulation allows the use of high probability estimates of peak cladding temperature (PCT), rather than upper bound estimates. Prior to this modification, the regulations were a prescriptive set of rules which defined what assumptions must be made about the plant initial conditions and how various physical processes should be modeled. The resulting analyses were highly conservative in their prediction of the performance of the ECCS, and placed tight constraints on core power distributions, ECCS set points and functional requirements, and surveillance and testing. These restrictions, if relaxed, will allow for additional economy, flexibility, and in some cases, improved reliability and safety as well. For example, additional economy and operating flexibility can be achieved by implementing several available core and fuel rod designs to increase fuel discharge burnup and reduce neutron flux on the reactor vessel. The benefits of application of best estimate methods to LOCA analyses have typically been associated with reductions in fuel costs, resulting from optimized fuel designs, or increased revenue from power upratings. Fuel cost savings are relatively easy to quantify, and have been estimated at several millions of dollars per cycle for an individual plant. Best estimate methods are also likely to contribute significantly to reductions in O and M costs, although these reductions are more difficult to quantify. Examples of O and M cost reductions are: 1) Delaying equipment replacement. With best estimate methods, LOCA is no longer a factor in limiting power levels for plants with high tube plugging levels or degraded safety injection systems. If other requirements for

  3. Algorithms Design Techniques and Analysis

    CERN Document Server

    Alsuwaiyel, M H

    1999-01-01

    Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on one's own using ad hoc techniques, or follow techniques that have produced efficient solutions to similar problems. The latter requires an understanding of the various algorithm design techniques: how and when to use them to formulate solutions, and the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm desi

  4. Progressive Failure Analysis of Composite Stiffened Panels

    Science.gov (United States)

    Bednarcyk, Brett A.; Yarrington, Phillip W.; Collier, Craig S.; Arnold, Steven M.

    2006-01-01

    A new progressive failure analysis capability for stiffened composite panels has been developed based on the combination of the HyperSizer stiffened panel design/analysis/optimization software with the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC). MAC/GMC discretizes a composite material's microstructure into a number of subvolumes and solves for the stress and strain state in each while also providing the homogenized composite properties. As a result, local failure criteria may be employed to predict local subvolume failure and the effects of these local failures on the overall composite response. When combined with HyperSizer, MAC/GMC is employed to represent the ply-level composite material response within the laminates that constitute a stiffened panel. The effects of local subvolume failures can then be tracked as loading on the stiffened panel progresses. Sample progressive failure results are presented at both the composite laminate and the composite stiffened panel levels. Deformation and failure model predictions are compared with experimental data from the World Wide Failure Exercise for AS4/3501-6 graphite/epoxy laminates.
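
    The subvolume failure-and-redistribution idea can be caricatured in a few lines; the following is a schematic, iso-strain stand-in for the MAC/GMC procedure with invented material numbers and a max-stress criterion, not the actual code:

      import numpy as np

      n = 10
      E = np.full(n, 70e9)                      # subvolume stiffnesses (Pa)
      strength = np.linspace(400e6, 600e6, n)   # subvolume strengths (Pa)
      alive = np.ones(n, dtype=bool)

      for strain in np.linspace(0.0, 0.02, 400):     # applied strain history
          stress = np.where(alive, E * strain, 0.0)  # local stresses (failed carry none)
          alive &= stress <= strength                # local max-stress failure criterion
          sigma_bar = np.where(alive, E * strain, 0.0).mean()  # homogenized stress
          if not alive.any():
              print(f"all subvolumes failed at strain {strain:.4f}")
              break

    In the real analyses each subvolume carries a full three-dimensional stress state, and the homogenized response feeds the panel-level HyperSizer model.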

  5. The Analysis of Thematic Progression Patterns of English Advertisement

    Institute of Scientific and Technical Information of China (English)

    徐倩; 郭鸿雁

    2014-01-01

    Thematic Progression patterns are the principal basis for English advertisement analysis, and the topic has attracted many experts in the field. Thematic Progression plays a very important role in the creation, development and establishment of English advertisements. This paper introduces four main types of Thematic Progression patterns and analyses English advertisements from the Thematic Progression perspective.

  6. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    In this work, nuclear techniques such as neutron activation analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series methods and Rutherford backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses about the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  7. Investigation progress of imaging techniques monitoring stem cell therapy

    International Nuclear Information System (INIS)

    Recently, stem cell therapy has shown potential clinical application in diabetes mellitus, cardiovascular diseases, malignant tumors and trauma. Efficient techniques for non-invasively monitoring stem cell transplants will accelerate the development of stem cell therapies. This paper briefly reviews the clinical practice of stem cell therapy and the monitoring methods, including magnetic resonance and radionuclide imaging, which have been used in stem cell therapy. (authors)

  8. Recent Progress in Electrical Insulation Techniques for HTS Power Apparatus

    Science.gov (United States)

    Hayakawa, Naoki; Kojima, Hiroki; Hanai, Masahiro; Okubo, Hitoshi

    This paper describes the electrical insulation techniques at cryogenic temperatures, i.e. Cryodielectrics, for HTS power apparatus, e.g. HTS power transmission cables, transformers, fault current limiters and SMES. Breakdown and partial discharge characteristics are discussed for different electrical insulation configurations of LN2, sub-cooled LN2, solid, vacuum and their composite insulation systems. Dynamic and static insulation performances with and without taking account of quench in HTS materials are also introduced.

  9. The latest progress of fission track analysis

    International Nuclear Information System (INIS)

    Fission track analysis, a new nuclear track technique, is based on fission track annealing in minerals and has been used successfully for oil and gas exploration. Western China is the main area of oil and gas exploration, and the oil and gas basins there experienced a much more complicated thermal history and higher paleotemperatures. In order to apply fission track analysis to these basins, the following work was carried out: 1. The decomposition of grain age distributions of zircon fission tracks. 2. A study on the thermal history of the Ordos basin using zircon fission track analysis. 3. A fission track study on the Qiangtang basin in Tibet.

  10. Analysis of dynamic conflicts by techniques of artificial intelligence

    OpenAIRE

    Shinar, Josef

    1989-01-01

    Dynamic conflicts exhibit differential game characteristics, and their analysis by any method which disregards this feature may be, by definition, futile. Unfortunately, realistic conflicts may have an intricate information structure and a complex hierarchy which do not fit the classical differential game formulation. Moreover, in many cases even well formulated differential games are not solvable. In recent years great progress has been made in artificial intelligence techniques, put in...

  11. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009 the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  12. Progressive Damage Analysis of Bonded Composite Joints

    Science.gov (United States)

    Leone, Frank A., Jr.; Girolamo, Donato; Davila, Carlos G.

    2012-01-01

    The present work relates to the development and application of progressive damage modeling techniques to bonded joint technology. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented durable redundant joint. Both designs involve honeycomb sandwich structures with carbon/epoxy facesheets joined using adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage within a structure. For structures that include multiple material systems, such as the joint designs under consideration, the number of potential failure mechanisms that must be accounted for drastically increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, intraply matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The bonded joints were modeled using highly parametric, explicitly solved finite element models, with damage modeling implemented via custom user-written subroutines. Each ply was discretely meshed using three-dimensional solid elements. Layers of cohesive elements were included between each ply to account for the possibility of delaminations and were used to model the adhesive layers forming the joint. Good correlation with experimental results was achieved both in terms of load-displacement history and the predicted failure mechanism(s).
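
    As a hedged illustration of the cohesive elements mentioned above, the commonly used bilinear traction-separation law (parameter values here are placeholders, not those of the joint models) can be written as:

      def bilinear_traction(delta, K=1.0e14, delta0=1.0e-5, deltaf=1.0e-4):
          """Traction (Pa) vs opening (m): elastic to damage onset, then softening."""
          if delta <= delta0:
              return K * delta                # undamaged elastic branch
          if delta >= deltaf:
              return 0.0                      # complete decohesion
          d = deltaf * (delta - delta0) / (delta * (deltaf - delta0))  # damage in [0, 1]
          return (1.0 - d) * K * delta        # linear softening branch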

  13. Recent Progress in Synthesis Techniques of Microstrip Bandpass Filter

    Directory of Open Access Journals (Sweden)

    Navita Singh

    2012-03-01

    End-coupled resonator bandpass filters built in microstrip are investigated. The admittance inverter parameters of the coupling gaps between resonant sections are deduced from experiment, and bandpass filter design rules are developed, allowing easy filter synthesis from "prototype" low-pass designs. Design techniques formerly employed in the realization of waveguide and coaxial filters have been applied to the synthesis of strip-line filters having "maximally-flat" and Tchebycheff response characteristics. In this paper, Tchebycheff response characteristics are considered for realizing the required circuit parameters in strip line, and we give a way to conceive and design bandpass filters for the X-band and C-band at frequencies of 10.7 GHz and 6.2 GHz, respectively, with three-pole end-coupled microstrip filters; the filters, designed for radar and GSO satellites, use capacitive resonators and stepped-impedance resonators for their realization. By extension, the RF/microwave applications referred to are communications and others that explore the usage of the frequency spectrum, parts of which are further divided into many frequency bands. The design and simulation are performed using the 3D full-wave electromagnetic simulator IE3D.
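
    As a hedged illustration of the prototype-based synthesis route the abstract refers to (the standard textbook recursion, not code from the paper; the ripple value below is a placeholder), the Tchebycheff low-pass prototype element values can be computed as:

      import math

      def tchebycheff_prototype(n, ripple_db):
          """Element values g1..g(n+1) of the equal-ripple low-pass prototype."""
          beta = math.log(1.0 / math.tanh(ripple_db / 17.37))
          gamma = math.sinh(beta / (2.0 * n))
          a = [math.sin((2 * k - 1) * math.pi / (2 * n)) for k in range(1, n + 1)]
          b = [gamma ** 2 + math.sin(k * math.pi / n) ** 2 for k in range(1, n + 1)]
          g = [2.0 * a[0] / gamma]
          for k in range(2, n + 1):
              g.append(4.0 * a[k - 2] * a[k - 1] / (b[k - 2] * g[-1]))
          g.append(1.0 if n % 2 else 1.0 / math.tanh(beta / 4.0) ** 2)
          return g

      print(tchebycheff_prototype(3, 0.5))  # three-pole prototype, 0.5 dB ripple

    The resulting g-values are then transformed to the bandpass domain and realized through the coupling-gap admittance inverters.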

  14. Environmental Contaminants in Hospital Settings and Progress in Disinfecting Techniques

    Directory of Open Access Journals (Sweden)

    Gabriele Messina

    2013-01-01

    Medical devices, such as stethoscopes, and other objects found in hospitals, such as computer keyboards and telephone handsets, may be reservoirs of bacteria for healthcare-associated infections. In this cross-over study involving an Italian teaching hospital we evaluated the microbial contamination (total bacterial count (TBC) at 36°C/22°C, Staphylococcus spp., moulds, Enterococcus spp., Pseudomonas spp., E. coli, total coliform bacteria, Acinetobacter spp., and Clostridium difficile) of these devices before and after cleaning, and the differences in contamination between hospital units and between stethoscopes and keyboards plus handsets. We analysed 37 telephone handsets, 27 computer keyboards, and 35 stethoscopes, comparing their contamination in four hospital units. Wilcoxon signed-rank and Mann-Whitney tests were used. Before cleaning, many samples were positive for Staphylococcus spp. and coliforms. After cleaning, CFUs decreased to zero in most comparisons. The first aid unit had the highest and intensive care the lowest contamination (P<0.01). Keyboards and handsets had higher TBC at 22°C (P=0.046) and mould contamination (P=0.002) than stethoscopes. Healthcare professionals should disinfect stethoscopes and other possible sources of bacterial healthcare-associated infections. The cleaning technique used was effective in reducing bacterial contamination. Units with high patient turnover, such as first aid, should practise stricter hygiene.

  15. Proof Analysis: A Technique for Concept Formation

    OpenAIRE

    Bundy, Alan

    1985-01-01

    We report the discovery of an unexpected connection between the invention of the concept of uniform convergence and the occurs check in the unification algorithm. This discovery suggests the invention of further interesting concepts in analysis and a technique for automated concept formation. Part of this technique has been implemented. The discovery arose as part of an attempt to understand the role of proof analysis in mathematical reasoning, so as to incorporate it into a computer program. ...

  16. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  17. Progress of the technique of coal microwave desulfurization

    Institute of Scientific and Technical Information of China (English)

    Xiuxiang Tao; Ning Xu; Maohua Xie; Longfei Tang

    2014-01-01

    With its advantages of fast speed, effectiveness and mild, controllable conditions, desulfurization of coal by microwave has become a research focus in the field of clean coal technology. Coal is a heterogeneous mixture consisting of various components with different dielectric properties, so their abilities to absorb microwaves differ; the sulfur-containing components are better absorbers of microwave energy, which allows them to be selectively heated and reacted under microwave irradiation. Controversy remains over the principle of microwave desulfurization: thermal effects or non-thermal effects. The argument for thermal effects is mainly based on the rapid and selective heating characteristics of microwaves, while in the view of non-thermal effects, direct interactions between the microwave electromagnetic field and sulfur-containing components are proposed. Determining the dielectric properties of coal and of the sulfur-containing components is a fundamental problem in revealing the interaction of microwaves with sulfur-containing compounds. However, the measurement of the dielectric properties of coal is affected by many factors, which makes accurate measurement difficult. In order to achieve a better desulfurization effect, researchers employ methods such as adding chemical additives (acids, alkalis, oxidants, reductants), changing the reaction atmosphere, or combining microwave treatment with other methods such as magnetic separation, ultrasonics and microorganisms. Researchers in this field have also put forward several processes and have obtained a number of patents. The obscurity of the microwave desulfurization mechanism, uncertainties in the qualitative and quantitative analysis of sulfur-containing functional groups in coal, and the lack of special microwave equipment have limited further development of microwave desulfurization technology.

  18. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the development now of a method for evaluating the derivative of dose with respect to parameter value, and the extension of the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
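
    A minimal sketch of the sampling-based screening such techniques rely on, with a stand-in model in place of SYVAC (all names and numbers invented): sample the input parameters, evaluate the model, and rank inputs by rank correlation with the output dose.

      import numpy as np

      def spearman(x, y):
          """Rank (Spearman) correlation computed with numpy only."""
          rx, ry = x.argsort().argsort(), y.argsort().argsort()
          return np.corrcoef(rx, ry)[0, 1]

      rng = np.random.default_rng(1)
      params = rng.uniform(size=(1000, 3))            # sampled input parameters
      dose = params[:, 0] ** 2 + 0.1 * params[:, 1] + 0.01 * rng.normal(size=1000)

      for i in range(params.shape[1]):
          print(f"parameter {i}: rank correlation with dose = {spearman(params[:, i], dose):+.2f}")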

  19. Analysis and comparation of animation techniques

    OpenAIRE

    Joštová, Barbora

    2015-01-01

    This thesis is focused on the analysis and comparison of animation techniques. In the theoretical part of the thesis I define key terms, the historical development and the basic principles of animation techniques. In the practical part I describe the comparison between classic and digital types of animation. Based on this research I chose the most suitable animations, which are further used to verify my hypothesis. The proposed hypothesis is an ordering of the techniques based on how demanding they are in terms of...

  20. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the recent decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry, coulometry, etc., have made significant contributions to the analysis of minerals such as clays, sulfides, oxides, and oxysalts. The discussion is organized by both the type of EC technique used and the kind of mineral analyzed. Furthermore, minerals as electrode modification materials for EC analysis are also summarized. Accordingly, research gaps and future development trends in these areas are discussed.

  1. Gold analysis by the gamma absorption technique.

    Science.gov (United States)

    Kurtoglu, Arzu; Tugrul, A Beril

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the gamma absorption technique has been employed for gold analysis. A series of different gold alloys of known gold content was analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program, and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement. PMID:12485656
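
    The underlying arithmetic is Beer-Lambert attenuation combined with a mixture rule for the alloy's mass attenuation coefficient; a hedged sketch (the function names and the two-component assumption are ours, not the paper's):

      import math

      def mass_attenuation(I, I0, areal_density):
          """mu/rho (cm^2/g) from transmitted/incident intensity and rho*x (g/cm^2)."""
          return -math.log(I / I0) / areal_density

      def gold_weight_fraction(mu_alloy, mu_gold, mu_matrix):
          """Invert the mixture rule mu_alloy = w*mu_gold + (1 - w)*mu_matrix."""
          return (mu_alloy - mu_matrix) / (mu_gold - mu_matrix)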

  2. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface are of great importance for the convenience and safety of driving, so investigations of the behaviour of road materials in laboratory conditions and the monitoring of existing roads are widely performed for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement, and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate surface 3D reconstruction, allowing analysis of the characteristics of road texture and monitoring of pavement behaviour. The second technique provides a dense 3D model of the road suitable for estimating road macro parameters.

  3. Progress in phototaxis mechanism research and micromanipulation techniques of algae cells

    Institute of Scientific and Technical Information of China (English)

    WEN Chenglu; LI Heng; WANG Pengbo; LI Wei; ZHAO Jingquan

    2007-01-01

    Phototactic movement is a characteristic response of some microorganisms to the light environment. Most algae show dramatic phototactic responses, which involve complicated underlying biological, physical and photochemical mechanisms. With the development of micro/nano and sensor techniques, great progress has been made in research on algae phototaxis. This review article summarizes the progress made in research on functional phototactic structures, the mechanisms of the photo-response process and the photodynamics of phototaxis in algae, and describes the latest micro-tracking and micromanipulation techniques. Moreover, based on our own research results, the potential correlation between phototaxis and photosynthesis is discussed, and directions for future research on the phototactic mechanism are proposed.

  4. Primary progressive aphasia : neuropsychological analysis and evolution

    OpenAIRE

    Maruta, Carolina Pires, 1985-

    2015-01-01

    Doctoral thesis, Biomedical Sciences (Neurosciences), Universidade de Lisboa, Faculdade de Medicina, 2015. Frontotemporal lobar degeneration (FTLD) is the second leading cause of early-onset (< 65 years) dementia. Some of its forms may begin with isolated language deficits, known as Primary Progressive Aphasia (PPA). PPA is defined as the insidious onset and progressive loss of linguistic abilities in the absence of major deficits in other areas of cognition or in activities of...

  5. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank, and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques.
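
    The sensitivity and multi-element capability of NAA rest on the standard activation relation. As an illustrative aid (standard textbook form, not taken from this record), the activity induced after irradiation time $t_i$ and decay time $t_d$ is

      $$ A = \frac{N_A\, m\, \theta}{M}\, \sigma\, \phi \left(1 - e^{-\lambda t_i}\right) e^{-\lambda t_d} $$

    where $N_A$ is Avogadro's number, $m$ the mass of the element, $\theta$ the isotopic abundance of the target nuclide, $M$ its atomic mass, $\sigma$ the neutron capture cross-section, $\phi$ the neutron flux, and $\lambda$ the decay constant of the product nuclide.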

  6. CONSUMER BEHAVIOR ANALYSIS BY GRAPH MINING TECHNIQUE

    OpenAIRE

    KATSUTOSHI YADA; HIROSHI MOTODA; TAKASHI WASHIO; ASUKA MIYAWAKI

    2006-01-01

    In this paper, we discuss how a graph mining system is applied to sales transaction data so as to understand consumer behavior. First, existing research on consumer behavior analysis for sequential purchase patterns is reviewed. Then we propose to represent complicated customer purchase behavior by a directed graph retaining temporal information in a purchase sequence, and apply a graph mining technique to analyze the frequently occurring patterns. In this paper, we demonstrate through the case...
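
    A toy sketch of the proposed representation (the data are invented): each customer's purchase sequence becomes a directed graph whose edges retain temporal order, and edges that recur across customers are reported as frequent patterns.

      from collections import Counter

      sequences = [["milk", "bread", "beer"],   # one purchase sequence per customer
                   ["milk", "beer"],
                   ["bread", "beer", "milk"]]

      edge_counts = Counter()
      for seq in sequences:
          edges = {(a, b) for a, b in zip(seq, seq[1:])}  # directed edges keep order
          edge_counts.update(edges)                       # count once per customer

      min_support = 2
      print([e for e, c in edge_counts.items() if c >= min_support])  # [('bread', 'beer')]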

  7. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human view direction is measured at TV frame rate while the subject's head is freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.

  8. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and we also developed human error analysis and application techniques. SACOM was developed to assess operators' physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information, and for 79 cases induced by human error we time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  9. Progress Testing: Critical Analysis and Suggested Practices

    Science.gov (United States)

    Albanese, Mark; Case, Susan M.

    2016-01-01

    Educators have long lamented the tendency of students to engage in rote memorization in preparation for tests rather than engaging in deep learning where they attempt to gain meaning from their studies. Rote memorization driven by objective exams has been termed a steering effect. Progress testing (PT), in which a comprehensive examination…

  10. Important progress on the use of isotope techniques and methods in catchment hydrology

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The use of isotope techniques and methods in catchment hydrology over the last 50 years has generated two major types of progress: (1) assessment of the temporal variations of the major stocks and flows of water in catchments, from which the estimation of water residence times is introduced in this paper; (2) assessment of catchment hydrologic processes, in which the interactions between different waters, hydrograph separation, and bio-geochemical processes are described using isotope tracers. Future progress in isotope techniques and methods in hydrology lies in understanding hydrological processes in large river basins. Much potential also awaits realization in terms of how isotope information may be used to calibrate and test distributed rainfall-runoff models, and as an aid in the quantification of sustainable water resources management.

  11. Progress in nuclear measuring and experimental techniques by application of microelectronics. 1

    International Nuclear Information System (INIS)

    In the past decade considerable progress has been made in nuclear measuring and experimental techniques by developing position-sensitive detector systems and by widely using integrated circuits and microcomputers for data acquisition and processing as well as for the automation of measuring processes. In this report, which will be published in three parts, these developments are reviewed and demonstrated with selected examples. After briefly characterizing microelectronics, the use of microelectronic elements for radiation detectors is reviewed. (author)

  12. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
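
    A minimal sketch of the two measurements described, assuming the inspection video is available as a numpy array and the anomaly and reference regions are given as pixel masks (names are ours, not from the paper):

      import numpy as np

      def contrast_evolution(frames, anomaly_mask, reference_mask):
          """frames: (n_t, H, W) IR video; masks: boolean (H, W) pixel regions."""
          t_def = frames[:, anomaly_mask].mean(axis=1)    # mean anomaly temperature
          t_ref = frames[:, reference_mask].mean(axis=1)  # mean sound-area temperature
          return (t_def - t_ref) / t_ref                  # normalized contrast C(t)

      def half_max_width(profile):
          """Half-max width (pixels) of a 1D contrast profile across the anomaly."""
          above = np.where(profile >= profile.max() / 2.0)[0]
          return int(above[-1] - above[0] + 1) if above.size else 0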

  13. Systems analysis department annual progress report 1986

    International Nuclear Information System (INIS)

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1986. The activities may be classified as energy systems analysis and risk and reliability analysis. The report includes a list of staff members. (author)

  14. HELCATS - Heliospheric Cataloguing, Analysis and Techniques Service

    Science.gov (United States)

    Harrison, Richard; Davies, Jackie; Perry, Chris; Moestl, Christian; Rouillard, Alexis; Bothmer, Volker; Rodriguez, Luciano; Eastwood, Jonathan; Kilpua, Emilia; Gallagher, Peter

    2016-04-01

    Understanding the evolution of the solar wind is fundamental to advancing our knowledge of energy and mass transport in the solar system, rendering it crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of both transient (CMEs) and background (SIRs/CIRs) solar wind plasma structures, by enabling their direct and continuous observation out to 1 AU and beyond. The EU-funded FP7 HELCATS project combines European expertise in heliospheric imaging, built up in particular through lead involvement in NASA's STEREO mission, with expertise in solar and coronal imaging as well as in-situ and radio measurements of solar wind phenomena, in a programme of work that will enable a much wider exploitation and understanding of heliospheric imaging observations. With HELCATS, we are (1) cataloguing transient and background solar wind structures imaged in the heliosphere by STEREO/HI since launch in late October 2006, including estimates of their kinematic properties based on a variety of established techniques and more speculative approaches; (2) evaluating these kinematic properties, and thereby the validity of these techniques, through comparison with solar source observations and in-situ measurements made at multiple points throughout the heliosphere; (3) appraising the potential for initialising advanced numerical models based on these kinematic properties; (4) assessing the complementarity of radio observations (in particular of Type II radio bursts and interplanetary scintillation) in combination with heliospheric imagery. We will, in this presentation, provide an overview of progress from the first 18 months of the HELCATS project.

  15. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

    We describe how to use multivariate analysis of complex TOF-SIMS spectra, introducing the method of random projections. The technique allows us to perform full clustering and classification of the measured mass spectra. In this paper we use the tool for classification purposes. The presentation describes calibration experiments with 19 minerals on Ag and Au substrates using positive-mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen's κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity between measurements.
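
    As an illustration of the approach, the sketch below clusters stand-in spectra after a Gaussian random projection; the synthetic data, projection dimension and cluster count are assumptions, not the COSIMA calibration set:

        import numpy as np
        from sklearn.random_projection import GaussianRandomProjection
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # Stand-in for TOF-SIMS spectra: 190 spectra x 5000 mass channels,
        # drawn from two synthetic "mineral" templates plus noise.
        templates = rng.random((2, 5000))
        labels_true = rng.integers(0, 2, 190)
        spectra = templates[labels_true] + 0.1 * rng.random((190, 5000))

        # A random projection approximately preserves pairwise distances
        # (Johnson-Lindenstrauss), making clustering cheap on long spectra.
        reduced = GaussianRandomProjection(n_components=50,
                                           random_state=0).fit_transform(spectra)
        clusters = KMeans(n_clusters=2, n_init=10,
                          random_state=0).fit_predict(reduced)
        print("cluster sizes:", np.bincount(clusters))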

  16. Progressive collapse analysis using updated models for alternate path analysis after a blast

    Science.gov (United States)

    Eskew, Edward; Jang, Shinae; Bertolaccini, Kelly

    2016-04-01

    Progressive collapse is of rising importance within the structural engineering community due to several recent cases. The alternate path method is a design technique to determine the ability of a structure to sustain the loss of a critical element, or elements, and still resist progressive collapse. However, the alternate path method only considers the removal of the critical elements. In the event of a blast, significant damage may occur to nearby members not included in the alternate path design scenarios. To achieve an accurate assessment of the current condition of the structure after a blast or other extreme event, it may be necessary to reduce the strength of, or remove, additional elements beyond the critical members designated in the alternate path design method. In this paper, a rapid model updating technique utilizing vibration measurements is used to update the structural model to represent the real-time condition of the structure after a blast occurs. Based upon the updated model, damaged elements will either have their strength reduced or will be removed from the simulation. The alternate path analysis will then be performed, but utilizing only the updated structural model instead of numerous scenarios. After the analysis, the simulated response will be compared to failure conditions to determine the building's post-event condition. This method has the ability to incorporate damage to noncritical members into the analysis. This paper utilizes numerical simulations based upon a unified facilities criteria (UFC) example structure subjected to an equivalent blast to validate the methodology.
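
    A toy sketch of the final step, checking redistributed demands against the capacities of the updated model (hypothetical element names, stiffness ratios, capacities and threshold; not the UFC example structure):

        # Post-blast element conditions from a vibration-based model update:
        # 1.0 = undamaged, 0.0 = destroyed. All values are illustrative.
        updated_stiffness = {"C1": 0.95, "C2": 0.40, "B7": 0.05, "B8": 1.00}

        REMOVAL_THRESHOLD = 0.10                    # treat as a lost element
        capacity = {"C1": 500.0, "C2": 500.0, "B7": 300.0, "B8": 300.0}  # kN
        redistributed_demand = {"C1": 420.0, "C2": 380.0, "B8": 310.0}   # kN

        surviving = {}
        for elem, ratio in updated_stiffness.items():
            if ratio < REMOVAL_THRESHOLD:
                continue                            # removed from the simulation
            surviving[elem] = capacity[elem] * ratio  # reduced strength

        failed = [e for e, d in redistributed_demand.items()
                  if d > surviving.get(e, 0.0)]
        print("elements failing the alternate-path check:", failed)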

  17. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transients and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; optimally weighted cross-correlations are used for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
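
    For white noise, the optimal filter for an inspiral reduces to a cross-correlation of the data with the template; a minimal sketch with a toy chirp (all signal parameters are illustrative):

        import numpy as np

        rng = np.random.default_rng(1)
        n = 4096
        template = np.sin(2 * np.pi * np.linspace(0, 40, n) ** 1.5)  # toy chirp
        data = 0.5 * np.roll(template, 1000) + rng.normal(0.0, 1.0, n)

        # Matched filtering via FFT cross-correlation; the white-noise
        # assumption makes the noise weighting a plain correlation.
        corr = np.fft.irfft(np.fft.rfft(data) * np.conj(np.fft.rfft(template)), n)
        snr = np.abs(corr) / np.sqrt(np.sum(template ** 2))
        print("peak SNR", snr.max(), "at lag", int(np.argmax(snr)))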

  18. Analysis of breast cancer progression using principal component analysis and clustering

    Indian Academy of Sciences (India)

    G Alexe; G S Dalgin; S Ganesan; C DeLisi; G Bhanot

    2007-08-01

    We develop a new technique to analyse microarray data which uses a combination of principal components analysis and consensus ensemble clustering to find robust clusters and gene markers in the data. We apply our method to a public microarray breast cancer dataset which has expression levels of genes in normal samples as well as in three pathological stages of disease; namely, atypical ductal hyperplasia or ADH, ductal carcinoma in situ or DCIS and invasive ductal carcinoma or IDC. Our method averages over clustering techniques and data perturbation to find stable, robust clusters and gene markers. We identify the clusters and their pathways with distinct subtypes of breast cancer (Luminal, Basal and Her2+). We confirm that the cancer phenotype develops early (in the early hyperplasia or ADH stage) and find from our analysis that each subtype progresses from ADH to DCIS to IDC along its own specific pathway, as if each were a distinct disease.
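
    A minimal sketch of the PCA-plus-consensus idea (synthetic expression data; component count, perturbation level and number of runs are assumptions, not the paper's settings):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # Hypothetical expression matrix: 60 samples x 1000 genes, two subtypes.
        X = np.vstack([rng.normal(0.0, 1, (30, 1000)),
                       rng.normal(0.8, 1, (30, 1000))])
        X_red = PCA(n_components=10, random_state=0).fit_transform(X)

        # Consensus: re-cluster perturbed data repeatedly and count how often
        # each pair of samples lands in the same cluster.
        n, runs = X.shape[0], 50
        coassoc = np.zeros((n, n))
        for seed in range(runs):
            noisy = X_red + rng.normal(0.0, 0.1, X_red.shape)  # perturbation
            lab = KMeans(n_clusters=2, n_init=10,
                         random_state=seed).fit_predict(noisy)
            coassoc += lab[:, None] == lab[None, :]
        coassoc /= runs   # entries near 1 mark robust cluster co-membership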

  19. Risk Analysis Group annual progress report 1984

    International Nuclear Information System (INIS)

    The activities of the Risk Analysis Group at Risoe during 1984 are presented. These include descriptions in some detail of work on general development topics and of risk analysis performed under contract. (author)

  20. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analysis methodologies, but none as comprehensive as the current work.
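
    As a flavour of such a comparison, the sketch below ranks the parameters of a toy model with two commonly compared measures, linear (Pearson) and rank (Spearman) correlation; the model and sample size are illustrative, not the tritium dosimetry model:

        import numpy as np
        from scipy.stats import pearsonr, spearmanr

        def model(x):
            # Toy stand-in for an environmental transport model,
            # nonlinear in its three parameters.
            return x[:, 0] ** 2 + 10 * x[:, 1] + 0.1 * np.exp(x[:, 2])

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 2.0, (2000, 3))
        y = model(X)

        for j in range(3):
            r, _ = pearsonr(X[:, j], y)
            rho, _ = spearmanr(X[:, j], y)
            print(f"parameter {j}: Pearson {r:+.2f}  Spearman {rho:+.2f}")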

  1. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed, stimulated by the power of computers and microprocessors, which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied to trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. Indeed, in order to increase the responses and improve the selectivity, solid electrodes are the subject of intense research dedicated to surface modifications. Perm-selectivity, chelation, catalysis, etc. may be considered as appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single-cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the

  2. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical, Environmental and Mining Engineering]

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 milliseconds residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  3. Systems Analysis department. Annual progress report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Petersen, Kurt E.

    1998-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1997. The department is undertaking research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 110 refs.

  4. Systems Analysis Department annual progress report 1998

    DEFF Research Database (Denmark)

    1999-01-01

    The report describes the work of the Systems Analysis Department at Risø National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine...

  5. Systems Analysis Department. Annual Progress Report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif [eds.]

    2000-03-01

    This report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1999. The department is undertaking research within Energy Systems Analysis, Energy, Environment and Development Planning-UNEP Centre, Safety, Reliability and Human Factors, and Technology Scenarios. The report includes summary statistics and lists of publications, committees and staff members. (au)

  6. Systems Analysis Department annual progress report 1998

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif [eds.]

    1999-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine Interaction and Technology Scenarios. The report includes lists of publications, lectures, committees and staff members. (au) 111 refs.

  7. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  8. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors, ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis
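
    The corridor population estimate itself amounts to buffering the route and summing the population of the census units it captures. A minimal sketch with hypothetical route and block data (not the Nevada network or TIGER data):

        from shapely.geometry import LineString, Point

        route = LineString([(0, 0), (10, 0), (20, 5)])   # transport corridor
        blocks = [(Point(1, 0.2), 1200), (Point(9, 3.0), 800),
                  (Point(15, 2.0), 450), (Point(19, 9.0), 2000)]

        for half_width in (0.5, 2.0, 5.0):               # corridor half-widths
            corridor = route.buffer(half_width)
            pop = sum(p for pt, p in blocks if corridor.contains(pt))
            print(f"half-width {half_width}: population {pop}")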

  9. Bone feature analysis using image processing techniques.

    Science.gov (United States)

    Liu, Z Q; Austin, T; Thomas, C D; Clement, J G

    1996-01-01

    In order to establish the correlation between bone structure and age, and to obtain information about age-related bone changes, it is necessary to study microstructural features of human bone. Traditionally, in bone biology and forensic science, the analysis of bone cross-sections has been carried out manually. Such a process is known to be slow, inefficient and prone to human error. Consequently, the results obtained so far have been unreliable. In this paper we present a new approach to quantitative analysis of cross-sections of human bones using digital image processing techniques. We demonstrate that such a system is able to extract various bone features consistently and is capable of providing more reliable data and statistics for bones. Consequently, we will be able to correlate features of bone microstructure with age and possibly also with age-related bone diseases such as osteoporosis. The development of knowledge-based computer vision systems for automated bone image analysis can now be considered feasible.
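
    A minimal sketch of the kind of feature extraction involved, run on a synthetic cross-section (image, threshold choice and features are illustrative assumptions):

        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.measure import label, regionprops

        rng = np.random.default_rng(0)
        # Synthetic grey-level bone section with two dark pores (e.g. canals).
        img = rng.normal(180.0, 10.0, (256, 256))
        img[100:120, 100:120] = 60.0
        img[30:40, 200:215] = 55.0

        pores = img < threshold_otsu(img)   # segment low-intensity features
        for region in regionprops(label(pores)):
            print(f"area {region.area} px, centroid {region.centroid}")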

  10. Progress in ALCHEMI analysis of crystal structure

    International Nuclear Information System (INIS)

    The atomic location by channeling-enhanced microanalysis (ALCHEMI) is an effective technique to clarify the atomic configuration in multi-component compounds. Recent development of the theory of characteristic x-ray emission has made ALCHEMI a more reliable and expansive technique. In this revised ALCHEMI, the characteristic x-ray intensities are measured at various electron-incidence directions in a transmission electron microscope, and are compared with x-ray intensities calculated from dynamical electron diffraction and inelastic scattering theories. In the present work, this technique was applied to thermoelectric materials. The occupation probabilities of Mn atoms on the Fe I and Fe II sites in the thermoelectric semiconductor Fe0.97Mn0.03Si2 of the β-FeSi2 structure were 0.434 and 0.574, respectively. As another example, the occupancy of Ce atoms on voids and the coordinates (z1, z2) of Sb atoms in Ce0.5Fe3NiSb12 of the skutterudite CoSb3 structure were determined to be 0.33 and (z1=0.336, z2=0.147), respectively. (Y.K.)

  11. Liver Ultrasound Image Analysis using Enhancement Techniques

    Directory of Open Access Journals (Sweden)

    Smriti Sahu, Maheedhar Dubey, Mohammad Imroze Khan

    2012-12-01

    Liver cancer is the sixth most common malignant tumour and the third most common cause of cancer-related deaths worldwide. Chronic liver damage affects up to 20% of our population. It has many causes: viral infections (hepatitis B and C), toxins, genetic, metabolic and autoimmune diseases. The rate of liver cancer in Australia has increased four-fold in the past 20 years. For detection and qualitative diagnosis of liver diseases, ultrasound (US) imaging is an easy-to-use and minimally invasive imaging modality. Medical images are often deteriorated by noise due to various sources of interference and other phenomena known as speckle noise. Therefore it is required to apply digital image processing techniques for smoothing or suppression of speckle noise in ultrasound images. This paper undertakes the study of three image enhancement techniques: the shock filter, Contrast Limited Adaptive Histogram Equalization (CLAHE) and spatial filtering. These smoothing techniques are compared using the performance metrics Peak Signal to Noise Ratio (PSNR) and Mean Square Error (MSE). It has been observed that the spatial high-pass filter gives better performance than the others for liver ultrasound image analysis.
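
    A minimal sketch of one of the compared steps, CLAHE, together with the two performance metrics, on a synthetic speckled image (data and parameters are illustrative):

        import numpy as np
        from skimage.exposure import equalize_adapthist
        from skimage.metrics import mean_squared_error, peak_signal_noise_ratio

        rng = np.random.default_rng(0)
        clean = np.tile(np.linspace(0.2, 0.8, 256), (256, 1))  # stand-in anatomy
        # Multiplicative gamma noise is a common speckle model.
        speckled = np.clip(clean * rng.gamma(4.0, 0.25, clean.shape), 0.0, 1.0)

        enhanced = equalize_adapthist(speckled, clip_limit=0.02)  # CLAHE

        print("MSE :", mean_squared_error(clean, enhanced))
        print("PSNR:", peak_signal_noise_ratio(clean, enhanced, data_range=1.0))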

  12. Window technique for climate trend analysis

    Science.gov (United States)

    Szentimrey, Tamás; Faragó, Tibor; Szalai, Sándor

    1992-01-01

    Climatic characteristics are affected by various systematic and occasional impacts: besides changes in the observing system (locations of the stations of the meteorological network, instruments, observing procedures), the possible local-scale and global, natural and anthropogenic impacts on climatic conditions should be taken into account. Apart from predictability problems, the phenomenological analysis of climatic variability and the determination of past persistent climatic anomalies are significant tasks in their own right, among other reasons as evidence of possibly anomalous behavior of the climate and as input for climate impact studies. In this paper, a special technique for the identification of such “shifts” in observational series is presented. The existence of these significant shorter- or longer-term changes in the mean characteristics of properly selected adjoining periods of time is a necessary condition for the formation of any more or less unidirectional climatic trend. The window technique is based on a complete set of orthogonal functions. The sensitivity of the proposed model to its main parameters is also investigated. The method is applied to hemispheric and Hungarian data series of mean annual surface temperature.
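
    The sketch below illustrates the underlying idea, testing for a significant shift in the mean between adjoining periods, using a plain two-sample test on synthetic data; it is a simplification, not the orthogonal-function formulation of the paper:

        import numpy as np
        from scipy.stats import ttest_ind

        rng = np.random.default_rng(0)
        # Synthetic 120-year mean annual temperature series, shift at year 70.
        temps = np.concatenate([rng.normal(9.5, 0.6, 70),
                                rng.normal(10.3, 0.6, 50)])

        window = 20
        for split in range(window, len(temps) - window):
            before = temps[split - window:split]     # adjoining periods
            after = temps[split:split + window]
            t, p = ttest_ind(before, after)
            if p < 0.01:
                print(f"candidate shift at year {split} (p = {p:.4f})")
                break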

  13. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    This paper presents the different steps of the radio-labelling of a molecule for two purposes, radioimmunoassay and autoradiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha, beta and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and production of labelled molecules: gamma emitters (125I, 57Co); beta emitters; production of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodination, tritium); purification of the labelled compound (dialysis, gel filtration or molecular exclusion chromatography, high-performance liquid chromatography); quality assessment of the labelled compound (labelling efficiency calculation, immunoreactivity conservation, stability and preservation). (J.S.)
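
    The labelling efficiency calculation mentioned above is a simple ratio; a sketch with hypothetical count rates:

        # Hypothetical counts after separating bound and free tracer
        # (e.g. by gel filtration); the numbers are illustrative only.
        bound_cpm = 91500.0   # activity associated with the labelled compound
        free_cpm = 8500.0     # unincorporated radionuclide

        efficiency = 100.0 * bound_cpm / (bound_cpm + free_cpm)
        print(f"labelling efficiency = {efficiency:.1f} %")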

  14. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acids, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  15. Progress in CTEQ-TEA PDF analysis

    CERN Document Server

    Nadolsky, Pavel; Guzzi, Marco; Huston, Joey; Lai, Hung-Liang; Li, Zhao; Pumplin, Jon; Stump, Dan; Yuan, C -P

    2012-01-01

    Recent developments in the CTEQ-TEA global QCD analysis are presented. The parton distribution functions CT10-NNLO are described, constructed by comparing data from many experiments to NNLO approximations of QCD.

  16. Development and application of the electrochemical etching technique. Annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    1980-08-01

    This annual progress report documents further advances in the development and application of electrochemical etching of polycarbonate foils (ECEPF) for fast, intermediate, and thermal neutron dosimetry as well as alpha particle dosimetry. The fast (> 1.1 MeV) and thermal neutron dosimetry techniques were applied to a thorough investigation of the neutron contamination inherent in and about the primary x-ray beam of several medical therapy electron accelerators. Because of the small size of ECEPF dosimeters in comparison to other neutron meters, they have an unusually low perturbation of the radiation field under measurement. Due to this small size and the increased sensitivity of the ECEPF dosimeter over current techniques of measuring neutrons in a high photon field, the fast neutron contamination in the primary x-ray beam of all the investigated accelerators was measured with precision and found to be greater than that suggested by the other, more common, neutron dosimetry methods.

  17. Development and application of the electrochemical etching technique. Annual progress report

    International Nuclear Information System (INIS)

    This annual progress report documents further advances in the development and application of electrochemical etching of polycarbonate foils (ECEPF) for fast, intermediate, and thermal neutron dosimetry as well as alpha particle dosimetry. The fast (> 1.1 MeV) and thermal neutron dosimetry techniques were applied to a thorough investigation of the neutron contamination inherent in and about the primary x-ray beam of several medical therapy electron accelerators. Because of the small size of ECEPF dosimeters in comparison to other neutron meters, they have an unusually low perturbation of the radiation field under measurement. Due to this small size and the increased sensitivity of the ECEPF dosimeter over current techniques of measuring neutrons in a high photon field, the fast neutron contamination in the primary x-ray beam of all the investigated accelerators was measured with precision and found to be greater than that suggested by the other, more common, neutron dosimetry methods

  18. Randomization techniques for the intensity modulation-based quantum stream cipher and progress of experiment

    Science.gov (United States)

    Kato, Kentaro; Hirota, Osamu

    2011-08-01

    The quantum-noise-based direct encryption protocol Y-00 is expected to provide physical-complexity-based security, which is thought to be comparable to information-theoretic security in mathematical cryptography, for the physical layer of fiber-optic communication systems. So far, several randomization techniques for the quantum stream cipher by the Y-00 protocol have been proposed, but most of them were developed under the assumption that phase shift keying is used as the modulation format. On the other hand, recent progress in the experimental study of the intensity-modulation-based quantum stream cipher by the Y-00 protocol raises expectations for its realization. The purpose of this paper is to present design and implementation methods for a composite model of the intensity-modulation-based quantum stream cipher with some randomization techniques. As a result, this paper gives a viewpoint of how the Y-00 cryptosystem is miniaturized.
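
    The masking principle behind intensity-modulation keying can be illustrated with a toy simulation (level mapping, spacing and noise model are illustrative assumptions, not the actual Y-00 parameters):

        import numpy as np

        rng = np.random.default_rng(0)
        # 2M intensity levels with spacing d; the shared running key picks a
        # basis k, and the data bit selects level k (bit 0) or k+M (bit 1).
        M, d, sigma, n = 64, 0.01, 0.03, 10000
        key = rng.integers(0, M, n)
        bits = rng.integers(0, 2, n)

        tx = (key + bits * M) * d
        rx = tx + rng.normal(0.0, sigma, n)   # noise >> d masks adjacent levels

        # The legitimate receiver knows the key, so its two candidate levels
        # are M*d apart (>> sigma) and a per-symbol threshold decodes reliably.
        decoded = rx > (key + M / 2) * d
        print("legitimate-receiver bit error rate:",
              np.mean(decoded != bits.astype(bool)))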

  19. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations reported. In this work, the non-destructive IBA technique PIXE is used to analyze obsidian samples. The analyses were carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external-beam facility employing a Si(Li) detector set at 52.5 degrees relative to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for detecting most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As a result, elemental concentrations were obtained for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  20. Systems Analysis Department. Annual progress report 1996

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, H.; Olsson, C.; Petersen, K.E. [eds.]

    1997-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1996. The department is undertaking research within Simulation and Optimisation of Energy Systems, Energy and Environment in Developing Countries - UNEP Centre, Integrated Environmental and Risk Management and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 131 refs.

  1. Dynamics and vibrations progress in nonlinear analysis

    CERN Document Server

    Kachapi, Seyed Habibollah Hashemi

    2014-01-01

    Dynamical and vibratory systems are basically an application of mathematics and applied sciences to the solution of real world problems. Before being able to solve real world problems, it is necessary to carefully study dynamical and vibratory systems and solve all available problems in case of linear and nonlinear equations using analytical and numerical methods. It is of great importance to study nonlinearity in dynamics and vibration; because almost all applied processes act nonlinearly, and on the other hand, nonlinear analysis of complex systems is one of the most important and complicated tasks, especially in engineering and applied sciences problems. There are probably a handful of books on nonlinear dynamics and vibrations analysis. Some of these books are written at a fundamental level that may not meet ambitious engineering program requirements. Others are specialized in certain fields of oscillatory systems, including modeling and simulations. In this book, we attempt to strike a balance between th...

  2. Flood Progression Modelling and Impact Analysis

    DEFF Research Database (Denmark)

    Mioc, Darka; Anton, François; Nickerson, B.;

    People living in the lower valley of the St. John River, New Brunswick, Canada, frequently experience flooding when the river overflows its banks during spring ice melt and rain. To better prepare the population of New Brunswick for extreme flooding, we developed a new flood prediction model that...... not be familiar with GIS analytical tools like Query Languages, can still understand technical discussions on flood analysis through the use of 3D models, which are close to reality....

  3. Progress on the CWU READI Analysis Center

    Science.gov (United States)

    Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C.

    2015-12-01

    Real-time GPS position streams are desirable for a variety of seismic monitoring and hazard mitigation applications. We report on progress in our development of a comprehensive real-time GPS-based seismic monitoring system for the Cascadia subduction zone. This system is based on 1 Hz point position estimates computed in the ITRF08 reference frame. Convergence from phase and range observables to point position estimates is accelerated using a Kalman-filter-based, on-line stream editor that produces independent estimates of carrier-phase integer biases and other parameters. Positions are then estimated using a short-arc approach and algorithms from JPL's GIPSY-OASIS software, with satellite clock and orbit products from the International GNSS Service (IGS). The resulting positions show typical RMS scatter of 2.5 cm in the horizontal and 5 cm in the vertical, with latencies below 2 seconds. To facilitate the use of these point position streams for applications such as seismic monitoring, we broadcast real-time positions and covariances using custom-built aggregation-distribution software based on the RabbitMQ messaging platform. This software is capable of buffering 24-hour streams for hundreds of stations and providing them through a RESTful web interface. To demonstrate the power of this approach, we have developed a Java-based front-end that provides a real-time visual display of time series, displacement vector fields, and map-view, contoured, peak ground displacement. This Java-based front-end is available for download through the PANGA website. We are currently analyzing 80 PBO and PANGA stations along the Cascadia margin and gearing up to process all 400+ real-time stations operating in the Pacific Northwest, many of which are currently telemetered in real time to CWU. These will serve as milestones towards our over-arching goal of extending our processing to include all of the available real-time streams from the Pacific rim. In addition, we have

  4. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics, and computer vision as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the importance of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  5. Risk factors for progressive ischemic stroke A retrospective analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    BACKGROUND: Progressive ischemic stroke has higher fatality and disability rates than common cerebral infarction, so it is very important to investigate the early predictive factors related to the occurrence of progressive ischemic stroke, the potential pathological mechanism, and the risk factors amenable to early intervention, in order to prevent the occurrence of progressive ischemic stroke and improve its outcome. OBJECTIVE: To analyze the possible risk factors in patients with progressive ischemic stroke, so as to provide a reference for the prevention and treatment of progressive ischemic stroke. DESIGN: A retrospective analysis. SETTING: Department of Neurology, General Hospital of Beijing Coal Mining Group. PARTICIPANTS: A total of 280 patients with progressive ischemic stroke were selected from the Department of Neurology, General Hospital of Beijing Coal Mining Group from March 2002 to June 2006, including 192 males and 88 females, with a mean age of (62±7) years. They all met the diagnostic standards for cerebral infarction set by the Fourth National Academic Meeting for Cerebrovascular Disease in 1995, were confirmed by CT or MRI, were admitted within 24 hours after attack, and their neurological deficit progressed gradually or in a stepwise manner within 72 hours after attack; aggravation of the neurological deficit was defined as a decrease of more than 2 points in the neurological deficit score. Meanwhile, 200 inpatients with non-progressive ischemic stroke (135 males and 65 females) were selected as the control group. METHODS: After admission, a univariate analysis of variance was conducted using the factors of blood pressure, history of diabetes mellitus, fever, leukocytosis, levels of blood lipids, fibrinogen, blood glucose and plasma homocysteine, cerebral arterial stenosis, and CT signs of early infarction, and the significant factors were entered into a multivariate non-conditional logistic regression analysis. MAIN OUTCOME MEASURES

  6. Organic analysis progress report FY 1997

    Energy Technology Data Exchange (ETDEWEB)

    Clauss, S.A.; Grant, K.E.; Hoopes, V.; Mong, G.M.; Steele, R.; Bellofatto, D.; Sharma, A.

    1998-04-01

    The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify the amount of particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at Pacific Northwest National Laboratory (PNNL) and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies are discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivatization gas chromatography/mass spectrometry (GC/MS) and GC/FID for quantitation, ion-pair chromatography (IPC), IC, and the cation exchange procedure for reducing the radioactivity of samples. The documentation of analytical procedures is included here and discussed in Section 6.0, and Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc:Mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples.

  7. Organic analysis progress report FY 1997

    International Nuclear Information System (INIS)

    The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify the amount of particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at Pacific Northwest National Laboratory (PNNL) and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies are discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivatization gas chromatography/mass spectrometry (GC/MS) and GC/FID for quantitation, ion-pair chromatography (IPC), IC, and the cation exchange procedure for reducing the radioactivity of samples. The documentation of analytical procedures is included here and discussed in Section 6.0, and Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc:Mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples

  8. Progress in the biosensing techniques for trace-level heavy metals.

    Science.gov (United States)

    Mehta, Jyotsana; Bhardwaj, Sanjeev K; Bhardwaj, Neha; Paul, A K; Kumar, Pawan; Kim, Ki-Hyun; Deep, Akash

    2016-01-01

    Diverse classes of sensors have been developed over the past few decades for on-site detection of heavy metals. Most of these sensor systems have exploited optical, electrochemical, piezoelectric, ion-selective (electrode), and electrochemical measurement techniques. As such, numerous efforts have been made to explore the role of biosensors in the detection of heavy metals based on well-known interactions between heavy metals and biomolecules (e.g. proteins, peptides, enzymes, antibodies, whole cells, and nucleic acids). In this review, we cover the recent progress made on different types of biosensors for the detection of heavy metals. Our major focus is the use of biomolecules for constructing these biosensors. The discussion is extended further to cover the biosensors' performance along with challenges and opportunities for practical utilization.

  9. Progress of Space Charge Research on Oil-Paper Insulation Using Pulsed Electroacoustic Techniques

    Directory of Open Access Journals (Sweden)

    Chao Tang

    2016-01-01

    This paper focuses on the space charge behavior in oil-paper insulation systems used in power transformers. It begins with the importance of understanding the space charge behavior in oil-paper insulation systems, followed by an introduction to the pulsed electroacoustic (PEA) technique. After that, the research progress on the space charge behavior of oil-paper insulation over the last twenty years is critically reviewed. Some important aspects such as the environmental conditions and the acoustic wave recovery need to be addressed to acquire more accurate space charge measurement results. Some breakthroughs on the space charge behavior of oil-paper insulation materials by the research team at the University of Southampton are presented. Finally, future work on space charge measurement of oil-paper insulation materials is proposed.

  10. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  11. The critical barrier to progress in dentine bonding with the etch-and-rinse technique

    Science.gov (United States)

    Brackett, M.G.; Li, N.; Brackett, W.W.; Sword, R.J.; Qi, Y.P.; Niu, L.N.; Pucci, C.R.; Dib, A.; Pashley, D.H.; Tay, F.R.

    2011-01-01

    Objectives The lack of durability in resin–dentine bonds led to the use of chlorhexidine as an MMP inhibitor to prevent the degradation of hybrid layers. Biomimetic remineralisation is a concept-proven approach to preventing the degradation of resin–dentine bonds. The purpose of this study is to examine the integrity of aged resin–dentine interfaces created with a nanofiller-containing etch-and-rinse adhesive after the application of these two approaches. Methods The more established MMP-inhibition approach was examined using a parallel in vivo and in vitro ageing design to facilitate comparison with the biomimetic remineralisation approach using an in vitro ageing design. Results Specimens bonded without chlorhexidine exhibited extensive degradation of the hybrid layer after 12 months of in vivo ageing. Dissolution of nanofillers could be seen within a water-rich zone within the adhesive layer. Although specimens bonded with chlorhexidine exhibited intact hybrid layers, water-rich regions remained in those hybrid layers and degradation of nanofillers occurred within the adhesive layer. Specimens subjected to in vitro biomimetic remineralisation followed by in vitro ageing demonstrated intrafibrillar collagen remineralisation within hybrid layers and deposition of mineral nanocrystals in nanovoids within the adhesive. Conclusions The impact was realized by understanding the lack of an inherent mechanism to remove water from resin–dentine interfaces as the critical barrier to progress in bonding with the etch-and-rinse technique. The experimental biomimetic remineralisation strategy offers a creative solution for incorporating a progressive hydration mechanism to achieve this goal, which warrants its translation into a clinically applicable technique. PMID:21215788

  12. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications, now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory. This edi

  13. Progression of Stellar Intensity Interferometry techniques using 3 meter telescopes at StarBase-Utah

    Science.gov (United States)

    Matthews, Nolan; Kieda, Dave; Lebohec, Stephan

    2015-04-01

    The emergence of large air Cherenkov telescope arrays has opened up the potential for high-resolution imaging of stellar surfaces using intensity interferometry techniques. Stellar Intensity Interferometry (SII) allows coverage of the optical and ultraviolet frequency bands, which are traditionally inaccessible to classical Michelson interferometry. The relative insensitivity to atmospheric turbulence allows unprecedented angular resolution scales, as the baselines between telescopes can be made very large (>100 m) without the need for the precise spatial resolution required by Michelson interferometry. In this talk I will illustrate the science capabilities of the SII technique and describe the progress achieved in developing a modern Stellar Intensity Interferometry system with a pair of 3 meter diameter optical telescopes located at StarBase-Utah. In particular, I will discuss the current status of the StarBase-Utah observatory and present results from two-telescope low-frequency optical correlation observations of the optical Crab pulsar. These measurements provide a first step towards actual intensity interferometry observations and establish the working condition of the StarBase-Utah telescopes.

  14. [Research progress on urban carbon fluxes based on eddy covariance technique].

    Science.gov (United States)

    Liu, Min; Fu, Yu-Ling; Yang, Fang

    2014-02-01

    Land use change and fossil fuel consumption due to urbanization have significant effects on the global carbon cycle and climate change. Accurate estimation and understanding of the carbon budget and its characteristics are prerequisites for studying the carbon cycle and its driving mechanisms in urban systems. Based on the theory of the eddy covariance (EC) technique and the characteristics of the atmospheric boundary layer and carbon cycle in urban areas, this study systematically reviewed the principles of CO2 flux monitoring in urban systems with the EC technique, and then summarized the problems faced in urban CO2 flux monitoring and the methods for data processing and assessment. The main research progress on urban carbon fluxes with the EC technique was also illustrated. The results showed that the urban surface mostly acts as a net carbon source. The CO2 exchange between the urban surface and the atmosphere shows obvious diurnal, weekly and seasonal variation resulting from vehicle exhaust, domestic heating and vegetation respiration. However, there still exist great uncertainties in urban flux measurement and its interpretation due to the high spatial heterogeneity and complex distribution of carbon sources/sinks in urban environments. In the end, we suggest that further research on the EC technique and data assessment in complex urban areas should be strengthened. It is also requisite to develop models of the urban carbon cycle on the basis of systems principles, and to investigate the influencing mechanisms and variability of the urban carbon cycle at the regional scale with spatial analysis techniques.

  15. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov random field (MRF) modelling, watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by an MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is applied. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about likely region boundaries for the next step (MRF), which gives an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
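
    A minimal sketch of the gradient-plus-watershed step on a synthetic two-region image (markers and thresholds are illustrative; the MRF and merging stages are omitted):

        import numpy as np
        from skimage.filters import sobel
        from skimage.segmentation import watershed

        rng = np.random.default_rng(0)
        img = np.full((128, 128), 0.2)
        img[:, 64:] = 0.8                      # two intensity regions
        img += rng.normal(0.0, 0.05, img.shape)

        gradient = sobel(img)                  # edge-strength (DIS-like) map
        markers = np.zeros_like(img, dtype=int)
        markers[img < 0.4] = 1                 # seeds for the dark region
        markers[img > 0.6] = 2                 # seeds for the bright region

        regions = watershed(gradient, markers) # one closed boundary per region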

  16. Analysis Methods for Progressive Damage of Composite Structures

    Science.gov (United States)

    Rose, Cheryl A.; Davila, Carlos G.; Leone, Frank A.

    2013-01-01

    This document provides an overview of recent accomplishments and lessons learned in the development of general progressive damage analysis methods for predicting the residual strength and life of composite structures. These developments are described within their state-of-the-art (SoA) context and the associated technology barriers. The emphasis of the authors is on developing these analysis tools for application at the structural level. Hence, modeling of damage progression is undertaken at the mesoscale, where the plies of a laminate are represented as a homogeneous orthotropic continuum. The aim of the present effort is to establish the ranges of validity of available models, to identify technology barriers, and to lay the foundations of future investigation efforts. Such are the necessary steps towards accurate and robust simulations that can replace some of the expensive and time-consuming "building block" tests that are currently required for the design and certification of aerospace structures.

  17. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies, which further improve the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  18. Hybrid chemical and nondestructive analysis technique

    International Nuclear Information System (INIS)

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  19. Hybrid chemical and nondestructive-analysis technique

    Energy Technology Data Exchange (ETDEWEB)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities.

  20. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of the knowledge available in an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often necessary for making informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, and society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  1. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    Science.gov (United States)

    Mirizzi, F.

    2014-02-01

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to assure satisfactory performance and lifetime. In the design of experimental facilities like tokamaks, mainly aimed at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed for steadily delivering electrical energy to commercial grids, so that RAMI aspects will assume absolute relevance from its initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in the paper.

  2. Recent Progresses in Nanobiosensing for Food Safety Analysis.

    Science.gov (United States)

    Yang, Tao; Huang, Huifen; Zhu, Fang; Lin, Qinlu; Zhang, Lin; Liu, Junwen

    2016-01-01

    With increasing adulteration, food safety analysis has become an important research field. Nanomaterials-based biosensing holds great potential in designing highly sensitive and selective detection strategies necessary for food safety analysis. This review summarizes the various functional types of nanomaterials, the methods of functionalization of nanomaterials, and recent (2014-present) progress in the design and development of nanobiosensing for the detection of food contaminants including pathogens, toxins, pesticides, antibiotics, metal contaminants, and other analytes, which are sub-classified according to the various recognition methods for each analyte. The existing shortcomings and future perspectives of the rapidly growing field of nanobiosensing addressing food safety issues are also discussed briefly. PMID:27447636

  3. Multiuser detection and independent component analysis-Progress and perspective

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The latest progress in multiuser detection and independent component analysis (ICA) is reviewed systematically. Two novel classes of multiuser detection methods, based on ICA algorithms and feedforward neural networks, are then proposed. Theoretical analysis and computer simulation show that ICA algorithms are effective for detecting multiuser signals in a code-division multiple-access (CDMA) system. The performance of these methods is not entirely identical across different channels, but all of them are robust, efficient, fast, and suitable for real-time implementation.
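
    As a concrete illustration of this class of methods (a minimal sketch, not the authors' algorithm), the following Python snippet blindly separates users in a synthetic synchronous CDMA downlink with FastICA from scikit-learn; the spreading codes, user count, and noise level are invented for the demonstration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_users, gain, n_symbols = 3, 8, 2000                      # users, spreading gain, symbols

codes = rng.choice([-1.0, 1.0], size=(n_users, gain))      # random spreading codes
bits = rng.choice([-1.0, 1.0], size=(n_users, n_symbols))  # BPSK symbols per user
chips = codes.T @ bits                                     # received chip-rate signal
chips += 0.1 * rng.standard_normal(chips.shape)            # additive channel noise

# Each received chip vector is an instantaneous linear mixture of the user
# symbols, so ICA can separate the users blindly (up to sign and ordering).
ica = FastICA(n_components=n_users, random_state=0)
est = ica.fit_transform(chips.T).T                         # estimated source signals
detected = np.sign(est)

# Resolve ICA's sign/permutation ambiguity against the true bits, then score.
for k in range(n_users):
    corr = detected @ bits[k] / n_symbols
    j = int(np.argmax(np.abs(corr)))
    ber = np.mean(np.sign(corr[j]) * detected[j] != bits[k])
    print(f"user {k}: bit error rate {ber:.4f}")
```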

  4. Recent Progresses in Nanobiosensing for Food Safety Analysis

    Science.gov (United States)

    Yang, Tao; Huang, Huifen; Zhu, Fang; Lin, Qinlu; Zhang, Lin; Liu, Junwen

    2016-01-01

    With increasing adulteration, food safety analysis has become an important research field. Nanomaterials-based biosensing holds great potential in designing highly sensitive and selective detection strategies necessary for food safety analysis. This review summarizes the various functional types of nanomaterials, the methods of functionalization of nanomaterials, and recent (2014–present) progress in the design and development of nanobiosensing for the detection of food contaminants including pathogens, toxins, pesticides, antibiotics, metal contaminants, and other analytes, which are sub-classified according to the various recognition methods for each analyte. The existing shortcomings and future perspectives of the rapidly growing field of nanobiosensing addressing food safety issues are also discussed briefly. PMID:27447636

  5. [Research progresses of anabolic steroids analysis in doping control].

    Science.gov (United States)

    Long, Yuanyuan; Wang, Dingzhong; Li, Ke'an; Liu, Feng

    2008-07-01

    Anabolic steroids, a kind of physiologically active substance, are widely abused to improve athletic performance in human sports. They have been forbidden in sports by the International Olympic Committee since 1983. Since then, many researchers have focused their attention on the establishment of reliable detection methods. In this paper, we review the research progress of different analytical methods for anabolic steroids since 2002, such as gas chromatography-mass spectrometry, liquid chromatography-mass spectrometry, immunoassay, electrochemical analysis and mass spectrometry. The development prospects of anabolic steroid analysis are also discussed.

  6. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
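
    The Monte Carlo parameter-variation idea can be sketched schematically as follows; the `unfold` function, the channel voltages, and the 5% per-channel uncertainty are placeholders standing in for the actual Dante unfold algorithm and calibration data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_trials = 18, 1000

v_measured = rng.uniform(0.5, 2.0, n_channels)   # placeholder channel voltages [V]
sigma_rel = np.full(n_channels, 0.05)            # assumed 5% one-sigma per channel

def unfold(voltages):
    """Stand-in for the real unfold algorithm mapping voltages to a flux."""
    return voltages.sum()

# Draw perturbed voltage sets from the per-channel Gaussian error functions
# and push each set through the unfold to build a distribution of fluxes.
fluxes = np.array([
    unfold(v_measured * (1.0 + sigma_rel * rng.standard_normal(n_channels)))
    for _ in range(n_trials)
])

print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std(ddof=1):.3f} (one sigma)")
```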

  7. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  8. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  9. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis can play in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  10. Orthokeratology to control myopia progression: a meta-analysis.

    Directory of Open Access Journals (Sweden)

    Yuan Sun

    Full Text Available To evaluate the clinical treatment effects of orthokeratology to slow the progression of myopia. Several well-designed controlled studies have investigated the effects of orthokeratology in school-aged children. We conducted this meta-analysis to better evaluate the existing evidence. Relevant studies were identified in the Medline and Embase databases without language limitations. The main outcomes included axial length and vitreous chamber depth, reported as the mean ± standard deviation. The results were pooled and assessed with a fixed-effects model analysis. Subgroup analyses were performed according to geographical location and study design. Of the seven eligible studies, all reported axial length changes after 2 years, while two studies reported vitreous chamber depth changes. The pooled estimates indicated that the change in axial length in the ortho-k group was 0.27 mm (95% confidence interval [CI]: 0.22, 0.32) less than in the control group. Myopic progression was reduced by approximately 45%. The combined results revealed that the difference in vitreous chamber depth between the two groups was 0.22 mm (95% CI: 0.14, 0.31). None of the studies reported severe adverse events. The overall findings suggest that ortho-k can slow myopia progression in school-aged children.
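
    For readers unfamiliar with the pooling step, the sketch below performs inverse-variance fixed-effect pooling of mean differences with a 95% confidence interval; the three study values are invented, and only the arithmetic mirrors the method described.

```python
import numpy as np

# (mean difference in axial length [mm], standard error) for hypothetical studies
studies = [(0.25, 0.05), (0.30, 0.06), (0.24, 0.04)]

d = np.array([s[0] for s in studies])
se = np.array([s[1] for s in studies])

w = 1.0 / se**2                        # inverse-variance weights
pooled = np.sum(w * d) / np.sum(w)     # fixed-effect pooled estimate
pooled_se = np.sqrt(1.0 / np.sum(w))

lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled difference = {pooled:.2f} mm, 95% CI [{lo:.2f}, {hi:.2f}]")
```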

  11. Psychoanalytic technique and 'analysis terminable and interminable'.

    Science.gov (United States)

    Sandler, J

    1988-01-01

    Some of the implications for psychoanalytic technique of the papers given at the plenary sessions of the Montreal Congress are considered. Emphasis is placed on the role of affects in development and in current psychic functioning. Motivation for unconscious wishes arises from many sources, and affects should not only be thought of as drive derivatives. There is a substantial gap between the (largely) implicit clinico-technical theories in the analytic work presented, which do in fact show great sensitivity to the patients' affects, and the formal 'official' general psychoanalytic theory used. This discrepancy in our theories should be faced. Freud's tripartite structural theory of the mind (the 'second topography') seems now to have limitations for clinical purposes. PMID:3063676

  12. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a time-domain global scheme and a frequency-domain scheme for operational modal identification from output-only data, obtained by coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
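
    A minimal sketch of the correlation step on synthetic data: two responses of a two-mode system to the same unmeasured broadband input are cross-correlated via FFT, giving a decaying, free-response-like function that a conventional time-domain modal estimator can process.

```python
import numpy as np

rng = np.random.default_rng(2)
fs, n = 256.0, 16384
t = np.arange(2048) / fs                              # impulse-response time axis

modes = [(12.0, 0.01), (31.0, 0.02)]                  # (frequency [Hz], damping ratio)

def impulse_response(amps):
    return sum(a * np.exp(-2*np.pi*f*z*t) * np.sin(2*np.pi*f*np.sqrt(1 - z*z)*t)
               for a, (f, z) in zip(amps, modes))

u = rng.standard_normal(n)                            # unmeasured broadband excitation
x = np.convolve(u, impulse_response([1.0, 0.5]))[:n]  # response at sensor 1
y = np.convolve(u, impulse_response([0.7, 1.0]))[:n]  # response at sensor 2

def xcorr(a, b, maxlag):
    """Biased cross-correlation estimate for lags 0..maxlag-1, computed via FFT."""
    nfft = 1 << (2 * len(a) - 1).bit_length()
    r = np.fft.irfft(np.fft.rfft(a, nfft) * np.conj(np.fft.rfft(b, nfft)), nfft)
    return r[:maxlag] / len(a)

# R_xy decays like a free response of the structure, so it can be fed directly
# to standard time-domain modal parameter estimators.
r_xy = xcorr(x, y, maxlag=1024)
print("R_xy(0..4) =", np.round(r_xy[:5], 4))
```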

  13. Comparison of Commonly Used Accident Analysis Techniques for Manufacturing Industries

    Directory of Open Access Journals (Sweden)

    IRAJ MOHAMMADFAM

    2015-10-01

    Full Text Available The adverse consequences of major accident events have led to the development of accident analysis techniques to investigate accidents thoroughly. However, each technique has its own advantages and shortcomings, which makes it very difficult to find a single technique capable of analyzing all types of accidents. Comparing accident analysis techniques therefore helps reveal their capabilities in different circumstances, so that the most suitable one can be chosen. In this research, the techniques CBA and AABF were compared with Tripod β in order to determine the superior technique for the analysis of major accidents in manufacturing industries. In the first step, the comparison criteria were developed using the Delphi method. Afterwards, the relative importance of each criterion was qualitatively determined, and the qualitative values were then converted to quantitative values by applying fuzzy triangular numbers. Finally, TOPSIS was used to prioritize the techniques in terms of the preset criteria. The results of the study showed that Tripod β is superior to CBA and AABF. It is highly recommended to compare all available accident analysis techniques based on proper criteria in order to select the best one, whereas an improper choice of accident analysis technique may lead to misguided results.
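
    The final TOPSIS ranking step is compact enough to illustrate directly; the decision matrix and weights below are invented stand-ins for the paper's Delphi-derived, fuzzy-converted criterion values.

```python
import numpy as np

techniques = ["Tripod beta", "CBA", "AABF"]
X = np.array([[0.8, 0.7, 0.9, 0.6],          # rows: techniques, columns: criteria
              [0.6, 0.8, 0.5, 0.7],
              [0.5, 0.6, 0.6, 0.8]])
w = np.array([0.4, 0.2, 0.3, 0.1])           # criterion weights (sum to 1)

V = w * X / np.linalg.norm(X, axis=0)        # weighted, vector-normalized matrix
ideal, anti = V.max(axis=0), V.min(axis=0)   # all criteria treated as benefits here

d_plus = np.linalg.norm(V - ideal, axis=1)   # distance to the ideal solution
d_minus = np.linalg.norm(V - anti, axis=1)   # distance to the anti-ideal solution
closeness = d_minus / (d_plus + d_minus)     # rank by relative closeness

for name, c in sorted(zip(techniques, closeness), key=lambda p: -p[1]):
    print(f"{name}: closeness {c:.3f}")
```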

  14. Recent progress of surface analysis (AES, XPS, and TOF-SIMS) and their application to corrosion analysis

    International Nuclear Information System (INIS)

    Auger Electron Spectroscopy (AES), X-ray Photoelectron Spectroscopy (XPS), and Time-of-Flight Secondary Ion Mass Spectrometry (TOF-SIMS) are surface analysis techniques which provide atomic- and molecular-level surface chemical information. They are widely used for failure analysis, quality control, and research and development of advanced materials and devices. In this review, we survey recent progress in commercial instrumentation and highlight improved sensitivity and depth-profiling capabilities. We also introduce recent applications in corrosion science. (author)

  15. Accelerator based techniques for aerosol analysis

    International Nuclear Information System (INIS)

    At the 3 MV Tandetron accelerator of the LABEC laboratory of INFN (Florence, Italy) an external beam facility is fully dedicated to PIXE-PIGE measurements of elemental composition of atmospheric aerosols. Examples regarding recent monitoring campaigns, performed in urban and remote areas, both on a daily basis and with high time resolution, as well as with size selection, will be presented. It will be evidenced how PIXE can provide unique information in aerosol studies or can play a complementary role to traditional chemical analysis. Finally a short presentation of 14C analysis of the atmospheric aerosol by Accelerator Mass Spectrometry (AMS) for the evaluation of the contributions from either fossil fuel combustion or modern sources (wood burning, biogenic activity) will be given. (author)

  16. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  17. DNA ANALYSIS OF RICIN USING RAPD TECHNIQUE

    OpenAIRE

    Martin Vivodík; Želmíra Balážová; Zdenka Gálová

    2014-01-01

    Castor (Ricinus communis L.) is an important plant for production of industrial oil. The systematic evaluation of the molecular diversity encompassed in castor inbreds or parental lines offers an efficient means of exploiting the heterosis in castor as well as for management of biodiversity. The aim of this work was to detect genetic variability among the set of 30 castor genotypes using 5 RAPD markers. Amplification of genomic DNA of 30 genotypes, using RAPD analysis, yielded 35 fragments, w...

  18. ANALYSIS AND COMPARATIVE STUDY OF SEARCHING TECHNIQUES

    OpenAIRE

    Yuvraj Singh Chandrawat

    2015-01-01

    We live in the age of technology, and it is quite obvious that it is advancing endlessly day by day. In this technical era, researchers are focusing on the development of existing technologies. Software engineering is the dominant branch of computer science that deals with the development and analysis of software. The objective of this study is to analyze and compare the existing searching algorithms (linear search and binary search). In this paper, we will discuss both thes...
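
    For reference, a minimal side-by-side of the two algorithms compared: linear search scans the whole list in O(n) time, while binary search requires sorted input but runs in O(log n).

```python
from typing import Sequence

def linear_search(items: Sequence[int], target: int) -> int:
    """Scan left to right; return the index of target, or -1 if absent."""
    for i, v in enumerate(items):
        if v == target:
            return i
    return -1

def binary_search(items: Sequence[int], target: int) -> int:
    """Halve the sorted search range each step; return index or -1."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

data = [2, 3, 5, 7, 11, 13, 17]     # binary search assumes sorted data
assert linear_search(data, 11) == binary_search(data, 11) == 4
assert linear_search(data, 4) == binary_search(data, 4) == -1
```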

  19. Multivariate techniques of analysis for ToF-E recoil spectrometry data

    International Nuclear Information System (INIS)

    Multivariate statistical methods are being developed by the Australian-Swedish Recoil Spectrometry Collaboration for quantitative analysis of the wealth of information in Time of Flight (ToF) and energy dispersive Recoil Spectrometry. An overview is presented of progress made in the use of multivariate techniques for energy calibration, separation of mass-overlapped signals and simulation of ToF-E data. 6 refs., 5 figs

  20. Multivariate techniques of analysis for ToF-E recoil spectrometry data

    Energy Technology Data Exchange (ETDEWEB)

    Whitlow, H.J.; Bouanani, M.E.; Persson, L.; Hult, M.; Jonsson, P.; Johnston, P.N. [Lund Institute of Technology, Solvegatan, (Sweden), Department of Nuclear Physics; Andersson, M. [Uppsala Univ. (Sweden). Dept. of Organic Chemistry; Ostling, M.; Zaring, C. [Royal institute of Technology, Electrum, Kista, (Sweden), Department of Electronics; Johnston, P.N.; Bubb, I.F.; Walker, B.R.; Stannard, W.B. [Royal Melbourne Inst. of Tech., VIC (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Multivariate statistical methods are being developed by the Australian-Swedish Recoil Spectrometry Collaboration for quantitative analysis of the wealth of information in Time of Flight (ToF) and energy dispersive Recoil Spectrometry. An overview is presented of progress made in the use of multivariate techniques for energy calibration, separation of mass-overlapped signals and simulation of ToF-E data. 6 refs., 5 figs.

  1. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  2. PROGRESS IN PROTEOME ANALYTICAL TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    解建勋; 蒲小平; 李玉珍; 李长龄

    2001-01-01

    The proteome represents the protein pattern of a species, an organism, a cell, an organelle, or even a body fluid, determined quantitatively at a certain moment and under precisely defined limiting conditions. Proteome research techniques are important tools in the post-genome era. Quantitative separation and analysis of proteins in the proteome involve many techniques, including sample preparation, two-dimensional (2D) gel electrophoresis, capillary electrophoresis, chromatographic techniques, mass spectrometry, and so on. 2D gel electrophoresis is currently a fairly mature method, able to provide enough separation space for several thousand components and to separate protein mixtures within a few hours. Combined use of various analysis techniques and automation of instrumentation will be the coming trend in this field.

  3. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  4. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FI, field ionization) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  5. Progressive Failure Analysis on the Single Lap Bonded Joints

    Directory of Open Access Journals (Sweden)

    Kadir TURAN

    2010-03-01

    Full Text Available In this study, failure analysis of the single lap bonded joint, which is used to join two composite plates with an adhesive, is investigated experimentally and numerically. In the joint, epoxy resin is used as the adhesive, and four-layered carbon fiber reinforced epoxy composite plates are used as the adherends. The numerical study is performed in the ANSYS software, which uses the finite element method. To obtain numerical failure loads, progressive failure analysis is used with material property degradation rules. In the failure analysis, the Hashin failure criterion is used for the composite plates and the maximum principal stress failure criterion is used for the adhesive. The effects of adhesive thickness, overlap length and plate width on the joint strength are investigated numerically. As a result, it is seen that the failure load is affected by the bond face area. The results are presented with graphs and tables.
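
    A small sketch of the ply-level machinery such an analysis rests on: 2D Hashin failure indices followed by an assumed sudden stiffness-degradation rule. The strength, stiffness, and knock-down values are illustrative, not the paper's material data.

```python
import numpy as np

XT, XC, YT, YC, S = 2280.0, 1440.0, 57.0, 228.0, 71.0   # assumed strengths [MPa]

def hashin_2d(s11, s22, s12):
    """Failure indices (>= 1 means failure) for the four 2D Hashin modes."""
    return {
        "fiber_tension":      (s11 / XT) ** 2 + (s12 / S) ** 2 if s11 >= 0 else 0.0,
        "fiber_compression":  (s11 / XC) ** 2 if s11 < 0 else 0.0,
        "matrix_tension":     (s22 / YT) ** 2 + (s12 / S) ** 2 if s22 >= 0 else 0.0,
        "matrix_compression": ((s22 / (2 * S)) ** 2 + ((YC / (2 * S)) ** 2 - 1) * s22 / YC
                               + (s12 / S) ** 2) if s22 < 0 else 0.0,
    }

def degrade(props, indices):
    """Assumed sudden-degradation rule: knock down the moduli of failed modes."""
    E1, E2, G12 = props
    if max(indices["fiber_tension"], indices["fiber_compression"]) >= 1.0:
        E1 *= 0.01                       # fiber failure kills longitudinal stiffness
    if max(indices["matrix_tension"], indices["matrix_compression"]) >= 1.0:
        E2, G12 = E2 * 0.01, G12 * 0.2   # matrix failure degrades E2 and G12
    return E1, E2, G12

idx = hashin_2d(s11=1500.0, s22=40.0, s12=45.0)   # example ply stress state [MPa]
print(idx)
print("degraded (E1, E2, G12):", degrade((135e3, 9.5e3, 5.2e3), idx))
```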

  6. Progress Toward the Analysis of the Kinetic Stabilizer Concept

    Energy Technology Data Exchange (ETDEWEB)

    Post, R F; Byers, J A; Cohen, R H; Fowler, T K; Ryutov, D D; Tung, L S

    2005-02-08

    The Kinetic Stabilizer (K-S) concept [1] represents a means for stabilizing axisymmetric mirror and tandem-mirror (T-M) magnetic fusion systems against MHD interchange instability modes. Magnetic fusion research has given us examples of axisymmetric mirror confinement devices in which radial transport rates approach the classical "Spitzer" level, i.e., situations in which turbulence, if present at all, is at too low a level to adversely affect the radial transport [2,3,4]. If such a low-turbulence condition could be achieved in a T-M system, it could lead to a fusion power system that would be simpler, smaller, and easier to develop than one based on closed-field confinement, e.g., the tokamak, where the transport is known to be dominated by turbulence. However, since conventional axisymmetric mirror systems suffer from the MHD interchange instability, the key to exploiting this new opportunity is to find a practical way to stabilize this mode. The K-S represents one avenue to achieving this goal. The starting point for the K-S concept is a theoretical analysis by Ryutov [5]. He showed that a MHD-unstable plasma contained in an axisymmetric mirror cell can be MHD-stabilized by the presence of a low-density plasma on the expanding field lines outside the mirrors. If this exterior plasma communicates well electrically with the confined plasma, it can stabilize the interior, confined, plasma. This stabilization technique was conclusively demonstrated in the Gas Dynamic Trap (GDT) experiment [6] at Novosibirsk, Russia, at mirror-cell plasma beta values of 40 percent. The GDT operates in a high collisionality regime. Thus the effluent plasma leaking through the mirrors, though much lower in density than that of the confined plasma, is still high enough to satisfy the stabilization criterion. This would not, however, be the case in a fusion T-M with axisymmetric plug and central cell fields. In such a case the effluent plasma would be far

  7. Analysis and calibration techniques for superconducting resonators

    Science.gov (United States)

    Cataldo, Giuseppe; Wollack, Edward J.; Barrentine, Emily M.; Brown, Ari D.; Moseley, S. Harvey; U-Yen, Kongpop

    2015-01-01

    A method is proposed and experimentally explored for in-situ calibration of complex transmission data for superconducting microwave resonators. This cryogenic calibration method accounts for the instrumental transmission response between the vector network analyzer reference plane and the device calibration plane. Once calibrated, the observed resonator response is analyzed in detail by two approaches. The first, a phenomenological model based on physically realizable rational functions, enables the extraction of multiple resonance frequencies and widths for coupled resonators without explicit specification of the circuit network. In the second, an ABCD-matrix representation for the distributed transmission line circuit is used to model the observed response from the characteristic impedance and propagation constant. When used in conjunction with electromagnetic simulations, the kinetic inductance fraction can be determined with this method with an accuracy of 2%. Datasets for superconducting microstrip and coplanar-waveguide resonator devices were investigated and a recovery within 1% of the observed complex transmission amplitude was achieved with both analysis approaches. The experimental configuration used in microwave characterization of the devices and self-consistent constraints for the electromagnetic constitutive relations for parameter extraction are also presented.

  8. Silicon ribbon growth by a capillary action shaping technique. Annual report (Quarterly technical progress report No. 9)

    Energy Technology Data Exchange (ETDEWEB)

    Schwuttke, G.H.; Ciszek, T.F.; Kran, A.

    1977-10-01

    Progress on the technological and economic assessment of ribbon growth of silicon by a capillary action shaping technique is reported. Progress in scale-up of the process from 50 mm to 100 mm ribbon widths is presented, the use of vitreous carbon as a crucible material is analyzed, and preliminary tests of CVD Si/sub 3/N/sub 4/ as a potential die material are reported. Diffusion length measurements by SEM, equipment and procedures for defect display under MOS structures in silicon ribbon for lifetime interpretation, and an assessment of ribbon technology are discussed. (WHK)

  9. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically. The effici...
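
    A minimal sketch of the Random Decrement idea on synthetic data: averaging all response segments that begin at an up-crossing of a trigger level yields a signature proportional to the structure's free decay, from which modal parameters can then be estimated. The test signal below is an invented single-mode system.

```python
import numpy as np

rng = np.random.default_rng(3)
fs, n, seg = 100.0, 200_000, 300
t = np.arange(2000) / fs
h = np.exp(-2*np.pi*2.0*0.02*t) * np.sin(2*np.pi*2.0*t)   # 2 Hz mode, 2% damping
x = np.convolve(rng.standard_normal(n), h)[:n]            # simulated ambient response

level = np.std(x)                                         # trigger level (one sigma)
# Positive-slope level-crossing trigger condition.
starts = np.where((x[:-1] < level) & (x[1:] >= level))[0]
starts = starts[starts + seg < n]

rd = np.mean([x[s:s + seg] for s in starts], axis=0)      # Random Decrement signature
print(f"{len(starts)} segments averaged; rd[0] = {rd[0]:.3f} (near the trigger level)")
```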

  10. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination on the basis of available resources. This research evaluated currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes reflecting their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for revealing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages lie in completeness and complexity

  11. Methodological progresses in Markovian availability analysis and applications

    International Nuclear Information System (INIS)

    The Markovian model applied to reliability analysis is well known as an effective tool whenever dependencies affect the probabilistic behaviour of a system's components. Its ability to follow the dynamical evolution of systems allows human actions (inspections, maintenance), including human failure probabilities, to be included in the temporal evolution. The starting point has been the Stagen-Marela code. Although this code already makes much progress toward reducing the size of Markovian matrices (merging of Markov processes for systems exhibiting symmetries), there is still an imperative need to reduce memory requirements. This implies, as a first step of any realistic analysis, a modularization of the studied system into subsystems, which can then be 'coupled'. The methodology is applied to the auxiliary feedwater injection system of Doel 3. (orig./HSCH)
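
    As a toy illustration of the Markovian machinery involved (not the Stagen-Marela methodology itself), the sketch below treats a single repairable component as a two-state continuous-time Markov chain and computes transient and steady-state availability; the rates are invented.

```python
import numpy as np
from scipy.linalg import expm

lam, mu = 1e-3, 1e-1          # per hour: failure rate and repair rate (assumed)
Q = np.array([[-lam,  lam],   # generator matrix; state 0 = up, state 1 = down
              [  mu,  -mu]])

p0 = np.array([1.0, 0.0])     # start in the up state

for t in (10.0, 100.0, 1000.0):
    p = p0 @ expm(Q * t)      # transient state probabilities at time t
    print(f"t = {t:6.0f} h: availability = {p[0]:.6f}")

print(f"steady-state availability = {mu / (lam + mu):.6f}")
```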

  12. Statistical Analysis of the Progressive Failure Behavior for Fiber-Reinforced Polymer Composites under Tensile Loading

    Directory of Open Access Journals (Sweden)

    Fang Wang

    2014-01-01

    Full Text Available An analytical approach, supported by numerical simulations based on the equivalent constraint model (ECM), was proposed to investigate the progressive failure behavior of symmetric fiber-reinforced composite laminates damaged by transverse ply cracking. A fracture criterion was developed to describe the initiation and propagation of transverse ply cracking. This work was also concerned with the statistical distribution of the critical fracture toughness values, with due consideration given to the scale size effect. The Monte Carlo simulation technique, coupled with statistical analysis, was applied to study the progressive cracking behavior of composite structures, considering the effects of lamina properties and lay-up configurations. The results deduced from the numerical procedure were in good agreement with the experimental results obtained for laminated composites formed from unidirectional fiber reinforced laminae with different orientations.
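
    The statistical ingredient can be sketched with invented parameters: critical toughness values are drawn from a Weibull distribution whose scale shrinks with specimen size, and a Monte Carlo loop estimates the cracking probability under a given applied energy release rate.

```python
import numpy as np

rng = np.random.default_rng(4)
m, G0, L0 = 8.0, 0.25, 10.0       # Weibull modulus, scale [kJ/m^2], reference length

def sample_gc(length_mm, n):
    """Weibull toughness samples; the scale drops for larger specimens (size effect)."""
    scale = G0 * (L0 / length_mm) ** (1.0 / m)
    return scale * rng.weibull(m, n)

applied_g = 0.18                  # applied energy release rate [kJ/m^2], assumed
for length in (10.0, 50.0, 200.0):
    gc = sample_gc(length, 100_000)
    print(f"L = {length:5.0f} mm: P(cracking) = {np.mean(gc < applied_g):.3f}")
```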

  13. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical tec

  14. The San Pedro Mártir Open Cluster Survey: Progress, Techniques, Preliminary Results

    Science.gov (United States)

    Schuster, W.; Michel, R.; Dias, W.; Tapia-Peralta, T.; Vázquez, R.; Macfarland, J.; Chavarría, C.; Santos, C.; Moitinho, A.

    2007-05-01

    A CCD UBVRI survey of northern open clusters is being undertaken at San Pedro Mártir, Mexico, always performed using the same instrumental setup (telescope, CCD, filters), reduction methods, and system of standards (Landolt). To date more than 300 clusters (mostly unstudied previously) have been observed, and about half the data reduced using aperture-photometry and PSF techniques. Our analysis procedures are being refined by studying in detail a small subset of these clusters. For example, the heavily reddened clusters Be80 and Be95 are being examined in the color-color diagrams (B-V,U-B) and (B-V,R-I) to better understand the problems of curvature and variable reddening. For clusters for which our U data reach the F-type stars, such as NGC2192 and NGC7296, techniques are being examined for estimating both the reddening E(B-V) and metallicity [Fe/H] via the (U-B) excess. If the clusters also have "red clump" stars, such as NGC1798 and Do02, these procedures can be iterated between the clump and main sequence stars to establish the values of E(B-V) and [Fe/H] even better. Finally, color-magnitude diagrams, such as (B-V,V) and (V-I,V), are being employed together with the Schmidt-Kaler colors and Padova isochrones to obtain distances and ages for these clusters. A java-based computer program is being developed to help in the visualization and analysis of these photometric data. This system is capable of displaying each cluster simultaneously in different color-color and color-magnitude diagrams and has an interactive way to identify a star, or group of stars, in one diagram and to see where it falls in the other diagrams, facilitating the elimination of field stars and the recognition of cluster features. This program is capable of displaying up to 16 different diagrams for one cluster and processing up to 20 clusters at the same time. Our aims are the following: (1) a common UBVRI photometric scale for open clusters, (2) an atlas of color

  15. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing the dis...

  16. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  17. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  18. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté. Its execution depends on a highly developed technique during the performer's rotation. Performing this element requires not only good physical condition but also the dancer's mastery of correct technique. On the basis of the corresponding kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. For the analysis, the method of stereoscopic imaging was used, together with theoretical analysis.

  19. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; Moseley, S. H.; Porst, J.-P.; Porter, F. S.; Sadleir, J. E.; Smith, S. J.

    2016-07-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used an 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is better by a factor of two than achievable with digital optimal filters.
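
    A bare-bones version of the PCA step on a synthetic pulse model: stack baseline-subtracted pulse records as rows, take the SVD, and keep the leading components as descriptive parameters; the pulse shapes, energies, and noise below are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
n_pulses, n_samples = 500, 1024
t = np.arange(n_samples)

# Synthetic pulses: amplitude tracks photon energy; decay time drifts slightly.
energies = rng.choice([5.9, 6.5], n_pulses)            # keV, two-line source model
tau = 80.0 * (1 + 0.02 * rng.standard_normal(n_pulses))
pulses = energies[:, None] * np.exp(-t / tau[:, None]) * (1 - np.exp(-t / 10.0))
pulses += 0.05 * rng.standard_normal(pulses.shape)     # additive noise

X = pulses - pulses.mean(axis=0)                       # remove the mean pulse
U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :3] * s[:3]                              # leading component amplitudes

# The first component is dominated by pulse height (energy); the others pick up
# shape and drift, and can be combined into an improved energy estimate.
print("variance captured by components 1-3:", np.round(s[:3]**2 / np.sum(s**2), 3))
```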

  20. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques by which we can study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis with fuzzy concepts to manage the risks in a software project and to mitigate risk through software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...

  1. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  2. Adhesive Characterization and Progressive Damage Analysis of Bonded Composite Joints

    Science.gov (United States)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2014-01-01

    The results of an experimental/numerical campaign aimed to develop progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.
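
    The traction-separation recovery can be sketched numerically: given a measured J(δ) curve, the cohesive traction follows as dJ/dδ. The bilinear law and parameters below are assumptions used only to generate test data for the differentiation step.

```python
import numpy as np

def bilinear_traction(delta, t_max=30.0, d0=0.005, df=0.05):
    """Assumed bilinear traction-separation law [MPa, mm]."""
    up = t_max * delta / d0
    down = t_max * (df - delta) / (df - d0)
    return np.where(delta < d0, up, np.clip(down, 0.0, None))

delta = np.linspace(0.0, 0.06, 200)
traction = bilinear_traction(delta)

# Integrate traction over separation to mimic a J(delta) curve from experiment.
J = np.concatenate(([0.0], np.cumsum(np.diff(delta) *
                                     0.5 * (traction[1:] + traction[:-1]))))

recovered = np.gradient(J, delta)          # cohesive traction = dJ/d(delta)
err = np.max(np.abs(recovered - traction))
print(f"max recovery error (concentrated at the ends and kinks): {err:.2f} MPa")
```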

  3. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on them and a brief survey of other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  4. Microarray Analysis Techniques Singular Value Decomposition and Principal Component Analysis

    CERN Document Server

    Wall, M E; Rocha, L M; Wall, Michael E.; Rechtsteiner, Andreas; Rocha, Luis M.

    2002-01-01

    This chapter describes gene expression analysis by Singular Value Decomposition (SVD), emphasizing initial characterization of the data. We describe SVD methods for visualization of gene expression data, representation of the data using a smaller number of variables, and detection of patterns in noisy gene expression data. In addition, we describe the precise relation between SVD analysis and Principal Component Analysis (PCA) when PCA is calculated using the covariance matrix, enabling our descriptions to apply equally well to either method. Our aim is to provide definitions, interpretations, examples, and references that will serve as resources for understanding and extending the application of SVD and PCA to gene expression analysis.
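
    The SVD-PCA relation the chapter emphasizes is easy to verify numerically; the following check uses random stand-in "expression" data, since only the algebra matters here.

```python
import numpy as np

rng = np.random.default_rng(6)
X = rng.standard_normal((50, 8))          # 50 genes x 8 samples (illustrative)
Xc = X - X.mean(axis=0)                   # center each sample (column)

# PCA route: eigendecomposition of the sample covariance matrix.
C = Xc.T @ Xc / (Xc.shape[0] - 1)
evals, evecs = np.linalg.eigh(C)
evals, evecs = evals[::-1], evecs[:, ::-1]           # sort descending

# SVD route: singular values/vectors of the centered data matrix.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

assert np.allclose(s**2 / (Xc.shape[0] - 1), evals)  # identical variances
assert np.allclose(np.abs(Vt), np.abs(evecs.T))      # identical axes up to sign
print("PCA via the covariance matrix matches the SVD of the centered data.")
```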

  5. Dynamic analysis of large structures by modal synthesis techniques.

    Science.gov (United States)

    Hurty, W. C.; Hart, G. C.; Collins, J. D.

    1971-01-01

    Several criteria that may be used to evaluate the merits of some of the existing techniques for the dynamic analysis of large structures which involve division into substructures or components are examined. These techniques make use of component displacement modes to synthesize global systems of generalized coordinates and, for that reason, they have come to be known as modal synthesis or component mode methods. Two techniques have been found to be particularly useful, i.e., the modal synthesis method with fixed attachment modes and the modal synthesis method with free attachment modes. These two methods are treated in detail, and general flow charts are presented for guidance in computer programming.

  6. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  7. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  8. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting product and the mold at the same time, considering the thermal contraction of the casting and the thermal expansion of the mold. An analysis that accounts for contact between the casting and the mold allows precise prediction of the stress distribution and of defects such as hot tearing. But it is difficult to generate an FEM mesh for the interface of the casting and the mold. Moreover, the mesh for the mold domain consumes a great deal of computational time and memory due to its large number of elements. Consequently, we propose the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. The spring bar elements of the virtual mold technique are used to represent the contact between the casting and the mold. In general, the volume of the mold is much larger than that of the casting part, so the proposed technique greatly decreases the number of mesh elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and it gave satisfactory results.

  9. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternate and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  10. History, progress and prospect for controlled ecological life support technique in China

    Science.gov (United States)

    Guo, Shuangsheng

    2016-07-01

    Constructing a controlled ecological life support system (CELSS) is an important supporting condition for carrying out manned deep-space exploration and extraterrestrial habitation and development in the future. In China, controlled ecological life support technology has gone through a developmental process of more than twenty years, progressing from conceptual research to key unit-level techniques and key system-level integrated techniques, and from ground-based simulated tests to a spaceflight demonstration test, and has gained many important interim results. In this paper, the present status, existing problems and next plans in the domain of CELSS techniques in China are introduced briefly, so as to serve as a reference for promoting the development of these techniques internationally.

  11. Micro analysis of dissolved gases by the gas chromatography technique

    International Nuclear Information System (INIS)

    A technique is discussed which allows the quantitative analysis of small concentrations of dissolved gases, such as CO2 and H2, in the order of 10^-6 to 10^-3 M. For the extraction, separation and quantification, a Toepler pump was used in tandem with a gas chromatograph. This method can also be applied to the analysis of other gases such as CO, CH4, CH3-CH3, etc. The technique may be applied in fields such as radiation chemistry, oceanography and environmental studies. (author)

  12. Sample preparation techniques in trace element analysis of water

    Science.gov (United States)

    Nagj, Marina; Jakšić, M.; Orlić, I.; Valković, V.

    1985-06-01

    Sample preparation techniques for the analysis of water for trace elements using X-ray emission spectroscopy are described. Fresh water samples for the analysis of transition metals were prepared by complexation with ammonium pyrrolidine dithiocarbamate (APDC) and filtering through a membrane filter. Analyses of water samples for halogens were done on samples prepared by precipitation with AgNO3 and subsequent filtration. Two techniques for seawater preparation for uranium determination are described, viz. precipitation with APDC in the presence of iron(II) as a carrier, and complexation with APDC followed by adsorption on activated carbon. In all cases trace element levels at 10^-3 μg/g were measured.

  13. IAEA progress report II - Study of archeological objects using PIXE analytical technique

    International Nuclear Information System (INIS)

    This is the second IAEA progress report for the period 2006-2007 (CRP number F23023). After adopting the PIXE one-run measurement using the Al funny filter as X-ray absorber, which was described in the first progress report, two studies were undertaken to characterize ceramics on the basis of their chemical composition. The first concerned the characterization of 38 sherds from the locality of Ch'him (south of Beirut) that could support future studies on ceramic provenance. Those samples are considered reference materials, as they come from the kiln and workshop of the excavated site. The second study, detailed in the current report, concerned excavated pottery from Beirut suspected to belong to North-Syrian production. (author)

  14. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes

    Science.gov (United States)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  15. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, "Windows Forensic Analysis Toolkit" has been updated by Harlan Carvey to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition" (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  16. Review of geographic processing techniques applicable to regional analysis

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, R.C.

    1988-02-01

    Since the early 1970s regional environmental studies have been carried out at the Oak Ridge National Laboratory using computer-assisted techniques. This paper presents an overview of some of these past experiences and the capabilities developed at the Laboratory for processing, analyzing, and displaying geographic data. A variety of technologies have resulted such as computer cartography, image processing, spatial modeling, computer graphics, data base management, and geographic information systems. These tools have been used in a wide range of spatial applications involving facility siting, transportation routing, coal resource analysis, environmental impacts, terrain modeling, inventory development, demographic studies, water resource analyses, etc. The report discusses a number of topics dealing with geographic data bases and structures, software and processing techniques, hardware systems, models and analysis tools, data acquisition techniques, and graphical display methods. Numerous results from many different applications are shown to aid the reader interested in using geographic information systems for environmental analyses. 15 refs., 64 figs., 2 tabs.

  17. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  18. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  19. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.
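
    The gap the authors identify can be illustrated with the classical shear-corrected column result: a finite transverse-shear stiffness always lowers the predicted global buckling load. A minimal sketch, assuming the Engesser correction P_cr = P_e/(1 + P_e/kGA) and purely illustrative numbers (not properties of the test article described above):

    import math

    def euler_load(E, I, L):
        # Classical (shear-rigid) global buckling load of a pinned column.
        return math.pi ** 2 * E * I / L ** 2

    def shear_corrected_load(P_e, kGA):
        # Engesser correction: finite transverse-shear stiffness kGA lowers P_cr.
        return P_e / (1.0 + P_e / kGA)

    # Illustrative numbers only, not properties of the NASA test article.
    E, I, L = 70e9, 2.0e-4, 5.0      # Pa, m^4, m
    kGA = 5.0e7                      # effective transverse-shear stiffness, N

    P_e = euler_load(E, I, L)
    P_cr = shear_corrected_load(P_e, kGA)
    print(f"shear-rigid: {P_e / 1e6:.2f} MN, shear-corrected: {P_cr / 1e6:.2f} MN "
          f"({100 * (1 - P_cr / P_e):.1f}% reduction)")

    With these numbers the shear correction removes roughly a tenth of the shear-rigid prediction, which is the direction and kind of discrepancy the abstract describes.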

  20. Neutron noise analysis techniques in nuclear power reactors

    International Nuclear Information System (INIS)

    The main techniques used in neutron noise analysis of BWR and PWR nuclear reactors are reviewed. Several applications, such as control of vibrations in both reactor types, determination of two-phase flow parameters in BWRs, and stability control in BWRs, are discussed in some detail. The paper contains many experimental results obtained by the first author. (author)
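
    The workhorse of such noise analysis is the power spectral density of the detector signals, in which structural vibrations appear as resonance peaks. A minimal sketch using scipy's Welch estimator on a simulated signal (the 8 Hz line and all parameters are hypothetical, not plant data):

    import numpy as np
    from scipy import signal

    fs = 200.0                              # sampling rate, Hz (hypothetical)
    t = np.arange(0, 600, 1 / fs)           # ten minutes of simulated signal
    rng = np.random.default_rng(0)
    # An 8 Hz component standing in for a core-internals vibration peak,
    # buried in white detector noise.
    x = 0.1 * np.sin(2 * np.pi * 8.0 * t) + rng.normal(0, 1, t.size)

    f, pxx = signal.welch(x, fs=fs, nperseg=4096)
    print(f"dominant spectral peak at {f[pxx.argmax()]:.2f} Hz")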

  1. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Council. The planned...
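
    The random decrement technique itself averages the segments of a response record that follow a chosen triggering condition; the average approximates a free-decay signature from which modal parameters can be extracted. A minimal sketch with a level-crossing trigger on a simulated single-degree-of-freedom response (all parameters illustrative):

    import numpy as np

    def random_decrement(x, trigger, length):
        # Average all segments of `length` samples starting where x crosses
        # `trigger` from below (level-crossing triggering condition).
        starts = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
        starts = starts[starts + length < x.size]
        return x[starts[:, None] + np.arange(length)].mean(axis=0)

    # Simulated response of a lightly damped oscillator to white-noise loading
    # (semi-implicit Euler integration is adequate for a sketch).
    fs, n = 100.0, 100_000
    f0, zeta = 2.0, 0.02
    w0, dt = 2 * np.pi * f0, 1 / fs
    rng = np.random.default_rng(1)
    x, v = np.zeros(n), 0.0
    for i in range(1, n):
        v += (-2 * zeta * w0 * v - w0 ** 2 * x[i - 1] + rng.normal()) * dt
        x[i] = x[i - 1] + v * dt

    sig = random_decrement(x, trigger=x.std(), length=500)
    print(f"signature starts near the trigger level: {sig[0]:.3f} vs {x.std():.3f}")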

  2. DETECTION OF DNA DAMAGE USING MELTING ANALYSIS TECHNIQUES

    Science.gov (United States)

    A rapid and simple fluorescence screening assay for UV radiation-, chemical-, and enzyme-induced DNA damage is reported. This assay is based on a melting/annealing analysis technique and has been used with both calf thymus DNA and plasmid DNA (pUC19 plasmid from E. coli). DN...

  3. Diffusion tensor analysis of corpus callosum in progressive supranuclear palsy

    International Nuclear Information System (INIS)

    Progressive supranuclear palsy (PSP) is a neurodegenerative disease featuring parkinsonism, supranuclear ophthalmoplegia, dysphagia, and frontal lobe dysfunction. The corpus callosum, which consists of many commissural fibers, probably reflects cerebral cortical function. Several previous reports showed atrophy or diffusion abnormalities of the anterior corpus callosum in PSP patients, but the partitioning method used in these studies was based on data obtained in nonhuman primates. In this study, we performed a diffusion tensor analysis using a new partitioning method for the human corpus callosum. Seven consecutive patients with PSP were compared with 29 age-matched patients with Parkinson's disease (PD) and 19 age-matched healthy control subjects. All subjects underwent diffusion tensor magnetic resonance imaging, and the corpus callosum was partitioned into five areas on the mid-sagittal plane according to a recently established topography of the human corpus callosum (CC1, prefrontal area; CC2, premotor and supplementary motor area; CC3, motor area; CC4, sensory area; CC5, parietal, temporal, and occipital area). Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) were measured in each area, and differences between groups were analyzed. In the PSP group, FA values were significantly decreased in CC1 and CC2, and ADC values were significantly increased in CC1 and CC2. Receiver operating characteristic analysis showed excellent reliability of FA and ADC analyses of CC1 for differentiating PSP from PD. The anterior corpus callosum corresponding to the prefrontal, premotor, and supplementary motor cortices is affected in PSP patients. This analysis can be an additional test for further confirmation of the diagnosis of PSP.

  4. Diffusion tensor analysis of corpus callosum in progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Shoichi; Makino, Takahiro; Shirai, Wakako; Hattori, Takamichi [Department of Neurology, Graduate School of Medicine, Chiba University (Japan)

    2008-11-15

    Progressive supranuclear palsy (PSP) is a neurodegenerative disease featuring parkinsonism, supranuclear ophthalmoplegia, dysphagia, and frontal lobe dysfunction. The corpus callosum, which consists of many commissural fibers, probably reflects cerebral cortical function. Several previous reports showed atrophy or diffusion abnormalities of the anterior corpus callosum in PSP patients, but the partitioning method used in these studies was based on data obtained in nonhuman primates. In this study, we performed a diffusion tensor analysis using a new partitioning method for the human corpus callosum. Seven consecutive patients with PSP were compared with 29 age-matched patients with Parkinson's disease (PD) and 19 age-matched healthy control subjects. All subjects underwent diffusion tensor magnetic resonance imaging, and the corpus callosum was partitioned into five areas on the mid-sagittal plane according to a recently established topography of the human corpus callosum (CC1, prefrontal area; CC2, premotor and supplementary motor area; CC3, motor area; CC4, sensory area; CC5, parietal, temporal, and occipital area). Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) were measured in each area, and differences between groups were analyzed. In the PSP group, FA values were significantly decreased in CC1 and CC2, and ADC values were significantly increased in CC1 and CC2. Receiver operating characteristic analysis showed excellent reliability of FA and ADC analyses of CC1 for differentiating PSP from PD. The anterior corpus callosum corresponding to the prefrontal, premotor, and supplementary motor cortices is affected in PSP patients. This analysis can be an additional test for further confirmation of the diagnosis of PSP.

  5. Driving forces of change in environmental indicators an analysis based on divisia index decomposition techniques

    CERN Document Server

    González, Paula Fernández; Presno, Mª José

    2014-01-01

    This book addresses several index decomposition analysis methods to assess progress made by EU countries in the last decade in relation to energy and climate change concerns. Several applications of these techniques are carried out in order to decompose changes in both energy and environmental aggregates. In addition to this, a new methodology based on classical spline approximations is introduced, which provides useful mathematical and statistical properties. Once a suitable set of determinant factors has been identified, these decomposition methods allow the researcher to quantify the respec...
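
    Decompositions of this family are most often computed with the logarithmic-mean Divisia index (LMDI), whose additive form attributes the change in an aggregate exactly to its driving factors. A minimal single-sector sketch with hypothetical Kaya-style figures (not data from the book):

    import math

    def logmean(a, b):
        # Logarithmic mean L(a, b) = (a - b) / ln(a / b), with L(a, a) = a.
        return (a - b) / math.log(a / b) if a != b else a

    # Year 0 and year T: activity Q, energy intensity I = E/Q, carbon factor F = C/E
    Q0, I0, F0 = 100.0, 0.50, 2.0          # emissions C0 = Q0*I0*F0 = 100.0
    QT, IT, FT = 120.0, 0.42, 1.9          # emissions CT = 95.76

    C0, CT = Q0 * I0 * F0, QT * IT * FT
    L = logmean(CT, C0)
    effects = {
        "activity":  L * math.log(QT / Q0),
        "intensity": L * math.log(IT / I0),
        "carbon":    L * math.log(FT / F0),
    }
    print(effects)
    print("sum of effects:", sum(effects.values()), "actual change:", CT - C0)

    Because the logarithmic-mean weights are exact, the three effects sum to the actual change with no residual, which is the property that makes LMDI popular for this kind of analysis.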

  6. Developing techniques for cause-responsibility analysis of occupational accidents.

    Science.gov (United States)

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and the role of the groups involved in work-related accidents. The study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting, randomly selected from all work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel to determine the responsible groups and their responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for determining detailed lists of tasks, responsibilities, and their rates, and therefore for preventing work-related accidents by focusing on the responsible groups' duties.

  7. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system, which uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the response of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analyzing the TLD response were investigated and compared for dose values in this interval. These techniques included thermal pre-treatment and different glow-curve analysis methods. The results showed the necessity of developing specific software that permits automatic background subtraction from the glow curve of each dosemeter. This software was developed and has been tested. Preliminary results showed that the software increases the response reproducibility. (author)

  8. Microfluidic IEF technique for sequential phosphorylation analysis of protein kinases

    Science.gov (United States)

    Choi, Nakchul; Song, Simon; Choi, Hoseok; Lim, Bu-Taek; Kim, Young-Pil

    2015-11-01

    Sequential phosphorylation by protein kinases plays an important role in signal transduction, protein regulation, and metabolism in living cells. Analysis of these phosphorylation cascades will provide new insights into their physiological roles in many biological processes. Unfortunately, existing methods are limited in their ability to analyze cascade activity. We therefore propose a microfluidic isoelectric focusing technique (μIEF) for the analysis of cascade activity. Using this technique, we show that the sequential phosphorylation of a peptide by two different kinases can be successfully detected on a microfluidic chip. An inhibition assay for kinase activity and an analysis of a real sample have also been conducted. The results indicate that μIEF is an excellent means for studying phosphorylation cascade activity.

  9. Nondestructive analysis of oil shales with PGNAA technique

    International Nuclear Information System (INIS)

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K show generally good linear proportions of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used where the response curve was not linear, but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improving neutron self-moderation in sample boxes of rich oil shales, as compared to poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed gradually with larger and larger amounts of water

  10. Nondestructive analysis of oil shales with PGNAA technique

    Energy Technology Data Exchange (ETDEWEB)

    Maly, J.; Bozorgmanesh, H.

    1984-02-01

    The feasibility of nondestructive analysis of oil shales using the prompt gamma neutron activation analysis (PGNAA) technique was studied. The PGNAA technique, developed originally for continuous analysis of coal on the belt, was applied to the analysis of eight oil-shale samples, containing between 9 and 60 gallons of oil per ton and 0.8% to 3.4% hydrogen. The PGNAA technique was modified using four neutron moderation conditions: non-moderated neutrons; non-moderated and partially moderated neutrons reflected from a water box behind the source; neutrons moderated in a water box behind and in front of the source; and neutrons strongly moderated in a polyethylene block placed in front of the source and with reflected neutrons from a water box behind the source. The studied oil shales were measured in their aluminum or wooden (masonite) boxes. The obtained Ge-Li spectra were processed by LSI-11/23 computer, using the modified programs previously developed by SAI for continuous coal analysis. The results of such processing (the peak areas for several gamma lines) were corrected and plotted against the weight percent of each analyzed element (from the chemical analysis). Response curves developed for H, C, N, S, Na, Mg, Al, Si, Ti, Ca, Fe and K show generally good linear proportions of peak area to the weight percent of the element. For hydrogen determination, NMD conditions had to be used where the response curve was not linear, but followed a curve whose slope rose with hydrogen concentration. This effect is caused by improving neutron self-moderation in sample boxes of rich oil shales, as compared to poor self-moderation of neutrons in very lean oil shales. The moisture in oil shales was measured by microwave absorption technique in small masonite boxes. This method was calibrated four times using oil-shale samples mixed gradually with larger and larger amounts of water.

  11. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of literature, the present paper attempts to provide a concise and schematic portrayal of generally followed data analysis techniques in the field of services quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher order multivariate techniques viz. confirmatory factor analysis, structural equation modeling etc. to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as mean, t-Test, ANOVA and correlation. The marked shift in orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in social sciences in general, and those working in the area of service quality in particular, as well as to the growing demands of reviewers of journals. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  12. A Review on Clustering and Outlier Analysis Techniques in Datamining

    Directory of Open Access Journals (Sweden)

    S. Koteeswaran

    2012-01-01

    Full Text Available Problem statement: The modern world relies on advanced computerized techniques to use physical, biological and social systems more effectively. A great amount of data is generated by such systems, leading to a paradigm shift from classical modeling and analyses based on basic principles to developing models and the corresponding analyses directly from data. The ability to extract useful hidden knowledge from these data and to act on that knowledge is becoming increasingly important in today's competitive world. Approach: The entire process of applying a computer-based methodology, including new techniques, for discovering knowledge from data is called data mining. Data mining has two primary goals: prediction and classification. The large data sets involved in data mining require clustering and outlier analysis for reduction as well as for retaining only useful data. Results: This study reviews implementation techniques and recent research on clustering and outlier analysis. Conclusion: The study provides a review of clustering and outlier analysis techniques, and the discussion will guide researchers in improving their research directions.
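
    A minimal sketch of the two tasks the review pairs, using k-means for clustering and a simple distance-to-centroid rule for outlier detection (synthetic data; scikit-learn assumed available; real outlier analysis offers many stronger alternatives):

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    data = np.vstack([rng.normal(0, 1, (200, 2)),     # cluster 1
                      rng.normal(6, 1, (200, 2)),     # cluster 2
                      [[20.0, 20.0]]])                # one gross outlier

    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(data)
    # Distance of each point to its assigned centroid; flag points beyond
    # mean + 3*std as outliers.
    dist = np.linalg.norm(data - km.cluster_centers_[km.labels_], axis=1)
    outliers = np.where(dist > dist.mean() + 3 * dist.std())[0]
    print("flagged point indices:", outliers)         # expect index 400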

  13. Treatment planning of adhesive additive rehabilitations: the progressive wax-up of the three-step technique.

    Science.gov (United States)

    Vailati, Francesca; Carciofo, Sylvain

    2016-01-01

    A full-mouth rehabilitation should be correctly planned from the start by using a diagnostic wax-up to reduce the potential for remakes, increased chair time, and laboratory costs. However, determining the clinical validity of an extensive wax-up can be complicated for clinicians who lack the experience of full-mouth rehabilitations. The three-step technique is a simplified approach that has been developed to facilitate the clinician's task. By following this technique, the diagnostic wax-up is progressively developed to the final outcome through the interaction between patient, clinician, and laboratory technician. This article provides guidelines aimed at helping clinicians and laboratory technicians to become more proactive in the treatment planning of full-mouth rehabilitations, by starting from the three major parameters of incisal edge position, occlusal plane position, and the vertical dimension of occlusion.

  14. Treatment planning of adhesive additive rehabilitations: the progressive wax-up of the three-step technique.

    Science.gov (United States)

    Vailati, Francesca; Carciofo, Sylvain

    2016-01-01

    A full-mouth rehabilitation should be correctly planned from the start by using a diagnostic wax-up to reduce the potential for remakes, increased chair time, and laboratory costs. However, determining the clinical validity of an extensive wax-up can be complicated for clinicians who lack the experience of full-mouth rehabilitations. The three-step technique is a simplified approach that has been developed to facilitate the clinician's task. By following this technique, the diagnostic wax-up is progressively developed to the final outcome through the interaction between patient, clinician, and laboratory technician. This article provides guidelines aimed at helping clinicians and laboratory technicians to become more proactive in the treatment planning of full-mouth rehabilitations, by starting from the three major parameters of incisal edge position, occlusal plane position, and the vertical dimension of occlusion. PMID:27433550

  15. Progress in development of a technique to measure the axial thermal diffusivity of irradiated reactor fuel pellets

    Energy Technology Data Exchange (ETDEWEB)

    Hutcheon, R.; Mouris, J. [Atomic Energy of Canada Limited, Chalk River, Ontario (Canada)

    1997-07-01

    A new technique, based on pulsed high-energy ({approx}12 MeV) electron-beam heating, is being developed for measuring the thermal diffusivity of irradiated reactor fuel. This paper reports on the continuing development work required to establish a practical technique for irradiated materials at high temperatures (1000 to 1500 deg C). This includes studies of the influence of thermocouple surface contact resistance, of the sheath and the pellet mounting system, of internal cracks in the pellet, and of the chamber atmosphere. Calibrations with a NIST standard and measurements on fresh UO{sub 2} were done. Progress during the past year in these various areas is reviewed, and initial experiments with a specimen of high-burnup CANDU fuel are discussed. (author)
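
    The simplest data reduction for pulse-heating experiments of this kind is Parker's flash formula, alpha = 0.1388 L^2 / t_half, where t_half is the rear-face half-rise time. A minimal sketch under ideal adiabatic assumptions; the corrections the report actually studies (contact resistance, sheath, cracks, chamber atmosphere) are deliberately ignored:

    import numpy as np

    def parker_diffusivity(t, temp, thickness):
        # alpha = 0.1388 * L^2 / t_half, with t_half the time for the rear
        # face to reach half of its maximum temperature rise (monotonic rise
        # assumed so np.interp is applicable).
        rise = temp - temp[0]
        t_half = np.interp(0.5 * rise.max(), rise, t)
        return 0.1388 * thickness ** 2 / t_half

    # Hypothetical rear-face record for a 1.5 mm thick pellet
    t = np.linspace(0.0, 2.0, 500)                 # s
    temp = 20.0 + 5.0 * (1 - np.exp(-t / 0.15))    # deg C, illustrative shape
    print(f"alpha ~ {parker_diffusivity(t, temp, 1.5e-3):.2e} m^2/s")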

  16. Learning Progressions and Teaching Sequences: A Review and Analysis

    Science.gov (United States)

    Duschl, Richard; Maeng, Seungho; Sezen, Asli

    2011-01-01

    Our paper is an analytical review of the design, development and reporting of learning progressions and teaching sequences. Research questions are: (1) what criteria are being used to propose a "hypothetical learning progression/trajectory" and (2) what measurements/evidence are being used to empirically define and refine a "hypothetical learning…

  17. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noises is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.
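
    The correlation computation in question is typically a cross-correlation evaluated through FFTs, with the alignment taken at the correlation peak; noise biases that peak, which is the systematic error the paper analyzes. A minimal sketch of the alignment step on synthetic images (the paper's correction formula is not reproduced here):

    import numpy as np

    def align_shift(ref, img):
        # Cross-correlation via FFT; the peak position gives the integer
        # (dy, dx) shift of img relative to ref.
        cc = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(ref))).real
        dy, dx = np.unravel_index(cc.argmax(), cc.shape)
        n, m = cc.shape
        return (dy if dy <= n // 2 else dy - n,
                dx if dx <= m // 2 else dx - m)

    rng = np.random.default_rng(2)
    ref = rng.random((64, 64))
    img = np.roll(ref, (3, -5), axis=(0, 1)) + rng.normal(0, 0.5, (64, 64))
    print(align_shift(ref, img))       # noise-free answer would be (3, -5)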

  18. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle induced x-ray emission and synchrotron produced x-ray fluorescence, are also briefly discussed.

  19. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires; such failures generally do not represent a great risk to personnel. Repairs needed to maintain the reliability of these vessels might require extensive interruption to operation, which in turn considerably impacts the profitability of the unit. Therefore the condition, progression, and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D Laser Scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspecting the equipment to generate maintenance or inspection recommendations, and comparing the results with previous inspections and baseline data. Until recently, coke drum structural analysis has traditionally been performed by analyzing Stress Concentration Factors (SCF) through Finite Element Analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, a new strain analysis technique, PSI (Plastic Strain Index), was developed. This method, which is based on the API 579/ASME FFS standard failure limit, represents the state of the art in coke drum bulging severity assessment and has an excellent correlation with failure history. (author)

  20. Research progress on the brewing techniques of new-type rice wine.

    Science.gov (United States)

    Jiao, Aiquan; Xu, Xueming; Jin, Zhengyu

    2017-01-15

    As a traditional alcoholic beverage, Chinese rice wine (CRW), with its high nutritional value and unique flavor, has been popular in China for thousands of years. Although traditional production methods had been used without change for centuries, numerous technological innovations in the last decades have greatly impacted the CRW industry. However, reviews of the technological progress in this field are relatively few. This article aims to provide a brief summary of recent developments in new brewing technologies for making CRW. Based on a comparison between the conventional methods and the innovative technologies of CRW brewing, three principal aspects are summarized: innovation in raw material pretreatment, optimization of fermentation, and reform of sterilization technology. Furthermore, by comparing the advantages and disadvantages of these methods, various issues related to the prospects of the CRW industry are addressed. PMID:27542505

  1. Reduced Incidence of Slowly Progressive Heymann Nephritis in Rats Immunized With a Modified Vaccination Technique

    Directory of Open Access Journals (Sweden)

    Arpad Z. Barabas

    2006-01-01

    Full Text Available A slowly progressive Heymann nephritis (SPHN) was induced in three groups of rats by weekly injections of a chemically modified renal tubular antigen in an aqueous medium. A control group of rats received the chemically unmodified version of the antigen in an aqueous solution. One group of SPHN rats was pre- and post-treated with weekly injections of MICs (immune complexes [ICs] made up of sonicated ultracentrifuged [u/c] rat kidney fraction 3 [rKF3] antigen and rarKF3 IgM antibodies specific against the antigen, at slight antigen excess). One group of SPHN rats was post-treated with MICs 3 weeks after the induction of the disease, and one group of SPHN animals received no treatment. The control group of rats received pre- and post-treatment with sonicated u/c rKF3.

  2. New Progress in High-Precision and High-Resolution Seismic Exploration Technique in Coal Industry of China

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    In the past twenty years, the proportion of coal in primary-energy consumption in China has generally been between 71.3% and 76.5%. The output of coal was 1.374 billion tons in 1996 and 1.21 billion tons in 1998, which ranked first in the world. Coal is now mined mainly by mechanized methods in China, planned to reach 80% in major State-owned coal mines in 2000 according to government planning (Li et al., 1998; Tang Dejin, 1998). Compared with the USA and Australia, China has more complex coal geological structures. Building on high-resolution seismic techniques in coal exploration, a new high-precision, high-resolution (2-D and 3-D) seismic technique has been developed for detecting small geological structures in coal mine construction and production, to meet the needs of large-scale popularization of mechanized coal mining in China. The technique is low in cost and requires a relatively short period of exploration, with high precision and a wide range of applications. In the mid-1980s it began to be used in pre-mining coal exploration on a trial basis, and its application peaked in the 1990s; it has made significant progress in providing high-precision geological results for the construction and production of China's coal industry, and is still in the ascendant.

  3. Recent Progress in Evaluation Techniques and Device Applications of Organic and Composite Thin Films

    Science.gov (United States)

    Kato, Keizo; Shinbo, Kazunari; Okamoto, Tetsushi; Aoki, Yusuke; Iechi, Hiroyuki

    Evaluation techniques and device applications of organic and composite thin films are described. One of the evaluation techniques is surface plasmon resonance (SPR) spectroscopy. It has become a widely accepted method for the characterization and study of ultrathin films, interfaces, and kinetic processes at surfaces, and it has been investigated for applications in SPR sensors and novel plasmonic devices. The properties, functionalization, and various applications of hybrid materials, nanocomposites, and nanoparticles are also introduced, and their application to industry is mentioned. Moreover, an electronic device application of a monolithic organic logic circuit using a stacked structure of two organic static induction transistors is explained. The advantages of this novel device structure are the controllability of the operational characteristics and a simple device fabrication process.

  4. Types of Maize Virus Diseases and Progress in Virus Identification Techniques in China

    Institute of Scientific and Technical Information of China (English)

    Cui Yu; Zhang Ai-hong; Ren Ai-jun; Miao Hong-qin

    2014-01-01

    More than 40 maize viral diseases have been reported worldwide. Six of them have reportedly occurred in China: maize rough dwarf disease, maize dwarf mosaic disease, maize streak dwarf disease, maize crimson leaf disease, maize wallaby ear disease, and corn lethal necrosis disease. This paper reviews their occurrence and distribution as well as virus identification techniques, in order to provide a basis for virus identification and diagnosis in corn production.

  5. Progress report on reversal and substitute element technique for thread calibration on CMMs

    DEFF Research Database (Denmark)

    Carmignato, Simone; Larsen, Erik; Sobiecki, Rene;

    This report is made as a part of the project EASYTRAC, an EU project under the programme Competitive and Sustainable Growth (Contract No. G6RD-CT-2000-00188), coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low-uncertainty calibrations on coordinate measuring machines. The partners involved in this task include ... - Germany and Tampere University of Technology (TUT) - Finland. The present report describes the feasibility and preliminary results of applying a reversal and substitute element technique to thread calibration.

  6. A Comparison of Imaging Techniques to Monitor Tumor Growth and Cancer Progression in Living Animals

    Directory of Open Access Journals (Sweden)

    Anne-Laure Puaux

    2011-01-01

    Full Text Available Introduction and Purpose. Monitoring solid tumor growth and metastasis in small animals is important for cancer research. Noninvasive techniques make longitudinal studies possible, require fewer animals, and have greater statistical power. Such techniques include FDG positron emission tomography (FDG-PET), magnetic resonance imaging (MRI), and optical imaging, comprising bioluminescence imaging (BLI) and fluorescence imaging (FLI). This study compared the performance and usability of these methods in the context of mouse tumor studies. Methods. B16 tumor-bearing mice (n=4 for each study) were used to compare practicality and performance for small tumor detection and tumor burden measurement. Using RETAAD mice, which develop spontaneous melanomas, we examined the performance of MRI (n=6 mice) and FDG-PET (n=10 mice) for tumor identification. Results. Overall, BLI and FLI were the most practical techniques tested. Both BLI and FDG-PET identified small nonpalpable tumors, whereas MRI and FLI only detected macroscopic, clinically evident tumors. FDG-PET and MRI performed well in the identification of tumors in terms of specificity, sensitivity, and positive predictive value. Conclusion. Each of the four methods has different strengths that must be understood before selecting them for use.

  7. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU nuclear power plants (NPPs) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and noise databases corresponding to each plant (both Korean and foreign) were constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system for diagnosing NPP abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  8. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Charlton, William S

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.

  9. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels
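
    The inference pattern described, comparing a measured isotopic ratio against a precomputed database with a Bayesian rule, can be sketched in a few lines. All ratios and uncertainties below are invented for illustration; this is not the NOVA code or its Monteburns-derived library:

    import numpy as np

    burnup = np.linspace(1.0, 40.0, 40)          # GWd/tU grid of the database
    db_ratio = 0.02 + 0.005 * burnup             # hypothetical ratio vs burnup

    measured, sigma = 0.112, 0.004               # invented measurement
    # Gaussian likelihood of the measurement under each database entry,
    # combined with a flat prior over the grid.
    like = np.exp(-0.5 * ((measured - db_ratio) / sigma) ** 2)
    post = like / like.sum()

    mean = (burnup * post).sum()
    std = np.sqrt(((burnup - mean) ** 2 * post).sum())
    print(f"inferred burnup: {mean:.1f} +/- {std:.1f} GWd/tU")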

  10. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  11. Visual and statistical analysis of {sup 18}F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA were high, averaging 87.8% and 89.9% for visual analysis and 96.9% and 90.9% for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9% and specificity 100%. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)
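
    The agreement statistic quoted above, Fleiss' kappa, extends Cohen's kappa to more than two raters. A minimal sketch with synthetic ratings (5 raters, 6 scans, 3 diagnostic categories; statsmodels assumed available), not the study's data:

    import numpy as np
    from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

    # rows = scans, columns = raters, entries = assigned category (0, 1 or 2)
    ratings = np.array([[0, 0, 0, 0, 1],
                        [1, 1, 1, 1, 1],
                        [2, 2, 1, 2, 2],
                        [0, 0, 0, 0, 0],
                        [1, 1, 2, 1, 1],
                        [2, 2, 2, 2, 2]])
    table, _ = aggregate_raters(ratings)   # scans x categories count matrix
    print(f"Fleiss' kappa = {fleiss_kappa(table):.3f}")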

  12. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory, the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The computational cost involved when dealing with high-order/many-degree-of-freedom models can, however, be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods, focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
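
    The baseline against which such dynamic condensation methods are usually compared is static (Guyan) condensation, in which slave degrees of freedom are eliminated through the stiffness coupling alone. A minimal sketch on a small spring-mass chain (illustrative, not an example from the book):

    import numpy as np

    def guyan(K, M, masters):
        # Build T = [I; -Kss^{-1} Ksm] (rows ordered as in K) and project.
        ndof = K.shape[0]
        slaves = np.setdiff1d(np.arange(ndof), masters)
        T = np.zeros((ndof, len(masters)))
        T[masters, np.arange(len(masters))] = 1.0
        T[np.ix_(slaves, np.arange(len(masters)))] = -np.linalg.solve(
            K[np.ix_(slaves, slaves)], K[np.ix_(slaves, masters)])
        return T.T @ K @ T, T.T @ M @ T

    # 4-DOF spring-mass chain; keep DOFs 0 and 3 as masters.
    k = 1000.0
    K = k * np.array([[ 2.0, -1.0,  0.0,  0.0],
                      [-1.0,  2.0, -1.0,  0.0],
                      [ 0.0, -1.0,  2.0, -1.0],
                      [ 0.0,  0.0, -1.0,  2.0]])
    M = np.eye(4)
    Kr, Mr = guyan(K, M, masters=np.array([0, 3]))
    print(np.round(Kr, 1))
    print(np.round(Mr, 3))

    Because the constraint ignores inertia at the slave DOFs, Guyan reduction is exact only for static loads; the dynamic and iterative variants the book covers correct precisely this limitation.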

  13. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  14. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple and accurate, and involved substantial time savings when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  15. Data processing technique for data measured in the MO image measurement system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Wongi; Lee, Hyo Yeon; Youm, Do Jun [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Yoo, Jaeu [Chonbuk National University, Jeonju (Korea, Republic of)

    2013-05-15

    We report a data processing technique for the magneto-optical (MO) image measurement system. The calibration procedure considers not only the perpendicular field but also the in-plane field. Current density and field profiles are obtained by the Biot-Savart law and an inversion method. We show an example for a (Gd,Y){sub 1}Ba{sub 2}Cu{sub 3}O{sub 7-δ}-BaZrO{sub 3} film that has nanorod pinning centers tilted about 13 degrees from the c-axis.

  16. Analysis of questioning technique during classes in medical education

    Directory of Open Access Journals (Sweden)

    Cho Young

    2012-06-01

    Full Text Available Background Questioning is one of the essential techniques used by lecturers to make lectures more interactive and effective. This study surveyed the perception of questioning techniques by medical school faculty members and analyzed how the questioning technique is used in actual classes. Methods Data on the perceptions of the questioning skills used during lectures were collected using a self-questionnaire for faculty members (N = 33) during the second semester of 2008. The questionnaire consisted of 18 items covering the awareness and characteristics of questioning skills. Recorded video tapes were used to observe the faculty members' questioning skills. Results Most faculty members regarded the questioning technique during classes as important and expected positive outcomes in terms of the students' participation in class, concentration in class, and understanding of the class contents. In the 99 classes analyzed, the median number of questions per class was 1 (range 0-29). Among them, 40 classes (40.4%) did not use questioning techniques. The frequency of questioning per lecture was similar regardless of the faculty members' perception. On the other hand, the faculty members perceived their usual wait time after a question to be approximately 10 seconds, compared to only 2.5 seconds as measured from video analysis. More lecture-experienced faculty members tended to ask more questions in class. Conclusions There were some discrepancies regarding the questioning technique between the faculty members' perceptions and reality, even though they had positive opinions of the technique. Questioning skills during lectures need to be emphasized to faculty members.

  17. Pulsed Photonuclear Assessment (PPA) Technique: CY 04 Year-end Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    J.L. Jones; W.Y. Yoon; K.J. Haskell; D.R. Norman; J.M. Zabriskie; J.W. Sterbentz; S.M. Watson; J.T. Johnson; B.D. Bennett; R.W. Watson; K. L. Folkman

    2005-05-01

    Idaho National Laboratory (INL), along with Los Alamos National Laboratory (LANL) and Idaho State University's Idaho Accelerator Center (IAC), is developing an electron accelerator-based, photonuclear inspection technology for the detection of smuggled nuclear material within air-, rail-, and especially maritime-cargo transportation containers. This CY04 report describes the latest developments and progress in the development of the Pulsed Photonuclear Assessment (PPA) nuclear material inspection system, such as: (1) the identification of an optimal range of electron beam energies for interrogation applications, (2) the development of a new “cabinet safe” electron accelerator (i.e., Varitron II) to assess “cabinet safe-type” operations, (3) the numerical and experimental validation of the responses of nuclear materials placed within selected cargo configurations, (4) the fabrication and utilization of Calibration Pallets for inspection technology performance verification, (5) the initial technology integration of basic radiographic “imaging/mapping” with induced neutron and gamma-ray detection, (6) the characterization of electron beam-generated photon sources for optimal performance, (7) the development of experimentally determined Receiver-Operator-Characterization curves, and (8) several other system component assessments. This project is supported by the Department of Homeland Security and is a technology component of the Science & Technology Active Interrogation Portfolio entitled “Photofission-based Nuclear Material Detection and Characterization.”

  18. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

    Full Text Available Calcium hardness of water samples has been determined using a method based upon the energy dispersive X-ray fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.
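
    Quantification in EDXRF work of this kind usually runs through a calibration line fitted to standards of known concentration. A minimal least-squares sketch; the count values below are invented, not the paper's measurements:

    import numpy as np

    conc = np.array([0.1, 1.0, 10.0, 50.0, 100.0])           # ppm Ca standards
    counts = np.array([12.0, 95.0, 1010.0, 4950.0, 9900.0])  # invented net peak counts

    slope, intercept = np.polyfit(conc, counts, 1)           # linear response assumed

    def ca_ppm(net_counts):
        # Invert the calibration line to get concentration from counts.
        return (net_counts - intercept) / slope

    print(f"sample at 2500 counts -> {ca_ppm(2500.0):.1f} ppm Ca")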

  19. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, in particular the use of linear programming. These techniques were then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small-scale version of the system was modeled, variables were identified, data were gathered, and comparisons were made between actual and theoretical data.
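
    A toy version of the linear-programming idea, allocating limited relay contact time among competing users to maximize returned data, can be written directly with scipy; the numbers and constraints are invented and far simpler than the actual TDRSS model:

    import numpy as np
    from scipy.optimize import linprog

    rate = np.array([300.0, 150.0, 50.0])   # Mbit returned per minute, per user
    c = -rate                               # linprog minimizes, so negate
    A_ub = [[1.0, 1.0, 1.0]]                # total relay contact time constraint
    b_ub = [600.0]                          # minutes available per day
    bounds = [(60.0, 400.0)] * 3            # per-user min/max contact time

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    print("minutes per user:", res.x, "total Mbit:", -res.fun)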

  20. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 {mu}m particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  1. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
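
    The core of the method is a frequency-domain division by the system transfer function. The sketch below illustrates the general idea on a synthetic first-order thermal response; the impulse response, noise level, and damping constant are assumptions, not the calorimeter model of the paper.

```python
# Sketch of transfer-function deconvolution in the frequency domain.
# The impulse response and synthetic signals are invented for
# illustration; the paper's calorimeter model is not reproduced here.
import numpy as np

dt = 0.01                                        # sampling interval [s]
t = np.arange(0, 10, dt)
flux_true = np.exp(-((t - 3.0) / 0.5) ** 2)      # assumed input pulse

# Assume a first-order thermal response as the system impulse response.
tau = 1.2
h = np.exp(-t / tau) / tau * dt                  # discretized impulse response
temp = np.convolve(flux_true, h)[: t.size]       # "measured" rear-side signal
temp += 0.001 * np.random.default_rng(0).normal(size=t.size)  # sensor noise

# Deconvolve by dividing FFTs; Tikhonov-style damping controls the noise
# amplification that plain division would cause at high frequencies.
H = np.fft.rfft(h)
eps = 1e-3 * np.max(np.abs(H)) ** 2
flux_rec = np.fft.irfft(np.fft.rfft(temp) * np.conj(H) / (np.abs(H) ** 2 + eps),
                        n=t.size)
```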

  3. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
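
    One ingredient of such a stochastic failure analysis is the random fiber strength itself. The toy sketch below, with invented Weibull parameters, draws per-fiber strengths and evaluates the strength of an equal-load-sharing bundle; it stands in for, but does not reproduce, the fiber-level damage model used inside MAC/GMC.

```python
# Toy illustration of stochastic fiber strength: draw per-fiber strengths
# from a Weibull distribution and find the failure load of a parallel
# bundle with global load sharing. Parameters are invented, not taken
# from the FEAMAC/MAC-GMC study.
import numpy as np

rng = np.random.default_rng(42)
n_fibers = 1000
m, sigma0 = 8.0, 3.5e3                    # Weibull modulus and scale [MPa]
strengths = sigma0 * rng.weibull(m, size=n_fibers)

# Equal load sharing: when the applied per-fiber stress reaches the i-th
# weakest strength, (n - i) fibers still survive and carry that stress.
s_applied = np.sort(strengths)
survivors = n_fibers - np.arange(n_fibers)
bundle_load = s_applied * survivors        # total load carried at each step
print("bundle strength per fiber [MPa]:", bundle_load.max() / n_fibers)
```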

  4. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    Science.gov (United States)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  5. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Shahid Ali

    2009-04-01

    Full Text Available This research paper explores and compares different modeling and analysis techniques and then examines the model order reduction approach and its significance. The traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. They are also used for analyzing linear and nonlinear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a unique technique of generating the genetic design from the tree-structured transfer function obtained from the Bond Graph. This research work combines bond graphs for model representation with genetic programming for exploring different ideas on the design space; tree-structured transfer functions result from replacing typical bond graph elements with their impedance equivalents, specifying impedance laws for the bond graph multiports. The tree-structured form thus obtained from the Bond Graph is applied for generating the genetic tree. Application studies will identify key issues and their importance for advancing this approach towards becoming an effective and efficient design tool for synthesizing designs for electrical systems. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function with the conventional and Bond Graph methods are analyzed, and then an approach towards model order reduction is applied. The suggested algorithm and other known modern model order reduction techniques are applied to an 11th-order high-pass filter [1], with different approaches. The model order reduction technique developed in this paper has the least reduction error and, moreover, the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and Bond Graph methods are compared.
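
    For readers unfamiliar with model order reduction, the sketch below reduces a stand-in 11th-order state-space system by square-root balanced truncation using scipy; the random system matrices are placeholders, since the filter coefficients of [1] are not given here, and balanced truncation is one standard method rather than the paper's specific algorithm.

```python
# Hedged sketch of model order reduction by square-root balanced
# truncation. A random stable 11th-order system stands in for the
# high-pass filter of [1].
import numpy as np
from scipy import linalg

rng = np.random.default_rng(1)
n, k = 11, 4                                    # full and reduced order
A = rng.normal(size=(n, n))
A -= (np.linalg.eigvals(A).real.max() + 1.0) * np.eye(n)   # make stable
B = rng.normal(size=(n, 1))
C = rng.normal(size=(1, n))

# Gramians: A Wc + Wc A^T + B B^T = 0 and A^T Wo + Wo A + C^T C = 0.
Wc = linalg.solve_continuous_lyapunov(A, -B @ B.T)
Wo = linalg.solve_continuous_lyapunov(A.T, -C.T @ C)

# Square-root algorithm: Wc = S S^T, Wo = R^T R; the singular values of
# R S are the Hankel singular values, and small ones are truncated.
S = np.linalg.cholesky(Wc)
R = np.linalg.cholesky(Wo).T
U, hsv, Vt = np.linalg.svd(R @ S)
T = S @ Vt.T[:, :k] / np.sqrt(hsv[:k])          # right projection
Ti = (U[:, :k] / np.sqrt(hsv[:k])).T @ R        # left projection, Ti @ T = I
Ar, Br, Cr = Ti @ A @ T, Ti @ B, C @ T          # reduced k-th order model
print("Hankel singular values:", hsv.round(4))
```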

  6. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    Directory of Open Access Journals (Sweden)

    Alexander Hexemer

    2015-01-01

    Full Text Available The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  7. Techniques for getting the most from an evaluation: Review of methods and results for attributing progress, non-energy benefits, net to gross, and cost-benefit

    International Nuclear Information System (INIS)

    As background for several evaluation and attribution projects, the authors conducted research on best practices in a few key areas of evaluation. We focused on techniques used in measuring market progress, enhanced techniques in attributing net energy impacts, and examining omitted program effects, particularly net non-energy benefits. The research involved a detailed literature review, interviews with program managers and evaluators across the US, and refinements of techniques used by the authors in conducting evaluation work. The object of the research was to uncover successful (and unsuccessful) approaches being used for key aspects of evaluation work. The research uncovered areas of tracking that are becoming more commonly used by agencies to assess progress in the market. In addition, detailed research by the authors on a number of impact and attribution evaluations has also led to recommendations on key practices that we believe comprise elements of best practices for assessments of attributable program effects. Specifically, we have identified a number of useful steps to improve the attribution of impacts to program interventions. Information on techniques for attribution/causality work for a number of programs is presented, including market transformation programs that rely on marketing, advertising, training, and mid-stream incentives and work primarily with a network of participating mid-market actors. The project methods and results are presented and include: theory-based evaluation, indicators, and hypothesis testing; enhanced measurement of free riders, spillover, and other effects, and attribution of impacts using distributions and ranges of measure and intervention impacts, rather than less reliable point estimates; attribution of program-induced non-energy benefits; net-to-gross, benefit-cost analysis, and incorporation of scenario/risk analysis of results; and comparison of net-to-gross results across program types to explore patterns.

  8. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature and anonymity abuses in cyberspace, it is difficult to trace criminal identities in cybercrime investigation. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG seeded GA based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed feature and technique are evaluated on a real world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, it has been shown that IGAE is more scalable in terms of the number of authors than author group level based methods.
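
    A minimal sketch of the character n-gram representation, paired with the SVM baseline the authors compare against, might look as follows (scikit-learn assumed; the two-review corpus is invented, and the IGAE ensemble itself is not reproduced):

```python
# Hedged sketch of writeprint features: variable-length character
# n-grams feeding a linear SVM, the baseline named in the abstract.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = ["I loved this blender, works great...",    # toy corpus,
         "Terrible battery life, would not buy."]   # not the Amazon data
authors = ["author_A", "author_B"]

model = make_pipeline(
    TfidfVectorizer(analyzer="char", ngram_range=(2, 4)),  # char 2-4 grams
    LinearSVC())
model.fit(texts, authors)
print(model.predict(["works great, loved it"]))
```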

  9. Progress in bionic information processing techniques for an electronic nose based on olfactory models

    Institute of Scientific and Technical Information of China (English)

    LI Guang; FU Jun; ZHANG Jia; ZHENG JunBao

    2009-01-01

    As a novel bionic analytical technique, an electronic nose, inspired by the mechanism of the biological olfactory system and integrated with modern sensing technology, electronic technology and pattern recognition technology, has been widely used in many areas. Moreover, recent basic research findings in biological olfaction combined with computational neuroscience promote its development both in methodology and application. In this review, the basic information processing principles of biological olfaction and artificial olfaction are summarized and compared, and four olfactory models and their applications to electronic noses are presented. Finally, a chaotic olfactory neural network is detailed, and the utilization of several biologically oriented learning rules and its spatiotemporal dynamic properties for electronic noses are discussed. The integration of various phenomena of biological olfaction and their mechanisms into an electronic nose context for information processing will not only make electronic noses more bionic, but also allow them to perform better than conventional methods. However, many problems still remain, which should be solved by further cooperation between theorists and engineers.

  10. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e. the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
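
    As a minimal illustration of a hypothesis posterior under uncertain evidence, the sketch below applies Jeffrey's rule to a two-node network; the probabilities are invented, and this is one simple reading of soft evidence rather than the paper's specific augmentation technique.

```python
# Minimal sketch of a hypothesis posterior under uncertain evidence in a
# two-node network H -> E, using Jeffrey's rule: mix the hard-evidence
# posteriors with the user-specified distribution over E's states.
# All probabilities are invented for illustration.
import numpy as np

p_h = np.array([0.3, 0.7])                  # prior P(H)
p_e_given_h = np.array([[0.9, 0.1],         # P(E|H): rows H, cols E
                        [0.2, 0.8]])

# Hard-evidence posteriors P(H | E=e) via Bayes' rule.
joint = p_h[:, None] * p_e_given_h          # P(H, E)
post_given_e = joint / joint.sum(axis=0)    # each column sums to 1

# Uncertain evidence: the user believes E is in state 0 with prob 0.6.
soft = np.array([0.6, 0.4])
p_h_post = post_given_e @ soft              # Jeffrey's rule mixture
print("P(H | uncertain evidence) =", p_h_post)
```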

  11. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    One of the difficulties that goal-oriented requirements analyses encounter is that the efficiency of the goal refinement is based on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systemized analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integration of the goal and the problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  12. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail, and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3 sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified as follows: (A) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data; (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
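
    The RSS combination itself is simple arithmetic; the sketch below, with invented deviation vectors, shows how single-error-source deviations are root-sum-squared and assembled into a covariance matrix in the spirit of the LEA processor.

```python
# Sketch of an RSS combination in the style of a LEA post processor:
# each row of `dev` is the trajectory deviation at MECO from one 3-sigma
# single-error-source run (all values invented for illustration).
import numpy as np

dev = np.array([[120.0,  0.4, -2.1],   # error source 1: [pos, vel, angle]
                [-80.0,  0.9,  1.5],   # error source 2
                [ 45.0, -0.2,  0.7]])  # error source 3

rss = np.sqrt((dev ** 2).sum(axis=0))  # root-sum-square dispersions
cov = dev.T @ dev                      # sum of per-source outer products;
                                       # its diagonal equals rss**2
print("3-sigma RSS dispersions:", rss)
print("covariance matrix:\n", cov)
```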

  13. Monitoring the progress of build-up formation in fatty alcohol plant pipelines using gamma-ray scanning techniques

    International Nuclear Information System (INIS)

    A study was conducted to monitor the progress of material build-up formation in fatty alcohol pipelines using gamma-ray absorption techniques. The investigation was performed periodically at a few selected locations defined as critical areas. Before performing a scan, the intensity of the gamma rays through the clean pipe must be determined as a reference. From the gamma-ray absorption principle, the initial radiation intensity and the intensity after the beam passes through a material differ, so the thickness of the build-up in the pipeline can be determined. As a result, based on this early information about the actual condition of the build-up formation, a more effective maintenance schedule can be planned. In this way, the maintenance cost due to build-up formation can be minimised as far as possible. (Author)
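
    The thickness determination follows directly from the Beer-Lambert law, I = I0·exp(-μx). A minimal sketch, with an assumed attenuation coefficient and invented count rates:

```python
# Beer-Lambert inversion behind a gamma scan: with a clean-pipe reference
# intensity I0 and a later measurement I, the build-up thickness follows
# from I = I0 * exp(-mu * x). The value of mu is assumed for illustration.
import math

mu = 0.25      # linear attenuation coefficient of the deposit [1/cm], assumed
I0 = 15400.0   # count rate through the clean pipe [counts/s]
I = 11900.0    # current count rate at the same location

x = math.log(I0 / I) / mu
print(f"estimated build-up thickness: {x:.2f} cm")
```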

  14. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  15. Comparative Analysis of Partial Occlusion Using Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    N.Nallammal

    2013-04-01

    Full Text Available This paper presents a comparison of partial occlusion handling using face recognition techniques, showing which technique produces the better total success rate. Partial-occlusion face recognition is especially useful for people part of whose face is scarred or defective and thus needs to be covered. Hence, either the top part/eye region or the bottom part of the face is recognized, respectively. The partial face information is tested with Principal Component Analysis (PCA), Non-negative Matrix Factorization (NMF), Local NMF (LNMF) and Spatially Confined NMF (SFNMF). The comparative results show a recognition rate of 95.17% with r = 80 by using SFNMF for the bottom face region. On the other hand, the eye region achieves 95.12% with r = 10 by using LNMF.

  16. Application of thermal analysis techniques in activated carbon production

    Science.gov (United States)

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  17. Gamma absorption technique in elemental analysis of composite materials

    International Nuclear Information System (INIS)

    Highlights: Application of gamma-ray absorption technique in elemental analysis. Determination of elemental composition of some bronze and gold alloys. Determination of some heavy elements in water. - Abstract: Expressions for calculating the elemental concentrations of composite materials based on a gamma absorption technique are derived. These expressions provide quantitative information about elemental concentrations of materials. Calculations are carried out for estimating the concentrations of copper and gold in some alloys of bronze and gold. The method was also applied for estimating the concentrations of some heavy elements in a water matrix, highlighting the differences with photon attenuation measurements. Theoretical mass attenuation coefficient values were obtained using the WinXCom program. A high-resolution gamma-ray spectrometer based on a high-purity germanium (HPGe) detector was employed to measure the attenuation of a strongly collimated monoenergetic gamma beam through samples.
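
    For a binary alloy, the derivation reduces to inverting the weight-fraction mixture rule for mass attenuation coefficients. A minimal sketch with placeholder coefficients (not WinXCom values):

```python
# Sketch of the kind of expression derived in the paper: for a binary
# alloy, the measured mass attenuation coefficient is the weight-fraction
# mixture of the element coefficients, so the gold fraction follows by
# inversion. Coefficient values are placeholders, not WinXCom output.
mu_cu, mu_au = 0.152, 0.194      # mass attenuation coefficients [cm^2/g]
mu_alloy = 0.178                 # measured value for the alloy

w_au = (mu_alloy - mu_cu) / (mu_au - mu_cu)   # mixture rule inverted
print(f"gold weight fraction: {w_au:.2%}, copper: {1 - w_au:.2%}")
```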

  18. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on the web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and making an action based on the ‘category’ of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper includes experimental work in which web log data is used. We have taken the web log data from the “NASA” web server, which is analyzed with “Web Log Explorer”. Web Log Explorer is a web usage mining tool which plays a vital role in carrying out this work.
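
    The preprocessing step amounts to parsing Common Log Format records such as those in the NASA logs. A minimal sketch, with a single representative log line:

```python
# Sketch of the preprocessing step on Common Log Format lines such as
# those in the NASA web server logs; the sample line is representative,
# not necessarily verbatim from the data set used in the paper.
import re
from collections import Counter

LOG_RE = re.compile(
    r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)')

lines = ['unicomp6.unicomp.net - - [01/Jul/1995:00:00:06 -0400] '
         '"GET /shuttle/countdown/ HTTP/1.0" 200 3985']

pages = Counter()
for line in lines:
    m = LOG_RE.match(line)
    if m:
        host, ts, method, url, status, size = m.groups()
        pages[url] += 1            # pattern discovery: request frequencies

print(pages.most_common(10))       # most requested pages
```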

  19. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take into account some requirements. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, due to the precious nature of artworks it is also necessary to use non-destructive methods, respecting the integrity of objects. It is for this reason that methods using radiations and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composites or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of both phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author)

  20. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  1. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium using a Metrohm 662 Photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliq. by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab
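
    The colorimetric channel rests on a linear calibration of detector response against concentration. A minimal sketch with invented calibration points, assuming numpy:

```python
# Sketch of a linear calibration for the colorimetric channel: fit
# detector response vs. uranium concentration, then invert for an
# unknown. The data points are invented, not the 410 nm measurements.
import numpy as np

conc = np.array([35.0, 100.0, 180.0, 270.0, 360.0])    # mg/ml standards
absorb = np.array([0.071, 0.198, 0.362, 0.541, 0.722])  # detector response

slope, intercept = np.polyfit(conc, absorb, 1)   # least-squares line
unknown_abs = 0.300
unknown_conc = (unknown_abs - intercept) / slope
print(f"estimated U(VI) concentration: {unknown_conc:.1f} mg/ml")
```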

  2. Progress in CTEQ/TEA global QCD analysis

    CERN Document Server

    Nadolsky, P M; Lai, H -L; Pumplin, J; Yuan, C -P

    2009-01-01

    We overview progress in the development of general-purpose CTEQ PDFs. The preprint is based on four talks presented by H.-L. Lai and P. Nadolsky at the 17th International Workshop on Deep Inelastic Scattering and Related Subjects (DIS 2009).

  3. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  4. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light of a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide until the point where the amplitude of surface waves is large enough to scatter out the light from the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and also statistical analysis of the jet disruption is possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  5. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  6. Multi-element study in aluminium by activation analysis technique

    International Nuclear Information System (INIS)

    Instrumental activation analysis is a relatively quick technique that helps determine the elemental composition of materials. It is used mainly for trace element determination, but in the case of major elements some considerations are necessary, since different nuclear reactions are carried out because the neutron flux is a mixture of thermal and fast neutrons. This can lead to misinterpretation of the presence of some elements, or to their erroneous quantification. This work describes the way a container piece with approximately 85% aluminium was analyzed. The elements Zn, Mn, Sb, Ga, Cu, Cl and Sm were determined. (Author)

  7. Acceleration of multivariate analysis techniques in TMVA using GPUs

    CERN Document Server

    Hoecker, A; Therhaag, J; Washbrook, A

    2012-01-01

    A feasibility study into the acceleration of multivariate analysis techniques using Graphics Processing Units (GPUs) will be presented. The MLP-based Artificial Neural Network method contained in the TMVA framework has been chosen as a focus for investigation. It was found that the network training time on a GPU was lower than for CPU execution as the complexity of the network was increased. In addition, multiple neural networks can be trained simultaneously on a GPU within the same time taken for single network training on a CPU. This could be potentially leveraged to provide a qualitative performance gain in data classification.

  8. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.
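
    A typical statistical workflow of this kind samples the uncertain inputs with Latin hypercube sampling and rank-correlates them with a figure of merit. The sketch below uses a stand-in algebraic model, not MELCOR; the parameter names and ranges are assumptions.

```python
# Sketch of the statistical workflow: sample uncertain model inputs with
# Latin hypercube sampling and rank-correlate them against a figure of
# merit. The "model" is a stand-in, not the MELCOR code.
import numpy as np
from scipy.stats import qmc, spearmanr

rng = np.random.default_rng(7)
sampler = qmc.LatinHypercube(d=2, seed=7)
u = sampler.random(n=200)                    # uniform samples in [0,1]^2

# Scale to assumed parameter ranges: an oxidation rate multiplier and
# a cladding failure temperature [K] (both hypothetical).
params = qmc.scale(u, l_bounds=[0.5, 2300.0], u_bounds=[2.0, 2700.0])

# Hypothetical response standing in for a hydrogen-production metric.
h2 = 300.0 * params[:, 0] - 0.1 * params[:, 1] + rng.normal(0, 5, 200)

for i, name in enumerate(["oxidation multiplier", "failure temperature"]):
    rho = spearmanr(params[:, i], h2).correlation
    print(f"{name}: Spearman rho = {rho:+.2f}")
```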

  9. Experimental and Automated Analysis Techniques for High-resolution Electrical Mapping of Small Intestine Slow Wave Activity

    OpenAIRE

    Angeli, Timothy R.; O'Grady, Gregory; Paskaranandavadivel, Niranchan; Jonathan C Erickson; Du, Peng; Pullan, Andrew J; Bissett, Ian P.; Cheng, Leo K

    2013-01-01

    Background/Aims Small intestine motility is governed by an electrical slow wave activity, and abnormal slow wave events have been associated with intestinal dysmotility. High-resolution (HR) techniques are necessary to analyze slow wave propagation, but progress has been limited by few available electrode options and laborious manual analysis. This study presents novel methods for in vivo HR mapping of small intestine slow wave activity. Methods Recordings were obtained from along the porcine...

  10. The Analysis of the Thematic Progression Patterns in "The Great Learning"

    Institute of Scientific and Technical Information of China (English)

    王利娜; 金俊淑

    2007-01-01

    This paper intends to introduce briefly the thematic progression patterns in Systemic-Functional Grammar, and then analyze their application in "The Great Learning", one of the classics of Confucius and his disciples. The analysis of the thematic progression patterns of "The Great Learning" is meaningful for both understanding and appreciating the text.

  11. Recovering prehistoric woodworking skills using spatial analysis techniques

    Science.gov (United States)

    Kovács, K.; Hanke, K.

    2015-08-01

    Recovery of ancient woodworking skills can be achieved by the simultaneous documentation and analysis of tangible evidence such as the geometric parameters of prehistoric hand tools or the fine morphological characteristics of well-preserved wooden archaeological finds. During this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed in this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from the spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these objects was also inferred, and these woodworking skills were quantified in the case of a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for related intangible heritage interpretations in the future.

  12. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, both on the individual and organizational level, is always accompanied by a search for others' opinions. With the tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter, a rich anthology of sentiments is available. This user-generated content can serve as a benefit to markets if the semantic orientations are deliberated. Opinion mining and sentiment analysis are the formalizations for studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for the use of the huge volume of opinionated data recorded. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.

  13. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the granite instantaneous rockburst process. Based on the PIV (Particle Image Velocimetry) technique, quantitative analysis of a rockburst is possible: images of tracer particles, displacement and strain fields can be obtained, and the debris trajectories described. According to the observation of on-site tests, the dynamic rockburst is actually a gas-solid high-speed flow process, caused by the interaction of rock fragments and the surrounding air. With the help of analysis of the high-speed video and PIV images, the granite rockburst failure process is shown to be composed of six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy of these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as: an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.
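
    The core PIV computation is an FFT-based cross-correlation between successive frames, whose peak gives the displacement. A minimal sketch on synthetic frames with a known shift:

```python
# Core PIV step as a sketch: estimate the displacement of a particle
# pattern between two frames from the peak of their FFT-based
# cross-correlation. Frames here are synthetic, not rockburst images.
import numpy as np

rng = np.random.default_rng(3)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, shift=(5, -3), axis=(0, 1))   # known shift

f1 = np.fft.fft2(frame1 - frame1.mean())
f2 = np.fft.fft2(frame2 - frame2.mean())
corr = np.fft.ifft2(np.conj(f1) * f2).real   # peak = shift of frame2 vs frame1

dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
# Map peak indices to signed shifts (wrap-around convention).
dy = dy - 64 if dy > 32 else dy
dx = dx - 64 if dx > 32 else dx
print("estimated displacement:", dy, dx)
```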

  14. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    at different operating conditions, and the application of metabolic engineering to process optimization is, therefore, expected mainly to have an impact on the improvement of processes where yield, productivity, and titer are important design factors, i.e., in the production of metabolites and industrial … enzymes. Despite the prospect of obtaining major improvement through metabolic engineering, this approach is, however, not expected to completely replace the classical approach to strain improvement: random mutagenesis followed by screening. Identification of the optimal genetic changes for improvement … analysis of pathways, and (5) kinetic modeling. In this article, these different techniques are discussed and their applications to the analysis of different processes are illustrated. (C) 1998 John Wiley & Sons, Inc.

  15. Validation of Design and Analysis Techniques of Tailored Composite Structures

    Science.gov (United States)

    Jegley, Dawn C. (Technical Monitor); Wijayratne, Dulnath D.

    2004-01-01

    Aeroelasticity is the relationship between the elasticity of an aircraft structure and its aerodynamics. This relationship can cause instabilities such as flutter in a wing. Engineers have long studied aeroelasticity to ensure such instabilities do not become a problem within normal operating conditions. In recent decades structural tailoring has been used to take advantage of aeroelasticity. It is possible to tailor an aircraft structure to respond favorably to multiple different flight regimes such as takeoff, landing, cruise, 2-g pull up, etc. Structures can be designed so that these responses provide an aerodynamic advantage. This research investigates the ability to design and analyze tailored structures made from filamentary composites. Specifically, the accuracy of tailored composite analysis must be verified if this design technique is to become feasible. To pursue this idea, a validation experiment has been performed on a small-scale filamentary composite wing box. The box is tailored such that its cover panels induce a global bend-twist coupling under an applied load. Two types of analysis were chosen for the experiment. The first is a closed-form analysis based on a theoretical model of a single-cell tailored box beam, and the second is a finite element analysis. The predicted results are compared with the measured data to validate the analyses. The comparison of results shows that the finite element analysis is capable of predicting displacements and strains to within 10% on the small-scale structure. The closed-form code consistently predicts the wing box bending to within 25% of the measured value. This error is expected due to simplifying assumptions in the closed-form analysis. Differences between the closed-form code representation and the wing box specimen caused large errors in the twist prediction. The closed-form analysis prediction of twist has not been validated from this test.

  16. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Directory of Open Access Journals (Sweden)

    Amin Torabipour

    2014-11-01

    Full Text Available This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity, and the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technologic, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996 respectively. There was not a significant difference in mean productivity changes among teaching and non-teaching hospitals (P>0.05), except in 2009. The productivity rate of hospitals generally had an increasing trend. However, the total average productivity decreased in hospitals. Besides, among the several components of total productivity, the variation of technological efficiency had the highest impact on the reduction of the total average productivity.
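
    In the input-oriented CCR formulation, each hospital's efficiency score is the optimum of a small linear program. A minimal sketch with invented inputs and outputs (not the Ahvaz data), assuming scipy:

```python
# Sketch of one input-oriented CCR efficiency score via linear
# programming (envelopment form): minimize theta such that a lambda-
# combination of peers uses no more than theta * inputs of the evaluated
# hospital while producing at least its outputs. Data are invented.
import numpy as np
from scipy.optimize import linprog

X = np.array([[50., 30., 40.],   # inputs (e.g. beds, staff); cols = hospitals
              [20., 10., 15.]])
Y = np.array([[900., 600., 700.]])   # outputs (e.g. treated patients)

def ccr_efficiency(j):
    n = X.shape[1]
    # Variables: [theta, lambda_1..lambda_n]; objective: minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    A_ub = np.block([[-X[:, [j]], X],                       # X@lam <= theta*x_j
                     [np.zeros((Y.shape[0], 1)), -Y]])      # Y@lam >= y_j
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, j]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

print([round(ccr_efficiency(j), 3) for j in range(3)])
```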

  17. Use of statistical techniques in analysis of biological data

    Directory of Open Access Journals (Sweden)

    Farzana Perveen

    2012-07-01

    Full Text Available From ancient times to the modern day, not a single area can be found where statistics is not playing a vital role. Statistics has now been recognized and universally accepted as an essential component of research in every branch of science. From agriculture, biology, education, economics, business, management, medicine, engineering, psychology, environment and space, statistics plays a significant role. Statistics is being extensively used in the biological sciences. Specifically, biostatistics is the branch of applied statistics that concerns the application of statistical methods to medical, genetic and biological problems. In this connection, one important step is the appropriate and careful analysis of statistical data to get precise results. It is pertinent to mention that the majority of statistical tests and techniques are applied under certain mathematical assumptions. Therefore, it is necessary to realize the importance of the relevant assumptions. Among others, the assumptions of normality (normal distribution of the population(s)) and variance homogeneity are the most important. If these assumptions are not satisfied, the results may be potentially misleading. It is, therefore, suggested to check the relevant assumption(s) about the data before applying statistical test(s) to get valid results. In this study, a few techniques/tests are described for checking the normality of a given set of data. Since Analysis of Variance (ANOVA) models are extensively used in biological research, the assumptions underlying ANOVA have also been discussed. Non-parametric statistics is also described to some extent.
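
    As a concrete illustration of the recommended checks, the sketch below runs a Shapiro-Wilk normality test and a Levene variance-homogeneity test on two invented samples, assuming scipy:

```python
# Sketch of the assumption checks discussed above, using scipy's
# Shapiro-Wilk normality test and Levene's variance-homogeneity test
# on two invented biological samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
group_a = rng.normal(10.0, 2.0, size=30)    # e.g. control weights
group_b = rng.normal(12.0, 2.5, size=30)    # e.g. treated weights

for name, g in [("A", group_a), ("B", group_b)]:
    w, p = stats.shapiro(g)
    print(f"group {name}: Shapiro-Wilk p = {p:.3f}")

stat, p = stats.levene(group_a, group_b)
print(f"Levene variance-homogeneity p = {p:.3f}")
```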

  18. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    Science.gov (United States)

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded intensity speckled distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their core's structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture from the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography in an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.
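
    The pseudophase map can be sketched by applying a spiral-phase (Riesz-type) filter in the Fourier domain; the following is a schematic reading of that step, with a synthetic speckle pattern rather than recorded data:

```python
# Sketch of a pseudophase map from a speckle intensity pattern via a
# spiral-phase (Riesz-type) filter applied in the Fourier domain.
# The speckle field is synthetic, generated for illustration only.
import numpy as np

rng = np.random.default_rng(5)
n = 256
fx, fy = np.meshgrid(np.fft.fftfreq(n), np.fft.fftfreq(n))
fr = np.hypot(fx, fy)

# Synthetic speckle: low-pass filtered noise, intensity recorded.
field = np.fft.ifft2(np.fft.fft2(rng.normal(size=(n, n))) * (fr < 0.05))
I = np.abs(field) ** 2

# Spiral-phase filter; the zero-frequency term is set to zero.
spiral = (fx + 1j * fy) / np.where(fr == 0, 1.0, fr)
analytic = np.fft.ifft2(np.fft.fft2(I - I.mean()) * spiral)
pseudophase = np.angle(analytic)   # vortices appear as phase singularities
```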

  19. Homogenization techniques for the analysis of porous SMA

    Science.gov (United States)

    Sepe, V.; Auricchio, F.; Marfia, S.; Sacco, E.

    2016-05-01

    In this paper the mechanical response of porous Shape Memory Alloy (SMA) is modeled. The porous SMA is considered as a composite medium made of a dense SMA matrix with voids treated as inclusions. The overall response of this very special composite is deduced by performing a micromechanical and homogenization analysis. In particular, the incremental Mori-Tanaka averaging scheme is provided; then, the Transformation Field Analysis procedure in its uniform and nonuniform approaches, UTFA and NUTFA respectively, is presented. In particular, the extension of the NUTFA technique proposed by Sepe et al. (Int J Solids Struct 50:725-742, 2013) is presented to investigate the response of porous SMA characterized by closed and open porosity. A detailed comparison between the outcomes provided by the Mori-Tanaka, the UTFA and the proposed NUTFA procedures for porous SMA is presented, through numerical examples for two- and three-dimensional problems. In particular, several values of porosity and different loading conditions, inducing pseudoelastic effect in the SMA matrix, are investigated. The predictions assessed by the Mori-Tanaka, the UTFA and the NUTFA techniques are compared with the results obtained by nonlinear finite element analyses. A comparison with experimental data available in literature is also presented.

  20. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low-frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variation is not satisfactory. So, the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that the proposed approach of applying PCA and LDA to the normalized dataset improves the performance significantly for face images with large illumination variations.
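
    The first technique reduces to zeroing low-frequency DCT coefficients of the log image while preserving the DC term. A minimal sketch on a stand-in image, assuming scipy:

```python
# Sketch of the first technique: discard low-frequency DCT coefficients
# of the log image to suppress illumination variation. A random array
# stands in for a Yale B face image.
import numpy as np
from scipy.fft import dctn, idctn

img = np.random.default_rng(2).random((64, 64)) + 0.1   # stand-in image
log_img = np.log(img)                                   # logarithm domain

coeffs = dctn(log_img, norm="ortho")
k = 4                                    # number of low frequencies to drop
mask = np.add.outer(np.arange(64), np.arange(64)) < k
dc = coeffs[0, 0]                        # keep the overall mean gray level
coeffs[mask] = 0.0
coeffs[0, 0] = dc

normalized = idctn(coeffs, norm="ortho")  # illumination-normalized image
```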

  1. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40% lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross-correlation function (CCF) the RMSE is then lower by 60%. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross-correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and is suitable for large-scale application to paleo-data.
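
    A kernel-based correlation estimator of the kind compared here weights every observation pair by how close its time difference is to the requested lag. A minimal sketch (the kernel width h is a free parameter, here set arbitrarily):

```python
# Sketch of a Gaussian-kernel correlation estimate for irregularly
# sampled series: every observation pair contributes to the correlation
# at lag `lag`, weighted by how close its time difference is to the lag.
import numpy as np

def gaussian_kernel_corr(tx, x, ty, y, lag, h):
    """Kernel-weighted cross-correlation of (tx, x) and (ty, y) at `lag`;
    h sets the kernel width (often tied to the mean sampling interval)."""
    xs = (x - x.mean()) / x.std()
    ys = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None]          # all pairwise time differences
    w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
    return np.sum(w * xs[:, None] * ys[None, :]) / np.sum(w)

rng = np.random.default_rng(9)
t = np.sort(rng.uniform(0, 100, 200))       # irregular sampling times
x = np.sin(0.5 * t) + 0.3 * rng.normal(size=t.size)
print(gaussian_kernel_corr(t, x, t, x, lag=0.0, h=0.5))  # lag-0 ACF ~ 1
```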

  2. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994-1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  3. Human Capital Investment and an Analysis of Its Progressive Profit

    Institute of Scientific and Technical Information of China (English)

    张德平; 孙诚

    2004-01-01

    A skilled labor force, cultivated by investing funds and time in education, is undoubtedly essential to the operation of sophisticated machines in production, but also to the creation of new ideas and methods in production and other economic activities, and ultimately to the promotion of the progressive increase of material capital. Thus strengthening investment in human capital and enriching the stock of human capital is of primary importance, especially for China, in the 21st century.

  4. High-level power analysis and optimization techniques

    Science.gov (United States)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  5. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C. [and others]

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  6. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program "Nanoannotator" particularly developed for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program when analyzing nanoparticles, and at the same time, we compare it to more conventional nanoparticle analysis techniques. The techniques on which we concentrate here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program appeared to be a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from the inability to separate individual particles from agglomerates in the TEM images. The program is more efficient, and it offers more detailed morphological information about the particles than the manual technique. However, particle shapes that deviate strongly from spherical also proved problematic for the novel program. When compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the average data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample.
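
    As a rough illustration of what such a program automates, the sketch below runs a classical threshold-and-label sizing pipeline on a synthetic stand-in for a TEM micrograph. scikit-image is an assumed dependency, and the disk image, noise level, and pixel calibration are all invented for the example.

```python
# Threshold-and-label particle sizing on a synthetic stand-in for a TEM
# micrograph (dark disks on a bright background). scikit-image is an assumed
# dependency; the noise level and pixel calibration are invented.
import numpy as np
from skimage import filters, measure, morphology

rng = np.random.default_rng(0)
img = np.full((256, 256), 0.9)
yy, xx = np.mgrid[:256, :256]
for cx, cy, r in rng.uniform([20, 20, 4], [236, 236, 12], size=(15, 3)):
    img[(xx - cx) ** 2 + (yy - cy) ** 2 < r ** 2] = 0.2   # one "particle"
img += rng.normal(0, 0.02, img.shape)

NM_PER_PIXEL = 0.5                          # assumed calibration (nm/pixel)
mask = img < filters.threshold_otsu(img)    # particles darker than background
mask = morphology.remove_small_objects(mask, min_size=20)
for region in measure.regionprops(measure.label(mask)):
    d_nm = region.equivalent_diameter * NM_PER_PIXEL
    print(f"particle {region.label}: d = {d_nm:.1f} nm")
```

    Note how touching disks merge into a single labelled region: this is exactly the agglomerate-separation weakness of traditional image analysis programs that the paper's program addresses.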

  8. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments

  9. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
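
    For readers unfamiliar with the dynamic stability measures mentioned above, the following sketch estimates a finite-time maximal Lyapunov exponent in the spirit of Rosenstein's method, as often applied to gait or posture time series. The embedding parameters, Theiler window, and toy signal are illustrative assumptions, not values from the paper.

```python
# Finite-time maximal Lyapunov exponent estimate (Rosenstein-style sketch).
import numpy as np
from scipy.spatial.distance import cdist

def max_lyapunov(x, fs, dim=5, tau=10, n_steps=50):
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])  # delay embedding
    dists = cdist(emb, emb)
    lags = np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])
    dists[lags < 2 * tau] = np.inf          # Theiler window: skip temporal neighbours
    nn = np.argmin(dists, axis=1)           # nearest neighbour of each embedded point
    idx = np.arange(n)
    div = []                                # mean log divergence vs. time
    for k in range(1, n_steps):
        ok = (idx + k < n) & (nn + k < n)
        d = np.linalg.norm(emb[idx[ok] + k] - emb[nn[ok] + k], axis=1)
        div.append(np.mean(np.log(d[d > 0])))
    t = np.arange(1, n_steps) / fs
    return np.polyfit(t, div, 1)[0]         # slope = lambda_max (1/s)

fs = 50.0
t = np.arange(0, 30, 1 / fs)
x = np.sin(2 * np.pi * 0.5 * t) + 0.05 * np.random.default_rng(0).normal(size=t.size)
print(f"lambda_max ~ {max_lyapunov(x, fs):.3f} 1/s")
```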

  10. Nuclear fuel lattice performance analysis by data mining techniques

    International Nuclear Information System (INIS)

    Highlights: • This paper shows a data mining application to analyse nuclear fuel lattice designs. • Data mining methods were used to predict whether fuel lattices could operate adequately in the BWR reactor core. • Data mining methods learned from fuel lattice datasets simulated with SIMULATE-3. • Results show high recognition percentages of adequate or inadequate fuel lattice performance. - Abstract: In this paper a data mining analysis of BWR nuclear fuel lattice performance is shown. In a typical three-dimensional simulation of reactor operation, the simulator gives the core performance for a fuel lattice configuration, measured by thermal limits, shutdown margin and produced energy. Based on these results we can determine the number of fulfilled parameters of a fuel lattice configuration. It is interesting to establish a relationship between the fuel lattice properties and the number of fulfilled core parameters in steady-state reactor operation, so data mining techniques were used for this purpose. Results indicate that these techniques are able to predict with sufficient accuracy (greater than 75%) whether a given fuel lattice configuration will have either “good” or “bad” performance according to the reactor core simulation. In this way, they could be coupled with an optimization process to discard fuel lattice configurations with poor performance and thereby accelerate the optimization process. Data mining techniques apply filter methods to discard those variables with lower influence on the number of fulfilled core parameters. From this analysis, it was also possible to identify a set of variables to be used in new optimization codes with objective functions different from those normally used.
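
    The classification step can be illustrated with any off-the-shelf learner. The sketch below uses a random forest on synthetic placeholder features standing in for the SIMULATE-3 datasets; the paper does not prescribe this particular classifier, and the placeholder labelling rule is invented.

```python
# Illustrative sketch of the data mining step: train a classifier to predict
# whether a fuel lattice configuration performs adequately. Features and
# labels are synthetic placeholders for the SIMULATE-3 datasets.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 6))      # e.g. enrichments, gadolinia content, ...
# Placeholder rule standing in for "number of fulfilled core parameters"
y = (X[:, 0] + 0.5 * X[:, 1] - X[:, 2] > 0.4).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"recognition rate: {clf.score(X_te, y_te):.2%}")
# Feature importances play the role of the paper's filter step, flagging
# lattice variables with little influence on core performance.
print(clf.feature_importances_)
```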

  11. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Directory of Open Access Journals (Sweden)

    Mahmoud I. Al-Kadi

    2013-05-01

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject; it has been proposed as a way to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.

  12. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    Science.gov (United States)

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject; it has been proposed as a way to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.
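
    As a concrete example of the processing steps surveyed (noise removal, then feature extraction), the sketch below bandpass-filters a toy EEG trace and computes conventional spectral band powers. All parameter choices are generic illustrative values, not taken from the review.

```python
# Bandpass filtering and band-power features for a toy EEG trace.
import numpy as np
from scipy import signal

fs = 256.0                                          # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

# 0.5-45 Hz Butterworth bandpass removes drift and high-frequency noise
sos = signal.butter(4, [0.5, 45], btype="bandpass", fs=fs, output="sos")
clean = signal.sosfiltfilt(sos, eeg)

# Band powers from Welch's periodogram
f, pxx = signal.welch(clean, fs=fs, nperseg=1024)
df = f[1] - f[0]
for name, (lo, hi) in {"delta": (0.5, 4), "theta": (4, 8),
                       "alpha": (8, 13), "beta": (13, 30)}.items():
    band = (f >= lo) & (f < hi)
    print(f"{name}: {pxx[band].sum() * df:.4f}")
```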

  13. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture, even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general, above that size, uncertainties in rate constant and thermodynamic data do not allow a priori prediction of products from mixed hydrocarbon pyrolyses using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  14. Post Buckling Progressive Failure Analysis of Composite Laminated Stiffened Panels

    Science.gov (United States)

    Anyfantis, Konstantinos N.; Tsouvalis, Nicholas G.

    2012-06-01

    The present work deals with the numerical prediction of the post buckling progressive and final failure response of stiffened composite panels, based on nonlinear structural finite element methods. For this purpose, a progressive failure model (PFM) is developed and applied to predict the behaviour of an experimentally tested blade-stiffened panel found in the literature. Failure initiation and propagation are calculated, owing to the accumulation of the intralaminar failure modes induced in fibre reinforced composite materials. Hashin failure criteria have been employed in order to address the fibre and matrix failure modes in compression and tension, while the Tsai-Wu failure criterion has been utilized for addressing shear failure. Failure detection is followed by the introduction of corresponding material degradation rules depending on the individual failure mechanisms. Failure initiation, failure propagation, and the post buckling ultimate attained load have been numerically evaluated. The final failure of the simulated stiffened panel takes the form of sudden global failure, and comparisons between numerical and experimental results show good agreement.
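
    The plane-stress form of the Hashin indices that such a PFM evaluates is compact enough to sketch. The strength values and ply stresses below are generic CFRP-like placeholders; the paper's shear treatment via Tsai-Wu and its specific degradation rules are not reproduced here.

```python
# Common plane-stress form of the Hashin failure indices used in progressive
# failure models. Strengths and the stress state are placeholders; a PFM
# applies material degradation rules after each detected failure.
def hashin_indices(s11, s22, t12, XT, XC, YT, YC, S12):
    if s11 >= 0:                                   # fibre tension
        fibre = (s11 / XT) ** 2 + (t12 / S12) ** 2
    else:                                          # fibre compression
        fibre = (s11 / XC) ** 2
    if s22 >= 0:                                   # matrix tension
        matrix = (s22 / YT) ** 2 + (t12 / S12) ** 2
    else:                                          # matrix compression
        matrix = ((s22 / (2 * S12)) ** 2
                  + ((YC / (2 * S12)) ** 2 - 1) * s22 / YC
                  + (t12 / S12) ** 2)
    return fibre, matrix                           # failure when an index >= 1

# Example ply stress state (MPa) against typical CFRP strengths (placeholders)
f, m = hashin_indices(s11=900, s22=30, t12=40,
                      XT=1500, XC=1200, YT=50, YC=200, S12=70)
print(f"fibre index = {f:.2f}, matrix index = {m:.2f}")
```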

  15. Progress on radiochemical analysis for nuclear waste management in decommissioning

    Energy Technology Data Exchange (ETDEWEB)

    Hou, X. (Technical Univ. of Denmark. Center for Nuclear Technologies (NuTech), Roskilde (Denmark))

    2012-01-15

    This report summarizes the progress in the development and improvement of radioanalytical methods for decommissioning and waste management completed in the NKS-B RadWaste 2011 project. Based on the overview of analytical methods in Nordic laboratories and the requirements from the nuclear industry provided in the first phase of the RadWaste project (2010), some methods were improved and developed. A method for efficient separation of Nb from nuclear waste, especially metals, for the measurement of long-lived 94Nb by gamma spectrometry was developed. By systematic investigation of the behaviour of technetium in the sample treatment and chromatographic separation processes, an effective method was developed for the determination of low-level 99Tc in waste samples. An AMS approach was investigated to measure ultra-low-level 237Np using 242Pu for AMS normalization; the preliminary results show the high potential of this method. Some progress on the characterization of waste for the decommissioning of the Danish DR3 reactor is also presented. (Author)

  16. An Archetypal Analysis on The Pilgrim’s Progress

    Institute of Scientific and Technical Information of China (English)

    杨洛琪

    2014-01-01

    John Bunyan (1628-1688) is one of the most remarkable figures in 17th century English literature. He is famous for his authorship of The Pilgrim's Progress and became one of the world's most widely-read Christian writers. This thesis attempts to use archetypal theories to analyze the archetypes of Christian culture in The Pilgrim's Progress. According to archetypal theory, Bunyan's use of biblical images and themes can be called archetypes; these archetypes involve the work's biblical imagery (water) and its themes. Therefore, this thesis tries to explore the underlying archetypal elements so as to reveal the work's literary treasures by resorting to the theory of archetypal criticism.

  17. Image analysis technique applied to lock-exchange gravity currents

    Science.gov (United States)

    Nogueira, Helena I. S.; Adduce, Claudia; Alves, Elsa; Franca, Mário J.

    2013-04-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in the image relating the amount of dye uniformly distributed in the tank and the greyscale values in the corresponding images. The results are evaluated and corrected by applying the mass conservation principle within the experimental tank. The procedure is a simple way to assess the time-varying density distribution within the gravity current, allowing the investigation of gravity current dynamics and mixing processes.
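
    The per-pixel calibration idea translates directly into code: fit, for every pixel, a curve relating grey level to the known dye concentrations of the calibration images, then invert it on experiment frames. Image sizes, the polynomial order, and the synthetic data below are assumptions for illustration.

```python
# Per-pixel greyscale-to-concentration calibration sketch.
import numpy as np

known_conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])   # g/l of dye in the tank
# One calibration image (H x W) per known concentration (synthetic here)
rng = np.random.default_rng(1)
cal = 200 - 60 * known_conc[:, None, None] + rng.normal(0, 2, (5, 120, 160))

# Quadratic fit per pixel: conc = a*g^2 + b*g + c
H, W = cal.shape[1:]
coeffs = np.empty((3, H, W))
for i in range(H):
    for j in range(W):
        coeffs[:, i, j] = np.polyfit(cal[:, i, j], known_conc, 2)

frame = 200 - 60 * 1.2 + rng.normal(0, 2, (H, W))   # frame at ~1.2 g/l
conc = coeffs[0] * frame**2 + coeffs[1] * frame + coeffs[2]
print(f"mean recovered concentration: {conc.mean():.2f} g/l")
```

    The mass-conservation correction mentioned in the abstract would then rescale the recovered field so that the integrated dye mass matches the amount released in the lock.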

  18. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  19. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  20. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analyses, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been detected and suspected, respectively, in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
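
    The sparse-recovery ingredient of such a method can be sketched with an ordinary Lasso fit over a dictionary of sinusoids, so that all periodicities are searched at once. The Gaussian-process modification described in the paper is not reproduced, and the data below are synthetic.

```python
# Toy compressed-sensing search for periodicities in irregular RV data.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 200, 80))                # irregular epochs (days)
rv = (3.0 * np.sin(2 * np.pi * t / 12.3)
      + 1.5 * np.sin(2 * np.pi * t / 41.0)
      + 0.3 * rng.normal(size=t.size))              # RV signal + noise (m/s)

freqs = np.linspace(1 / 200, 1 / 2, 2000)           # frequency grid (1/day)
A = np.hstack([np.cos(2 * np.pi * freqs * t[:, None]),
               np.sin(2 * np.pi * freqs * t[:, None])])

fit = Lasso(alpha=0.05, max_iter=50_000).fit(A, rv)
power = np.hypot(fit.coef_[:2000], fit.coef_[2000:])
for k in np.argsort(power)[-2:][::-1]:              # two strongest periods
    print(f"period ~ {1 / freqs[k]:.1f} d, amplitude ~ {power[k]:.2f} m/s")
```

    Unlike a Lomb-Scargle scan, the L1 penalty drives most dictionary amplitudes to exactly zero, which is why the resulting "periodogram" has far fewer spurious peaks.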

  1. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with far fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analyses, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been detected and suspected, respectively, in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  2. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases the security risk. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. Then we identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction for making the Android operating system more secure.

  3. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, Clara [University of Washington]; Rielage, Keith Robert [Los Alamos National Laboratory]; Elliott, Steven Ray [Los Alamos National Laboratory]; Xu, Wenqin [Los Alamos National Laboratory]; Goett, John Jerome III [Los Alamos National Laboratory]

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  4. Expert rowers’ motion analysis for synthesis and technique digitalization

    Directory of Open Access Journals (Sweden)

    Filippeschi Alessandro

    2011-12-01

    Four expert rowers’ gestures were gathered on the SPRINT rowing platform with the aid of an optical motion tracking system. The data were analyzed in order to get a digital representation of the features involved in rowing; moreover, they provide a dataset for developing digital models for rowing motion synthesis. Rowers were modeled as kinematic chains, and the data were processed in order to get the position and orientation of the upper body limbs. This representation was combined with SPRINT data in order to evaluate features found in the literature, to find new ones, and to build models for the generation of rowing motion. The analysis shows the effectiveness of the motion reconstruction and two examples of technique features: stroke timing and upper limb orientation during the finish phase.

  5. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication

  6. A review of signal processing techniques for heart sound analysis in clinical diagnosis.

    Science.gov (United States)

    Emmanuel, Babatunde S

    2012-08-01

    This paper presents an overview of approaches to the analysis of heart sound signals. It reviews the milestones in the development of phonocardiogram (PCG) signal analysis, describes the various stages involved in the analysis of heart sounds, and discusses the discrete wavelet transform as a preferred method for bio-signal processing. In addition, the gaps that still exist between contemporary methods of signal analysis of heart sounds and their applications in clinical diagnosis are reviewed. Much progress has been made, but crucial gaps remain. The findings of this review are as follows: there is a lack of consensus in research outputs; inter-patient adaptability of signal processing algorithms is still problematic; and the process of clinical validation of analysis techniques was not sufficiently rigorous in most of the reviewed literature, so data integrity and measurement are still in doubt, which often led to inaccurate interpretation of results. In addition, the existing diagnostic systems are too complex and expensive. The paper concludes that the ability to correctly acquire, analyse and interpret heart sound signals for improved clinical diagnostic processes has become a priority.
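
    Since the review singles out the discrete wavelet transform for PCG processing, the sketch below decomposes and denoises a toy phonocardiogram with the PyWavelets package (an assumed dependency). Wavelet choice, decomposition level, and threshold are conventional illustrative values.

```python
# Discrete wavelet transform denoising of a toy phonocardiogram.
import numpy as np
import pywt

fs = 2000                                       # assumed sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
pcg = np.sin(2 * np.pi * 60 * t) * (np.sin(2 * np.pi * 1.2 * t) > 0.9)
pcg = pcg + 0.1 * np.random.default_rng(0).normal(size=t.size)

coeffs = pywt.wavedec(pcg, "db4", level=5)      # approximation + 5 detail bands
# Soft-threshold the detail coefficients to suppress broadband noise
den = [coeffs[0]] + [pywt.threshold(c, value=0.1, mode="soft")
                     for c in coeffs[1:]]
clean = pywt.waverec(den, "db4")
print(f"{len(coeffs) - 1} detail levels, output length {clean.size}")
```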

  7. Meta-analysis in Stata: history, progress and prospects

    OpenAIRE

    Jonathan Sterne

    2004-01-01

    Systematic reviews of randomised trials are now widely recognised to be the best way to summarise the evidence on the effects of medical interventions. A systematic review may (though it need not) contain a meta-analysis, 'a statistical analysis which combines the results of several independent studies considered by the analyst to be "combinable"'. The first researcher to do a meta-analysis was probably Karl Pearson, in 1904. Sadly, Stata was not available at this time. The first Stata comma...

  8. Progress of Blueberry Deep Processing Technique Research

    Institute of Scientific and Technical Information of China (English)

    叶春苗

    2015-01-01

    Blueberry has high nutritional and economic value, and deep processing is an effective way to solve the problem of blueberry's poor storability. This article reviews progress in blueberry deep processing technique research in the areas of blueberry dairy product processing, blueberry fruit juice and wine processing, blueberry jam processing and blueberry preserved fruit processing, in order to provide a reference for the development of the blueberry industry.

  9. Progress in identifying a human ionizing-radiation repair gene using DNA-mediated gene transfer techniques

    International Nuclear Information System (INIS)

    The authors employed DNA-mediated gene transfer techniques to introduce human DNA into a DNA double-strand break (DSB) repair-deficient Chinese hamster ovary (CHO) cell mutant (xrs-6), which is hypersensitive to both X-rays (D0 = 0.39 Gy) and the antibiotic bleomycin (D0 = 0.01 μg/ml). High molecular weight DNA isolated from cultured human skin fibroblasts was partially digested with the restriction enzyme Sau 3A to average sizes of 20 or 40 kb, ligated with plasmid pSV2-gpt DNA, and transfected into xrs-6 cells. Colonies which developed under a bleomycin and MAX (mycophenolic acid/adenine/xanthine) double-selection procedure were isolated and further tested for X-ray sensitivity and DSB rejoining capacity. To date, a total of six X-ray- or bleomycin-resistant transformants have been isolated. All express rejoining capacity for X-ray-induced DSBs, similar to the rate observed for DSB repair in CHO wild-type cells. DNA isolated from these primary transformants contains various copy numbers of pSV2-gpt DNA and also contains human DNA sequences, as determined by Southern blot hybridization. Recently, a secondary transformant has been isolated using DNA from one of the primary transformants. Cellular and molecular characterization of this transformant is in progress. DNA from a genuine secondary transformant will be used in the construction of a DNA library to isolate human genomic DNA encoding this radiation repair gene.

  10. Efficient geometric rectification techniques for spectral analysis algorithm

    Science.gov (United States)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented along iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.
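
    The two-pass structure of the rectification can be sketched with generic one-dimensional interpolation: resample each row onto a uniform cross-track grid, then each column onto a uniform along-track grid. The coordinate mappings below are simple stand-ins for the true iso-range/iso-Doppler geometry, which the paper derives from the radar parameters.

```python
# Two-pass 1-D resampling sketch for rectifying a curved-grid image.
import numpy as np
from scipy.interpolate import interp1d

rng = np.random.default_rng(3)
img = rng.random((64, 64))                          # toy range-Doppler image

rows, cols = np.arange(64), np.arange(64)
out = np.empty_like(img)

# Pass 1: per-row resampling (cross-track); curvature grows with row index
for i in rows:
    src = cols + 0.05 * i * np.sin(np.pi * cols / 63)   # stand-in mapping
    out[i] = interp1d(src, img[i], bounds_error=False,
                      fill_value=0.0)(cols)

# Pass 2: per-column resampling (along-track)
for j in cols:
    src = rows + 2.0 * (j / 63 - 0.5)                   # stand-in skew
    out[:, j] = interp1d(src, out[:, j], bounds_error=False,
                         fill_value=0.0)(rows)
print(out.shape)
```

    Splitting the 2-D warp into two 1-D passes is what keeps the throughput near real time: each pass is a cache-friendly sweep instead of a scattered 2-D lookup.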

  11. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  12. Sensitivity-analysis techniques: self-teaching curriculum

    Energy Technology Data Exchange (ETDEWEB)

    Iman, R.L.; Conover, W.J.

    1982-06-01

    This self-teaching curriculum on sensitivity analysis techniques consists of three parts: (1) Use of the Latin Hypercube Sampling Program (Iman, Davenport and Ziegler, Latin Hypercube Sampling (Program User's Guide), SAND79-1473, January 1980); (2) Use of the Stepwise Regression Program (Iman, et al., Stepwise Regression with PRESS and Rank Regression (Program User's Guide), SAND79-1472, January 1980); and (3) Application of the procedures to sensitivity and uncertainty analyses of the groundwater transport model MWFT/DVM (Campbell, Iman and Reeves, Risk Methodology for Geologic Disposal of Radioactive Waste - Transport Model Sensitivity Analysis, SAND80-0644, NUREG/CR-1377, June 1980; Campbell, Longsine, and Reeves, The Distributed Velocity Method of Solving the Convective-Dispersion Equation, SAND80-0717, NUREG/CR-1376, July 1980). This curriculum is one in a series developed by Sandia National Laboratories for transfer of the capability to use the technology developed under the NRC-funded High Level Waste Methodology Development Program.
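
    Part (1) of the curriculum concerns Latin hypercube sampling, which can be sketched today with SciPy's qmc module in place of the original SAND programs; the parameter ranges below are illustrative only.

```python
# Minimal Latin Hypercube Sampling sketch (stand-in for the SAND79-1473 program).
import numpy as np
from scipy.stats import qmc

sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(n=100)                 # 100 samples in the unit cube

# Scale to physical parameter ranges, e.g. three uncertain model inputs
lo, hi = [0.1, 1.0, 50.0], [0.5, 10.0, 200.0]
x = qmc.scale(u, lo, hi)

# Each column is stratified: exactly one sample per equal-probability bin
print(np.histogram(u[:, 0], bins=100, range=(0, 1))[0].max())  # -> 1
```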

  13. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    Science.gov (United States)

    Lungaroni, M.; Murari, A.; Peluso, E.; Gelfusa, M.; Malizia, A.; Vega, J.; Talebzadeh, S.; Gaudio, P.

    2016-04-01

    In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Basic data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression (SVR) is a method devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.
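
    A minimal sketch of SVR used as a non-parametric filter, of the kind compared here against consolidated smoothers; the kernel and hyperparameters are illustrative choices, not the tuned values from the study.

```python
# SVR as a non-parametric filter on a noisy signal (illustrative parameters).
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(4)
x = np.linspace(0, 10, 400)[:, None]
y = np.sin(x).ravel() + 0.2 * rng.normal(size=400)   # noisy measurement

svr = SVR(kernel="rbf", C=10.0, epsilon=0.1, gamma=0.5).fit(x, y)
y_hat = svr.predict(x)
print(f"RMS residual after SVR filtering: {np.std(y_hat - np.sin(x).ravel()):.3f}")
```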

  14. Mass spectrometry based imaging techniques for spatially resolved analysis of molecules

    Directory of Open Access Journals (Sweden)

    Andrea Matros

    2013-04-01

    Higher plants are composed of a multitude of tissues with specific functions, reflected by distinct profiles for transcripts, proteins and metabolites. Comprehensive analysis of metabolites and proteins has advanced tremendously within recent years, and this progress has been driven by the rapid development of sophisticated mass spectrometric techniques. In most current omics studies, analysis is performed on whole organ or whole plant extracts, leading to the loss of spatial information. Mass spectrometry based imaging (MSI) techniques have opened a new avenue to obtain information on the spatial distribution of metabolites and of proteins. Pioneered in the field of medicine, the approaches are now applied to study the spatial profiles of molecules in plant systems. A range of different plant organs and tissues have been successfully analyzed by MSI, and patterns of various classes of metabolites from primary and secondary metabolism could be obtained. It can be envisaged that MSI approaches will substantially contribute to building spatially resolved biochemical networks.

  15. Development Progress of Segmented Gamma Scanning Analysis Equipment

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The measurement technology of segmented gamma scanning (SGS) has wide application in the fields of nuclear materials and nuclear waste because of its advantage of non-destructive analysis of non-uniform objects.

  16. 1985. Annual progress report

    International Nuclear Information System (INIS)

    This annual progress report of the CEA Protection and Nuclear Safety Institute outlines the progress made in each section of the Institute. Research activities of the different departments include: reactor safety analysis; fuel cycle facilities analysis and associated safety research programs (criticality, sites, transport ...); radioecology and environmental radioprotection techniques; data acquisition on radioactive waste storage sites; radiation effects on man and studies on radioprotection techniques; nuclear material security, including security of facilities, security of nuclear material transport, and monitoring of nuclear material management; nuclear facility decommissioning; and finally, public information.

  17. A dynamic mechanical analysis technique for porous media

    Science.gov (United States)

    Pattison, Adam J; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-01-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a non-linear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the approach by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 – 14 Hz with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case, and nearly the same at frequency with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
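
    The classical viscoelastic reduction used to validate the approach can be sketched directly: with sinusoidal stress and strain records, lock-in demodulation at the drive frequency gives the complex modulus. This is the textbook small-strain model with inertial forces neglected, i.e. the approximation the paper's FE approach improves on; the signals below are synthetic.

```python
# Storage and loss moduli from sinusoidal DMA records (viscoelastic model).
import numpy as np

fs, f = 1000.0, 5.0                         # sampling and test frequency (Hz)
t = np.arange(0, 2, 1 / fs)                 # integer number of drive cycles
eps0, sig0, delta = 0.01, 2.0e3, 0.15       # strain, stress (Pa), phase lag
strain = eps0 * np.sin(2 * np.pi * f * t)
stress = sig0 * np.sin(2 * np.pi * f * t + delta)

# Lock-in style demodulation at the drive frequency
ref_c = np.cos(2 * np.pi * f * t)
ref_s = np.sin(2 * np.pi * f * t)
def phasor(x):
    return 2 * np.mean(x * ref_s) + 2j * np.mean(x * ref_c)

E_complex = phasor(stress) / phasor(strain)   # E' + i E''
print(f"E' = {E_complex.real:.0f} Pa, E'' = {E_complex.imag:.0f} Pa, "
      f"tan(delta) = {E_complex.imag / E_complex.real:.3f}")
```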

  18. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    A technique that helps management to reduce costs and improve quality is ‘lean supply chain management’, which focuses on the elimination of all wastes at every stage of the supply chain and is derived from ‘agile production’. This research aims to assess and rank the suppliers in the auto industry, based upon the concept of ‘production leanness’. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We examined the literature on leanness and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA) technique.
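
    A minimal sketch of the ranking step: standardize the supplier-by-factor score matrix, project it onto the leading principal components, and rank suppliers by a variance-weighted composite. The score matrix is a synthetic placeholder for the questionnaire data (10 factors here versus the paper's 76).

```python
# PCA-based leanness ranking of suppliers (synthetic questionnaire scores).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
scores = rng.uniform(1, 5, size=(8, 10))     # 8 suppliers x 10 lean factors

z = StandardScaler().fit_transform(scores)
pca = PCA(n_components=2).fit(z)
# Composite leanness index: PC scores weighted by explained variance.
# Note the sign of a PC is arbitrary; in practice it is oriented so that
# higher scores correspond to leaner suppliers.
idx = pca.transform(z) @ pca.explained_variance_ratio_
for rank, s in enumerate(np.argsort(idx)[::-1], start=1):
    print(f"rank {rank}: supplier {s} (index {idx[s]:.2f})")
```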

  19. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    A simple approach is described for incorporating the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e. a change of distribution type for the input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used to convert cost inputs into the statistical parameters of assumed probabilistic distributions. The results indicated that Monte Carlo simulation by a Latin Hypercube Sampling technique, with subsequent sensitivity analyses, was useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. Varying the distribution types of the input parameters showed that the values calculated by the deterministic method fell around the 40th to 50th percentile of the output distribution function calculated by probabilistic simulation. Assuming a lognormal distribution of inputs, however, the deterministically calculated values fell around the 85th percentile of the output distribution function. The sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. It also showed that the discount rate contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful for applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty.
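
    The three-estimate approach maps naturally onto triangular input distributions. The sketch below propagates placeholder unit costs by Monte Carlo and reports where the deterministic (most-likely) total falls in the simulated distribution, mirroring the percentile comparison described above; the numbers are invented, not the study's fuel cycle data.

```python
# Monte Carlo propagation of three-point (low/mode/high) cost estimates.
import numpy as np

rng = np.random.default_rng(6)
N = 100_000
# (low, mode, high) estimates for unit costs, $/kgU-equivalent placeholders
components = {"U purchase": (40, 80, 200),
              "conversion": (5, 8, 15),
              "enrichment": (80, 110, 160),
              "fabrication": (200, 250, 350)}

total = np.zeros(N)
for lo, mode, hi in components.values():
    total += rng.triangular(lo, mode, hi, size=N)

deterministic = sum(m for _, m, _ in components.values())
pct = (total < deterministic).mean() * 100
print(f"deterministic = {deterministic}, simulated mean = {total.mean():.0f}")
print(f"deterministic value sits at the {pct:.0f}th percentile")
```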

  20. Research progress in nonlinear analysis of heart electric activities

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Nonlinear science research is a hot topic worldwide. It has deepened our understanding of determinism and randomness, simplicity and complexity, noise and order, and it will profoundly influence the progress of the natural sciences, including life science. Life is the most complex nonlinear system, and the heart is the core of the lifecycle system. Over the last 20-odd years, nonlinear research on heart electrical activity has made much headway. The commonly used parameters are based on chaos and fractal theory, such as the correlation dimension, Lyapunov exponent, Kolmogorov entropy and multifractal singularity spectrum. This paper summarizes the commonly used methods in the nonlinear study of heart electrical signals. Then, considering the shortcomings of the above traditional nonlinear parameters, we mainly introduce results on short-term heart rate variability (HRV) signals (500 R-R intervals) and HFECG signals (1-2 s). Finally, we point out that it is worthwhile to put emphasis on the study of sensitive nonlinearity parameters of short-term heart electrical signals, their dynamic character and their clinical effectiveness.

  1. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    International Nuclear Information System (INIS)

    This final report summarizes the accomplishments of a two-year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants." The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  2. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants." The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  4. Molten metal analysis by laser produced plasmas. Technical progress report

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong W.

    1994-02-01

    A new method of molten metal analysis, based on time- and space-resolved spectroscopy of a laser-produced plasma (LPP) plume of a molten metal surface, has been implemented in the form of a prototype LPP sensor-probe, allowing in-situ analysis in less than 1 minute. The research at Lehigh University has been structured in 3 phases: laboratory verification of concept, comparison of LPP method with conventional analysis of solid specimens and field trials of prototype sensor-probe in small-scale metal shops, and design/production/installation of two sensor-probes in metal production shops. Accomplishments in the first 2 phases are reported. 6 tabs, 3 figs.

  5. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique using a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two large earrings

  6. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique using a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  7. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorus- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag+, Ba2+, and Cd2+ in the concentration range from 10 ng/g to 1 μg/g; for Cu2+ and Pb2+ from 10 ng/g to 5 μg/g; and for Hg2+ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag+, Ba2+, Cd2+, Cu2+, and Pb2+ share common binding sites, with binding efficiencies varying in the sequence Pb2+ > Cu2+ > Ag+ > Cd2+ > Ba2+. The binding of Hg2+ involved a different binding site, with an increase in binding efficiency in the presence of Ag+. (orig.)

  8. Damage Detection and Analysis in CFRPs Using Acoustic Emission Technique

    Science.gov (United States)

    Whitlow, Travis Laron

    Real time monitoring of damage is an important aspect of life management of critical structures. Acoustic emission (AE) techniques allow for measurement and assessment of damage in real time. Acoustic emission parameters such as signal amplitude and duration were monitored during the loading sequences. Criteria that can indicate the onset of critical damage to the structure were developed. Tracking the damage as it happens gives a better analysis of the failure evolution that will allow for a more accurate determination of structural life. The main challenge is distinguishing between legitimate damage signals and "false positives" which are unrelated to damage growth. Such false positives can be related to electrical noise, friction, or mechanical vibrations. This research focuses on monitoring signals of damage growth in carbon fiber reinforced polymers (CFRPs) and separating the relevant signals from the false ones. In this dissertation, acoustic emission signals from CFRP specimens were experimentally recorded and analyzed. The objectives of this work are: (1) perform static and fatigue loading of CFRP composite specimens and measure the associated AE signals, (2) accurately determine the AE parameters (energy, frequency, duration, etc.) of signals generated during failure of such specimens, (3) use fiber optic sensors to monitor the strain distribution of the damage zone and relate these changes in strain measurements to AE data.
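
    To make the parameter extraction concrete, the classical AE hit features named above (amplitude, duration, counts, energy) can be computed from a digitized waveform as sketched below; the synthetic burst, sample rate, and detection threshold are illustrative assumptions, not values from the dissertation.

        import numpy as np

        # Synthetic AE burst: decaying 150 kHz sinusoid plus noise (assumed values).
        fs = 1_000_000                                  # sample rate, Hz
        t = np.arange(0, 0.002, 1 / fs)
        rng = np.random.default_rng(0)
        sig = np.exp(-t * 4000) * np.sin(2 * np.pi * 150_000 * t) \
              + 0.005 * rng.standard_normal(t.size)

        THR = 0.05                                      # detection threshold
        above = np.flatnonzero(np.abs(sig) > THR)
        first, last = above[0], above[-1]

        amplitude = np.abs(sig).max()                   # peak amplitude
        duration = (last - first) / fs                  # first to last crossing, s
        counts = int(np.sum((sig[:-1] < THR) & (sig[1:] >= THR)))  # ring-down counts
        energy = float(np.sum(sig[first:last + 1] ** 2) / fs)      # energy proxy

        print(f"amp={amplitude:.3f}, dur={duration * 1e6:.0f} us, "
              f"counts={counts}, energy={energy:.2e}")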

  9. Comparative Analysis of Different LIDAR System Calibration Techniques

    Science.gov (United States)

    Miller, M.; Habib, A.

    2016-06-01

    With light detection and ranging (LiDAR) now being a crucial tool for engineering products and on-the-fly spatial analysis, it is necessary for the user community to have standardized calibration methods. The three methods in this study were developed and proven by the Digital Photogrammetry Research Group (DPRG) for airborne LiDAR systems and are as follows: Simplified, Quasi-Rigorous, and Rigorous. In lieu of using expensive control surfaces for calibration, these methods compare overlapping LiDAR strips to estimate the systematic errors. These systematic errors are quantified by these methods and include the lever arm biases, boresight biases, range bias and scan angle scale bias. These three methods comprehensively represent all of the possible flight configurations and data availability, and this paper tests the limits of the method with the most assumptions, the simplified calibration, by using data that violates the assumptions its math model is based on, and compares the results to the quasi-rigorous and rigorous techniques. The overarching goal is to provide a LiDAR system calibration that does not require raw measurements and can be carried out with minimal control and flight lines to reduce costs. This testing is unique because the terrain used for calibration does not contain gable roofs; all other LiDAR system calibration testing and development has been done with terrain containing features with high geometric integrity, such as gable roofs.

  10. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin) sponsored by NRC and FA (Fragility Analysis) sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  11. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result over that if conventional source-sampling methods are used. However, this gain in reliability is substantially less than that observed in the model-problem results.
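
    The variance-reduction idea is easy to demonstrate in miniature: if each constituent of a loosely-coupled array is guaranteed a fixed share of the source particles every generation, no generation can miss a constituent entirely. The toy sketch below, with invented per-unit multiplication values, contrasts conventional and stratified source-sampling; it illustrates the sampling principle only and is not a neutronics model.

        import random

        UNITS = [0.95, 0.97, 1.02]   # hypothetical per-unit multiplication factors

        def conventional(n):
            # Conventional sampling: each source particle picks a unit at random,
            # so a unit can go unsampled in a small generation.
            return sum(random.choice(UNITS) for _ in range(n)) / n

        def stratified(n):
            # Stratified sampling: the n source particles are shared equally
            # among the units, so every unit is represented every generation.
            per = n // len(UNITS)
            picks = [u for u in UNITS for _ in range(per)]
            return sum(picks) / len(picks)

        random.seed(1)
        runs = [conventional(9) for _ in range(1000)]
        mean = sum(runs) / len(runs)
        var = sum((r - mean) ** 2 for r in runs) / len(runs)
        print(f"conventional: generation-to-generation variance = {var:.2e}")
        print(f"stratified:   always {stratified(9):.4f} in this toy case")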

  12. An evaluation of wind turbine blade cross section analysis techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Paquette, Joshua A.; Griffith, Daniel Todd; Laird, Daniel L.; Resor, Brian Ray

    2010-03-01

    The blades of a modern wind turbine are critical components central to capturing and transmitting most of the load experienced by the system. They are complex structural items composed of many layers of fiber and resin composite material and typically, one or more shear webs. Large turbine blades being developed today are beyond the point of effective trial-and-error design of the past and design for reliability is always extremely important. Section analysis tools are used to reduce the three-dimensional continuum blade structure to a simpler beam representation for use in system response calculations to support full system design and certification. One model simplification approach is to analyze the two-dimensional blade cross sections to determine the properties for the beam. Another technique is to determine beam properties using static deflections of a full three-dimensional finite element model of a blade. This paper provides insight into discrepancies observed in outputs from each approach. Simple two-dimensional geometries and three-dimensional blade models are analyzed in this investigation. Finally, a subset of computational and experimental section properties for a full turbine blade are compared.

  13. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin) sponsored by NRC and FA (Fragility Analysis) sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  14. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from the smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: (1) under normal printing conditions, and (2) on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency in the thickness direction as well. The correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency were a function of manufacturing and torque levels. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.

  15. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600-750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis, and a fluorometric discrimination technique for determining phytoplankton population was developed. For laboratory simulatively mixed samples, the samples mixed from 43 algal species (the algae of one division accounted for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the level of division were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for the samples mixed from 32 red tide algal species (the dominant species accounted for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the level of genus were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the level of division and genus, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algae fluorescence auto-analyzer for
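
    A minimal sketch of the underlying idea, reduced to two synthetic reference spectra: compress each emission spectrum into wavelet approximation coefficients and assign a mixed sample to the reference with the highest similarity. The spectra, wavelet choice, and taxon names below are stand-ins, not the paper's 43-species database.

        import numpy as np
        import pywt  # PyWavelets

        def wavelet_features(spectrum, wavelet="db4", level=3):
            # Keep the level-3 approximation coefficients as a compact signature.
            return pywt.wavedec(spectrum, wavelet, level=level)[0]

        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        x = np.linspace(600, 750, 128)                 # emission axis, nm
        ref = {
            "Bacillariophyta": np.exp(-((x - 680) / 12) ** 2),
            "Dinophyta":       np.exp(-((x - 660) / 15) ** 2),
        }
        library = {name: wavelet_features(s) for name, s in ref.items()}

        sample = 0.8 * ref["Bacillariophyta"] + 0.2 * ref["Dinophyta"]
        feat = wavelet_features(sample)
        best = max(library, key=lambda k: cosine(feat, library[k]))
        print("dominant division:", best)              # expected: Bacillariophyta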

  16. Comparative Analysis of Vehicle Make and Model Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Faiza Ayub Syed

    2014-03-01

    Full Text Available Vehicle Make and Model Recognition (VMMR) has emerged as a significant element of vision based systems because of its application in access control systems, traffic control and monitoring systems, security systems and surveillance systems, etc. So far a number of techniques have been developed for vehicle recognition. Each technique follows a different methodology and classification approach. The evaluation results highlight the recognition technique with the highest accuracy level. In this paper we have pointed out the working of various vehicle make and model recognition techniques and compare these techniques on the basis of methodology, principles, classification approach, classifier and level of recognition. After comparing these factors we concluded that Locally Normalized Harris Corner Strengths (LHNS) performs best as compared to other techniques. LHNS uses Bayes and K-NN classification approaches for vehicle classification. It extracts information from the frontal view of vehicles for vehicle make and model recognition.

  17. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    Energy Technology Data Exchange (ETDEWEB)

    Amato, G. [ISTI-CNR, Area della Ricerca, Via Moruzzi 1, 56124, Pisa (Italy); Cristoforetti, G.; Legnaioli, S.; Lorenzetti, G.; Palleschi, V. [IPCF-CNR, Area della Ricerca, Via Moruzzi 1, 56124, Pisa (Italy); Sorrentino, F., E-mail: sorrentino@fi.infn.i [Dipartimento di Fisica e astronomia, Universita di Firenze, Polo Scientifico, via Sansone 1, 50019 Sesto Fiorentino (Italy); Istituto di Cibernetica CNR, via Campi Flegrei 34, 80078 Pozzuoli (Italy); Marwan Technology, c/o Dipartimento di Fisica 'E. Fermi', Largo Pontecorvo 3, 56127 Pisa (Italy); Tognoni, E. [INO-CNR, Area della Ricerca, Via Moruzzi 1, 56124 Pisa (Italy)

    2010-08-15

    In this communication, we will illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval techniques. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We assume a database containing the peaks of all the elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.
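
    The weighting scheme lends itself to a compact sketch: bin each line list, weight a peak by its intensity divided by how crowded its wavelength bin is across the database, and rank elements by cosine similarity to the sample vector. The three-element line list and 0.5 nm bin below are invented for illustration; a real database would hold the full line inventory.

        import math
        from collections import defaultdict

        DB = {  # element -> [(wavelength_nm, relative_intensity), ...] (illustrative)
            "Fe": [(371.99, 0.8), (373.49, 0.6), (404.58, 0.4)],
            "Cu": [(324.75, 1.0), (327.40, 0.8)],
            "Zn": [(330.26, 0.5), (334.50, 0.6), (472.22, 0.4)],
        }
        BIN = 0.5  # nm; peaks falling in the same bin are treated as coincident

        def bin_of(wl):
            return int(wl / BIN)

        # Crowding counts how many database peaks share a wavelength bin, so
        # peaks in sparse neighborhoods get higher weight (the IDF analogue).
        crowding = defaultdict(int)
        for lines in DB.values():
            for wl, _ in lines:
                crowding[bin_of(wl)] += 1

        def vector(lines):
            v = defaultdict(float)
            for wl, inten in lines:
                b = bin_of(wl)
                v[b] += inten / crowding.get(b, 1)
            return v

        def cosine(u, v):
            dot = sum(u[k] * v.get(k, 0.0) for k in u)
            nu = math.sqrt(sum(x * x for x in u.values()))
            nv = math.sqrt(sum(x * x for x in v.values()))
            return dot / (nu * nv) if nu and nv else 0.0

        sample = [(324.70, 0.9), (327.45, 0.7), (334.55, 0.3)]  # measured peaks
        sv = vector(sample)
        print(sorted(DB, key=lambda el: cosine(vector(DB[el]), sv), reverse=True))
        # -> ['Cu', 'Zn', 'Fe']: Cu ranked most likely present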

  18. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    Science.gov (United States)

    Amato, G.; Cristoforetti, G.; Legnaioli, S.; Lorenzetti, G.; Palleschi, V.; Sorrentino, F.; Tognoni, E.

    2010-08-01

    In this communication, we will illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval techniques. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We assume a database containing the peaks of all the elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.

  19. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    International Nuclear Information System (INIS)

    In this communication, we will illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval techniques. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We assume a database containing the peaks of all the elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.

  20. Analysis of Jugular Foramen Exposure in the Fallopian Bridge Technique

    OpenAIRE

    Satar, Bulent; Yazar, Fatih; Aykut CEYHAN; Arslan, Hasan Huseyin; Aydin, Sedat

    2009-01-01

    Objective: To analyze the exposure of the jugular foramen afforded by the fallopian bridge technique. Method: The jugular foramen exposure was obtained using the jugular foramen approach combined with the fallopian bridge technique. We applied this technique using 10 temporal bone specimens at a tertiary referral center. The exposure was assessed by means of depth of the dissection field and two separate dissection spaces that were created anteriorly and posteriorly to the facial nerve. Anter...

  1. Biomechanical analysis of cross-country skiing techniques.

    Science.gov (United States)

    Smith, G A

    1992-09-01

    The development of new techniques for cross-country skiing based on skating movements has stimulated biomechanical research aimed at understanding the various movement patterns, the forces driving the motions, and the mechanical factors affecting performance. Research methods have evolved from two-dimensional kinematic descriptions of classic ski techniques to three-dimensional analyses involving measurement of the forces and energy relations of skating. While numerous skiing projects have been completed, most have focused on either the diagonal stride or the V1 skating technique on uphill terrain. Current understanding of skiing mechanics is not sufficiently complete to adequately assess and optimize an individual skier's technique.

  2. Conformational Analysis of Misfolded Protein Aggregation by FRET and Live-Cell Imaging Techniques

    Directory of Open Access Journals (Sweden)

    Akira Kitamura

    2015-03-01

    Full Text Available Cellular homeostasis is maintained by several types of protein machinery, including molecular chaperones and proteolysis systems. Dysregulation of the proteome disrupts homeostasis in cells, tissues, and the organism as a whole, and has been hypothesized to cause neurodegenerative disorders, including amyotrophic lateral sclerosis (ALS) and Huntington's disease (HD). A hallmark of neurodegenerative disorders is the formation of ubiquitin-positive inclusion bodies in neurons, suggesting that the aggregation process of misfolded proteins changes during disease progression. Hence, high-throughput determination of soluble oligomers during the aggregation process, as well as of the conformation of sequestered proteins in inclusion bodies, is essential for elucidating physiological regulation mechanisms and for drug discovery in this field. To elucidate the interaction, accumulation, and conformation of aggregation-prone proteins, in situ spectroscopic imaging techniques, such as Förster/fluorescence resonance energy transfer (FRET), fluorescence correlation spectroscopy (FCS), and bimolecular fluorescence complementation (BiFC) have been employed. Here, we summarize recent reports in which these techniques were applied to the analysis of aggregation-prone proteins (in particular their dimerization, interactions, and conformational changes), and describe several fluorescent indicators used for real-time observation of physiological states related to proteostasis.
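
    The reason FRET reports on conformation so sensitively is its sixth-power distance dependence, E = 1/(1 + (r/R0)^6). A short numeric check, with a Förster radius of 5 nm assumed as typical for common fluorescent-protein pairs (not a value from the review):

        # FRET efficiency versus donor-acceptor distance; R0 = 5 nm is an
        # assumed, typical Foerster radius.
        R0 = 5.0  # nm
        efficiency = lambda r: 1.0 / (1.0 + (r / R0) ** 6)
        for r in (3.0, 5.0, 7.0):
            print(f"r = {r:.0f} nm -> E = {efficiency(r):.2f}")
        # 0.96 at 3 nm, 0.50 at 5 nm, 0.12 at 7 nm: near-binary over ~4 nm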

  3. Evaluation of Progressive Failure Analysis and Modeling of Impact Damage in Composite Pressure Vessels

    Science.gov (United States)

    Sanchez, Christopher M.

    2011-01-01

    NASA White Sands Test Facility (WSTF) is leading an evaluation effort in advanced destructive and nondestructive testing of composite pressure vessels and structures. WSTF is using progressive finite element analysis methods for test design and for confirmation of composite pressure vessel performance. Using composite finite element analysis models and failure theories tested in the World-Wide Failure Exercise, WSTF is able to estimate the static strength of composite pressure vessels. Additionally, testing and evaluation of impact-damaged composites is in progress so that models can be developed to estimate damage tolerance and the degradation in static strength.

  4. Analysis of a proposed Compton backscatter imaging technique

    Science.gov (United States)

    Hall, James M.; Jacoby, Barry A.

    1994-03-01

    One-sided imaging techniques are currently being used in nondestructive evaluation of surfaces and shallow subsurface structures. In this work we present both analytical calculations and detailed Monte Carlo simulations aimed at assessing the capability of a proposed Compton backscattering imaging technique designed to detect and characterize voids located several centimeters below the surface of a solid.

  5. A Technique for the Analysis of Auto Exhaust.

    Science.gov (United States)

    Sothern, Ray D.; And Others

    Developed for presentation at the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971, this outline explains a technique for separating the complex mixture of hydrocarbons contained in automotive exhausts. A Golay column and subambient temperature programming technique are…

  6. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available Information provided by an organization's application systems is vital in order to reach a decision. For this reason, the quality of the data provided by a Data Warehouse (DW) is very important if an organization is to produce the best solutions and move forward. DWs are complex systems that have to deliver highly-aggregated, high-quality data from heterogeneous sources to decision makers, and they involve substantial integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques compare target values with current values obtained from the systems. A prototype supporting the Base Analysis Technique was developed in PHP. A sample schema from an Oracle database was then used to study the differences between applying and not applying the framework. The prototype was demonstrated to selected organizations to identify whether it helps to reduce DQ problems, and questionnaires were given to respondents. Results: The results show that users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: The framework should be implemented in a real-world setting to obtain more accurate results.

  7. Accident progression event tree analysis for postulated severe accidents at N Reactor

    International Nuclear Information System (INIS)

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied

  8. Analysis of the progressive failure of brittle matrix composites

    Science.gov (United States)

    Thomas, David J.

    1995-01-01

    This report investigates two of the most common modes of localized failures, namely, periodic fiber-bridged matrix cracks and transverse matrix cracks. A modification of Daniels' bundle theory is combined with Weibull's weakest link theory to model the statistical distribution of the periodic matrix cracking strength for an individual layer. Results of the model predictions are compared with experimental data from the open literature. Extensions to the model are made to account for possible imperfections within the layer (i.e., nonuniform fiber lengths, irregular crack spacing, and degraded in-situ fiber properties), and the results of these studies are presented. A generalized shear-lag analysis is derived which is capable of modeling the development of transverse matrix cracks in material systems having a general multilayer configuration and under states of full in-plane load. A method for computing the effective elastic properties for the damaged layer at the global level is detailed based upon the solution for the effects of the damage at the local level. This methodology is general in nature and is therefore also applicable to (0(sub m)/90(sub n))(sub s) systems. The characteristic stress-strain response for more general cases is shown to be qualitatively correct (experimental data is not available for a quantitative evaluation), and the damage evolution is recorded in terms of the matrix crack density as a function of the applied strain. Probabilistic effects are introduced to account for the statistical nature of the material strengths, thus allowing cumulative distribution curves for the probability of failure to be generated for each of the example laminates. Additionally, Oh and Finney's classic work on fracture location in brittle materials is extended and combined with the shear-lag analysis. The result is an analytical form for predicting the probability density function for the location of the next transverse crack occurrence within a crack bounded

  9. Glaucoma Progression Detection by Retinal Nerve Fiber Layer Measurement Using Scanning Laser Polarimetry: Event and Trend Analysis

    OpenAIRE

    Moon, Byung Gil; Sung, Kyung Rim; Cho, Jung Woo; Kang, Sung Yong; Yun, Sung-Cheol; Na, Jung Hwa; Lee, Youngrok; Kook, Michael S.

    2012-01-01

    Purpose To evaluate the use of scanning laser polarimetry (SLP, GDx VCC) to measure the retinal nerve fiber layer (RNFL) thickness in order to evaluate the progression of glaucoma. Methods Test-retest measurement variability was determined in 47 glaucomatous eyes. One eye each from 152 glaucomatous patients with at least 4 years of follow-up was enrolled. Visual field (VF) loss progression was determined by both event analysis (EA, Humphrey guided progression analysis) and trend analysis (TA,...

  10. Progression Analysis and Stage Discovery in Continuous Physiological Processes Using Image Computing

    Directory of Open Access Journals (Sweden)

    Ferrucci Luigi

    2010-01-01

    Full Text Available We propose an image computing-based method for quantitative analysis of continuous physiological processes that can be sensed by medical imaging and demonstrate its application to the analysis of morphological alterations of the bone structure, which correlate with the progression of osteoarthritis (OA. The purpose of the analysis is to quantitatively estimate OA progression in a fashion that can assist in understanding the pathophysiology of the disease. Ultimately, the texture analysis will be able to provide an alternative OA scoring method, which can potentially reflect the progression of the disease in a more direct fashion compared to the existing clinically utilized classification schemes based on radiology. This method can be useful not just for studying the nature of OA, but also for developing and testing the effect of drugs and treatments. While in this paper we demonstrate the application of the method to osteoarthritis, its generality makes it suitable for the analysis of other progressive clinical conditions that can be diagnosed and prognosed by using medical imaging.

  11. Progress of Vascular Cast Technique%人体血管铸型技术的研究进展

    Institute of Scientific and Technical Information of China (English)

    潘雪梅; 周军

    2012-01-01

    Objective: To explore the preparation of vascular casts and their application in anatomy and clinical medicine. Methods: 'Vascular', 'cast', 'anatomy', 'vein' and 'artery' were searched as key words in the CNKI and PubMed full-text database retrieval systems for January 1991 to December 2011. In total, 2,046 Chinese papers and 197 English papers were obtained. The related literature was read and the progress of the vascular cast technique summarized. Results: Vascular casting is an established method of anatomical preparation that has produced models showing the arterial and venous networks of diverse organs. It displays the three-dimensional morphology of vessels and provides a basis for artery-preserving surgery and intravascular interventional treatment. Conclusions: Vascular casting has proven to be an excellent method for the examination of vessels. After about 30 years of effort the technique has matured, and it has seen a revival recently with the development of anatomy and clinical medicine.

  12. Single cell and single molecule techniques for the analysis of the epigenome

    Science.gov (United States)

    Wallin, Christopher Benjamin

    Epigenetic regulation is a critical biological process for the health and development of a cell. Epigenetic regulation is facilitated by covalent modifications to the underlying DNA and chromatin proteins. A fundamental understanding of these epigenetic modifications and their associated interactions at the molecular scale is necessary to explain phenomena including cellular identity, stem cell plasticity, and neoplastic transformation. It is widely known that abnormal epigenetic profiles have been linked to many diseases, most notably cancer. While the field of epigenetics has progressed rapidly with conventional techniques, significant advances remain to be made with respect to combinatoric analysis of epigenetic marks and single cell epigenetics. Therefore, in this dissertation, I will discuss our development of devices and methodologies to address these pertinent issues. First, we designed a preparatory polydimethylsiloxane (PDMS) microdevice for the extraction, purification, and stretching of human chromosomal DNA and chromatin from small cell populations down to a single cell. The valveless device captures cells by size exclusion within the micropillars, entraps the DNA or chromatin in the micropillars after cell lysis, purifies away the cellular debris, and fluorescently labels the DNA and/or chromatin all within a single reaction chamber. With the device, we achieve nearly 100% extraction efficiency of the DNA. The device is also used for in-channel immunostaining of chromatin followed by downstream single molecule chromatin analysis in nanochannels (SCAN). Second, using multi-color, time-correlated single molecule measurements in nanochannels, simultaneous coincidence detection of 2 epigenetic marks is demonstrated. Coincidence detection of 3 epigenetic marks is also established using a pulsed interleaved excitation scheme. With these two promising results, genome-wide quantification of epigenetic marks was pursued. Unfortunately, quantitative SCAN never

  13. Multidimensional scaling technique for analysis of magnetic storms at Indian observatories

    Indian Academy of Sciences (India)

    M Sridharan; A M S Ramasamy

    2002-12-01

    Multidimensional scaling is a powerful technique for the analysis of data. The latitudinal dependence of geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
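
    For readers unfamiliar with the method, classical (Torgerson) multidimensional scaling recovers a low-dimensional configuration from a matrix of pairwise dissimilarities by double-centering and eigendecomposition. The sketch below uses an invented 4x4 distance matrix rather than the paper's storm data:

        import numpy as np

        # Invented dissimilarities among four hypothetical observatories.
        D = np.array([
            [0.0, 1.0, 2.2, 3.1],
            [1.0, 0.0, 1.4, 2.5],
            [2.2, 1.4, 0.0, 1.2],
            [3.1, 2.5, 1.2, 0.0],
        ])

        n = D.shape[0]
        J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
        B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
        w, V = np.linalg.eigh(B)                 # eigenvalues ascending
        idx = np.argsort(w)[::-1][:2]            # keep the two largest components
        coords = V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))
        print(coords)                            # 2-D configuration of the points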

  14. SRAM standby leakage decoupling analysis for different leakage reduction techniques

    Institute of Scientific and Technical Information of China (English)

    Dong Qing; Lin Yinyin

    2013-01-01

    SRAM standby leakage reduction plays a pivotal role in minimizing the power consumption of application processors. Generally, four kinds of techniques are often utilized for SRAM standby leakage reduction: Vdd lowering (VDDL), Vss rising (VSSR), BL floating (BLF) and reversing body bias (RBB). In this paper, we comprehensively analyze and compare the reduction effects of these techniques on different kinds of leakage. It is disclosed that the performance of these techniques depends on the leakage composition of the SRAM cell and temperature. This has been verified on a 65 nm SRAM test macro.

  15. Surveillance of the nuclear instrumentation by a noise analysis technique

    International Nuclear Information System (INIS)

    The nuclear sensors used in the protection channels of a nuclear reactor have to be tested periodically. A method has been developed to estimate the state of this kind of sensor. The method proposed applies to boron ionization chambers. The principle of this technique is based on the calculation of a specific parameter named a 'descriptor', using a simple signal processing technique. A modification of this parameter indicates a degradation of the static and dynamic performances of the sensor. Different applications of the technique in a nuclear power plant are given.

  16. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  17. Analysis Of Machine Learning Techniques By Using Blogger Data

    OpenAIRE

    Gowsalya.R,; S. Veni

    2014-01-01

    Blogs are a recent, fast-progressing medium which depends on information systems and technological advancement. In developing countries, where the mass media is not well developed and government schemes are framed in governmental terms, blogs provide a channel for sharing knowledge and ideas. This article has highlighted and performed simulations on obtained information, 100 instances of bloggers, using the Weka 3.6 tool, and by applying many machine...

  18. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what product customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we conduct a systematic approach to explore several of data mining techniques in business application. The experimental result reveals that all data mining techniques accomplish their goals perfectly, but each of the technique has its own characteristics and specification that demonstrate their accuracy, proficiency and preference.

  19. Analysis of neutron-reflectometry data by Monte Carlo technique

    CERN Document Server

    Singh, S

    2002-01-01

    Neutron-reflectometry data is collected in momentum space. The real-space information is extracted by fitting a model for the structure of a thin-film sample. We have attempted a Monte Carlo technique to extract the structure of the thin film. In this technique we change the structural parameters of the thin film by simulated annealing based on the Metropolis algorithm. (orig.)
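
    The annealing loop itself is simple to sketch. Below, a toy two-parameter "reflectivity" model is fitted to synthetic data by the Metropolis criterion with a geometric cooling schedule; the model, noise level, and schedule are illustrative assumptions, not the authors' actual fitting kernel.

        import math, random

        def model(q, thickness, rho):
            # Toy damped oscillation standing in for Kiessig fringes.
            return rho * math.exp(-q) * (1 + math.cos(q * thickness))

        random.seed(0)
        qs = [0.1 * i for i in range(1, 60)]
        data = [model(q, 25.0, 2.0) + random.gauss(0, 0.01) for q in qs]

        def chi2(p):
            return sum((model(q, *p) - d) ** 2 for q, d in zip(qs, data))

        params, cost, T = (10.0, 1.0), float("inf"), 1.0
        for _ in range(20000):
            trial = (params[0] + random.gauss(0, 0.5),
                     params[1] + random.gauss(0, 0.05))
            c = chi2(trial)
            # Metropolis: accept improvements always, uphill moves with a
            # probability that shrinks as the temperature cools.
            if c < cost or random.random() < math.exp((cost - c) / T):
                params, cost = trial, c
            T *= 0.9997                          # annealing schedule

        print(f"fitted thickness ~ {params[0]:.1f}, rho ~ {params[1]:.2f}")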

  20. Data Mining Techniques: A Source for Consumer Behavior Analysis

    OpenAIRE

    Abhijit Raorane; R.V. Kulkarni

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, the data mining method has disadvantages as well as advantages. Therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to know consumer behavior, his psychological condition at the time of purchase and how suitable data mining method apply...

  1. Application of ultrasonic pulse velocity technique and image analysis in monitoring of the sintering process

    Directory of Open Access Journals (Sweden)

    Terzić A.

    2011-01-01

    Full Text Available Concrete which undergoes a thermal treatment before and during its service life can be applied in plants operating at high temperature and as thermal insulation. Sintering occurs within a concrete structure under such conditions. The progression of the sintering process can be monitored through the change of porosity parameters determined with a nondestructive test method (ultrasonic pulse velocity) and a computer program for image analysis. The experiment was performed on samples of corundum and bauxite concrete composites. The apparent porosity of the samples thermally treated at 110, 800, 1000, 1300 and 1500ºC was first investigated with a standard laboratory procedure. Sintering parameters were calculated from the creep testing. Loss of strength and material degradation occurred in the concrete when it was subjected to increased temperature and a compressive load. Mechanical properties indicate and monitor changes within the microstructure. The level of surface deterioration after the thermal treatment was determined using the Image Pro Plus program. Mechanical strength was estimated using ultrasonic pulse velocity testing. Nondestructive ultrasonic measurement was used as a qualitative description of the porosity change in specimens resulting from the sintering process. The ultrasonic pulse velocity technique and image analysis proved to be reliable methods for monitoring microstructural change during the thermal treatment and service life of refractory concrete.
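
    The link between pulse velocity and stiffness is the standard elastic relation E_d = rho * v^2 * (1+nu)(1-2nu)/(1-nu). A worked example with assumed density, Poisson ratio, and velocities (not the paper's measurements):

        # Dynamic elastic modulus from ultrasonic pulse velocity; all inputs
        # below are assumed round numbers for illustration.
        rho = 2800.0          # density, kg/m^3
        nu = 0.20             # dynamic Poisson's ratio

        def dynamic_modulus(v):
            return rho * v ** 2 * (1 + nu) * (1 - 2 * nu) / (1 - nu)

        for label, v in [("treated at 110 C", 4200.0), ("treated at 1500 C", 4700.0)]:
            print(f"{label}: E_d = {dynamic_modulus(v) / 1e9:.1f} GPa")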

  2. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry is widely applied for identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of obtained peptide-spectrum matches (PSMs needs to be evaluated also by algorithms, as a manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. Conclusion Our approach not only enhances the computational performance, and
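
    The target-decoy bookkeeping that MUMAL builds on can be sketched in a few lines: score target and decoy PSMs, estimate the false-discovery rate at a cutoff as the decoy/target ratio, and take the loosest cutoff still inside the FDR budget. The score distributions below are synthetic stand-ins for search-engine output.

        import random

        random.seed(42)
        targets = [random.gauss(3.0, 1.0) for _ in range(900)]  # mix of correct/incorrect
        decoys  = [random.gauss(0.0, 1.0) for _ in range(900)]  # incorrect by construction

        def fdr_at(cut):
            t = sum(s >= cut for s in targets)
            d = sum(s >= cut for s in decoys)
            return (d / t if t else 1.0), t

        # Scan cutoffs from loose to strict; stop at the first meeting 1% FDR,
        # which maximizes the number of accepted PSMs under the budget.
        for cut in [x * 0.1 for x in range(0, 80)]:
            fdr, n_psms = fdr_at(cut)
            if fdr <= 0.01:
                print(f"cutoff={cut:.1f}: {n_psms} PSMs at estimated FDR {fdr:.3f}")
                break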

  3. Reticle defect sizing of optical proximity correction defects using SEM imaging and image analysis techniques

    Science.gov (United States)

    Zurbrick, Larry S.; Wang, Lantian; Konicek, Paul; Laird, Ellen R.

    2000-07-01

    Sizing of programmed defects on optical proximity correction (OPC) features is addressed using high resolution scanning electron microscope (SEM) images and image analysis techniques. A comparison and analysis of different sizing methods is made. This paper addresses the issues of OPC defect definition and discusses the experimental measurement results obtained by SEM in combination with image analysis techniques.

  4. Research Progress on Chemical Sterility Technique to Pest Control%化学不育技术在害虫防治中的研究进展

    Institute of Scientific and Technical Information of China (English)

    王建斌

    2011-01-01

    The study summarizes the chemical sterility technique in terms of its sterility principle and advantages, the main kinds of sterilants, the application of the technique to pest control, and research progress at home and abroad.

  5. Analysis of activity in open-source communities using social network analysis techniques

    OpenAIRE

    Martínez Torres, María del Rocío

    2014-01-01

    The success of an open-source software project is closely linked to the successful organization and development of the underlying virtual community. In particular, participation is the most important mechanism by which the development of the project is supported. The main objective of this paper is to analyse the online participation in virtual communities using social network analysis techniques in order to obtain the main patterns of behaviour of users within communities. Sev...

  6. Analysis Of Machine Learning Techniques By Using Blogger Data

    Directory of Open Access Journals (Sweden)

    Gowsalya.R,

    2014-04-01

    Full Text Available Blogs are a recent, fast-progressing medium which depends on information systems and technological advancement. In developing countries, where the mass media is not well developed and government schemes are framed in governmental terms, blogs provide a channel for sharing knowledge and ideas. This article has highlighted and performed simulations on obtained information, 100 instances of bloggers, using the Weka 3.6 tool, applying many machine learning algorithms and analyzing the values of accuracy, precision, recall and F-measure to anticipate users' future tendency toward blogging for use in strategic areas.

  7. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  8. COMPARISON AND ANALYSIS OF VARIOUS HISTOGRAM EQUALIZATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    MADKI.M.R

    2012-04-01

    Full Text Available The intensity histogram gives information which can be used for contrast enhancement. Histogram equalization, however, can leave the equalized histogram occupying fewer levels than the total number available, which can deteriorate the image. This problem can be overcome by various techniques. This paper gives a comparison of the Bi-Histogram Equalization, Recursive Mean Separated Histogram Equalization, Multipeak Histogram Equalization and Brightness Preserving Dynamic Histogram Equalization techniques by using them on a few standard test images. The bi-histogram method uses independent histograms over two separate subimages. The recursive method uses several subimages. Multipeak Histogram Equalization detects the peaks in the histogram, and the subimages are formed based on the number detected. Brightness Preserving Dynamic Histogram Equalization improves contrast while maintaining the brightness of the image. We compare the results through the metric parameters of absolute mean brightness error and peak signal-to-noise ratio.
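
    Of the variants compared, bi-histogram equalization is the simplest to sketch: split the histogram at the mean intensity and equalize the two halves independently over their own output ranges, so the overall brightness shifts far less than with global equalization. A minimal NumPy version, with a synthetic test image:

        import numpy as np

        def bbhe(img):
            # Brightness-preserving bi-histogram equalization (BBHE) sketch.
            m = int(img.mean())
            out = np.empty_like(img)
            for lo, hi, mask in [(0, m, img <= m), (m + 1, 255, img > m)]:
                vals = img[mask]
                if vals.size == 0:
                    continue
                hist, _ = np.histogram(vals, bins=hi - lo + 1, range=(lo, hi + 1))
                cdf = hist.cumsum() / vals.size
                # Map each sub-histogram onto its own range so the two halves
                # never cross the mean, which limits the brightness shift.
                lut = (lo + cdf * (hi - lo)).astype(img.dtype)
                out[mask] = lut[vals - lo]
            return out

        rng = np.random.default_rng(0)
        img = np.clip(rng.normal(90, 25, (64, 64)), 0, 255).astype(np.uint8)
        print(img.mean(), bbhe(img).mean())   # output mean stays near the input mean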

  9. Experimental Analysis of Small Scale PCB Manufacturing Techniques for Fablabs

    Directory of Open Access Journals (Sweden)

    Yannick Verbelen

    2013-04-01

    Full Text Available In this paper we present a complete modular PCB manufacturing process on fablab scale that is compliant with current PCB manufacturing standards. This includes, but is not limited to, a minimum track width of 8 mil, a minimum clearance of 6 mil, plated and non plated holes, a solder resist, surface finish and component overlay. We modularize industrial manufacturing processes and discuss advantages and disadvantages of production techniques for every phase. We then proceed to discuss the relevance and added value of every phase in the manufacturing process and their usefulness in a fablab context. Production techniques are evaluated regarding complexity, overhead, safety, required time, and environmental concerns. To ensure practical feasibility of the presented techniques, the manufacturing process is benchmarked in FablabXL and aims to be a practical reference for implementing or extending PCB manufacturing activities in fablabs.

  10. Comparative study of Authorship Identification Techniques for Cyber Forensics Analysis

    Directory of Open Access Journals (Sweden)

    Smita Nirkhi

    2013-06-01

    Full Text Available Authorship identification techniques are used to identify the most likely author of online messages from a group of potential suspects and to find evidence supporting that conclusion. Cybercriminals misuse online communication to send blackmail or spam email and then attempt to hide their true identities to avoid detection. Authorship identification of online messages is a contemporary research issue for identity tracing in cyber forensics. This is a highly interdisciplinary area, as it takes advantage of machine learning, information retrieval, and natural language processing. In this paper, a study of recent techniques and automated approaches to attributing authorship of online messages is presented. The focus of this review is to summarize the existing authorship identification techniques used in the literature to identify authors of online messages. It also discusses evaluation criteria and parameters for authorship attribution studies and lists open questions that will attract future work in this area.

  11. A quantitative analysis of rotary, ultrasonic and manual techniques to treat proximally flattened root canals

    Directory of Open Access Journals (Sweden)

    Fabiana Soares Grecca

    2007-04-01

    Full Text Available OBJECTIVE: The efficiency of rotary, manual and ultrasonic root canal instrumentation techniques was investigated in proximally flattened root canals. MATERIAL AND METHODS: Forty human mandibular left and right central incisors, lateral incisors and premolars were used. The pulp tissue was removed and the root canals were filled with red dye. Teeth were instrumented using three techniques: (i) K3 and ProTaper rotary systems; (ii) an ultrasonic crown-down technique; and (iii) a progressive manual technique. Roots were bisected longitudinally in a buccolingual direction. The instrumented canal walls were digitally captured and the images obtained were analyzed using the Sigma Scan software. Canal walls were evaluated for total canal wall area versus non-instrumented area on which dye remained. RESULTS: No statistically significant difference was found between the instrumentation techniques studied (p<0.05). CONCLUSION: The findings of this study showed that no instrumentation technique was 100% efficient in removing the dye.

  12. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, the data mining method has disadvantages as well as advantages. Therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to know consumer behavior, his psychological condition at the time of purchase and how suitable data mining method apply to improve conventional method. Moreover, in an experiment, association rule is employed to mine rules for trusted customers using sales data in a super market industry
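
    The association-rule step mentioned in the experiment reduces, at its core, to support and confidence counting. A toy pass over invented supermarket baskets (items and thresholds are illustrative only):

        from itertools import combinations

        baskets = [
            {"bread", "milk"}, {"bread", "butter"}, {"milk", "butter"},
            {"bread", "milk", "butter"}, {"bread", "milk"},
        ]
        MIN_SUP, MIN_CONF = 0.4, 0.6

        def support(itemset):
            # Fraction of baskets containing the whole itemset.
            return sum(itemset <= bk for bk in baskets) / len(baskets)

        items = sorted({i for bk in baskets for i in bk})
        for a, b in combinations(items, 2):
            for lhs, rhs in [({a}, {b}), ({b}, {a})]:
                sup = support(lhs | rhs)
                conf = sup / support(lhs) if support(lhs) else 0.0
                if sup >= MIN_SUP and conf >= MIN_CONF:
                    print(f"{sorted(lhs)} -> {sorted(rhs)}  sup={sup:.2f} conf={conf:.2f}")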

  13. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
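
    The core of the information distribution step can be sketched directly: each crisp observation spreads its unit of information over a grid of monitoring points through a Gaussian kernel, turning a handful of fire-loss observations into a smooth frequency estimate. The sample and the diffusion coefficient h below are invented for illustration:

        import numpy as np

        x = np.array([2.0, 3.5, 4.0, 6.5, 7.0])   # small sample, e.g. yearly losses
        u = np.linspace(0.0, 10.0, 21)            # monitoring points
        h = 1.2                                   # diffusion coefficient (assumed)

        # q[i, j]: share of observation i diffused to point j; rows sum to 1,
        # so each observation still carries exactly one unit of information.
        f = np.exp(-((x[:, None] - u[None, :]) ** 2) / (2 * h * h))
        q = f / f.sum(axis=1, keepdims=True)

        p = q.sum(axis=0) / len(x)                # estimated probability weights
        for uj, pj in zip(u, p):
            if pj > 0.02:
                print(f"loss ~ {uj:4.1f}: weight {pj:.3f}")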

  14. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987 in order to assess the present technical status of nuclear borehole logging techniques, to find out the well established applications and the development trends. It contains a summary report giving a comprehensive overview of the techniques and applications and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  15. DATA MINING TECHNIQUES: A SOURCE FOR CONSUMER BEHAVIOR ANALYSIS

    Directory of Open Access Journals (Sweden)

    Abhijit Raorane

    2011-09-01

    Full Text Available Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, the data mining method has disadvantages as well as advantages. Therefore, it is important to select appropriate techniques to mine databases. The objective of this paper is to know consumer behavior, his psychological condition at the time of purchase and how suitable data mining method apply to improve conventional method. Moreover, in an experiment, association rule is employed to mine rules for trusted customers using sales data in a super market industry

  16. [An analysis of key points for root canal therapy technique].

    Science.gov (United States)

    Fan, M W

    2016-08-01

    The success rate of root canal therapy (RCT) has improved continuously with the advancement of RCT techniques over the past several decades. If the standard procedures of modern RCT techniques are strictly followed, the success rate of RCT may exceed 90%. The success of RCT is mainly affected by such factors as a clear understanding of root canal anatomy, proper mechanical and chemical preparation, and complete filling of the root canal system. If these factors are sufficiently attended to, success is easy to achieve. Even if the primary RCT fails, retreatment can be conducted to save the diseased tooth. PMID:27511032

  17. Sixth Australian conference on nuclear techniques of analysis: proceedings

    International Nuclear Information System (INIS)

    These proceedings contain the abstracts of 77 lectures. The topics focus on instrumentation, nuclear techniques and their applications for material science, surfaces, archaeometry, art, geological, environmental and biomedical studies. An outline of the Australian facilities available for research purposes is also provided. Separate abstracts were prepared for the individual papers in this volume

  18. Analysis on Poe's Unique Techniques to Achieve Aestheticism

    Institute of Scientific and Technical Information of China (English)

    孔佳鸣

    2008-01-01

    Edgar Allan Poe was one of the most important poets in American poetic history for his unremitting pursuit of 'ideal beauty'. This essay shows, through various examples chosen from his poems, that his aestheticism was evident in his versification techniques. His poetic theory and practice set an immortal example for the development of English poetry.

  19. Analysis of ISO 26262 Compliant Techniques for the Automotive Domain

    NARCIS (Netherlands)

    Kannan, M. S.; Dajsuren, Y.; Luo, Y.; Barosan, I.

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying with ISO 26262 have been developed. However, it is not clear which parts and (sub-)phases of the standard are targeted

  20. Design and Testing Analysis of Requirement Prioritizations Technique

    Directory of Open Access Journals (Sweden)

    Dinesh Singh

    2015-11-01

    Full Text Available With the growing need for software in our day-to-day lives, the complexity of software is increasing, as is the number of requirements associated with modern software projects. To cope with the increasing demands and the pressure on software engineers and program managers to deliver software to customers on time and within budget, there is a strong need to identify the most important requirements and establish their relative importance for implementation according to certain criteria. The existing techniques for requirement prioritization, although they provide consistent results, are difficult to use and implement, whereas some techniques that are easy to apply lack the structure to analyze complex requirements. Moreover, the available techniques lack user friendliness in the prioritization process. To overcome these problems, a hybrid approach combining two available techniques was proposed in our earlier work. In this paper we analyze the design of the proposed system and the testing plan of the system. A use case diagram and a control flow diagram are used to explain the structure of the approach.

  2. New techniques for positron emission tomography in the study of human neurological disorders. Progress report, June 1990--June 1993

    Energy Technology Data Exchange (ETDEWEB)

    Kuhl, D.E.

    1993-06-01

    This progress report describes the accomplishments of four programs: (1) Faster, simpler processing of positron-emitting precursors: new physicochemical approaches; (2) Novel solid phase reagents and methods to improve radiosynthesis and isotope production; (3) Quantitative evaluation of the extraction of information from PET images; and (4) Optimization of tracer kinetic methods for radioligand studies in PET.

  3. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flower in order to amend the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g in weight with methylene chloride, considering that in its extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by extracting six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated according to the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. Total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, withdrawal of flavonoid aglycones with ethyl acetate, and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  4. Progress and application of DNA barcoding technique in plants

    Institute of Scientific and Technical Information of China (English)

    刘宇婧; 刘越; 黄耀江; 龙春林

    2011-01-01

    Based on a summary and analysis of the development of the DNA barcoding technique, the research progress of DNA barcoding in plants, its workflow and analysis methods, the factors influencing its identification accuracy, and its application status and existing disputes in plant taxonomic study are comprehensively analyzed and described, and further development trends and application prospects of DNA barcoding in plants are proposed. Research examples indicate that combining the plant DNA barcoding technique with traditional botanical knowledge can serve as one approach for ethnobotanical study. Common DNA barcoding in plants mainly follows two modes, single-fragment and multi-fragment combinations, each with its respective advantages and disadvantages. Common DNA sequences include matK, trnH-psbA, rbcL and ITS, etc., but all have certain limitations, and different DNA barcoding standards in plants should be selected according to the application aims. Factors influencing identification accuracy include the type and number of species, the construction method of the phylogenetic tree, hybridization and gene introgression, variance in species origin time and variance in molecular evolution rate. The current focus of DNA barcoding in plants is how to select suitable DNA fragments and evaluate their value.

  5. Rates of progression in diabetic retinopathy during different time periods: a systematic review and meta-analysis

    DEFF Research Database (Denmark)

    Wong, Tien Y; Mwamburi, Mkaya; Klein, Ronald;

    2009-01-01

    This meta-analysis reviews rates of progression of diabetic retinopathy to proliferative diabetic retinopathy (PDR) and/or severe visual loss (SVL) and temporal trends.

  6. Qualitative analysis of the elliptical centric technique and the TRICKS technique

    Science.gov (United States)

    Dong, Kyung-Rae; Goo, Eun-Hoe; Lee, Jae-Seung; Chung, Woon-Kwan

    2013-02-01

    This study evaluated the usefulness of time resolved imaging of contrast kinetics (TRICKS) magnetic resonance angiography (MRA) and elliptical centric MRA according to the type of cerebral disease. From February 2010 to January 2012, elliptical centric MRA and TRICKS MRA images were acquired from 50 normal individuals and 50 patients with cerebral diseases by using 3.0-Tesla magnetic resonance imaging (MRI) equipment. The images were analyzed qualitatively by examining areas such as the presence or absence of artifacts on the images, the distinctness of boundaries of blood vessels, accurate representation of the lesions, and the subtraction level. In addition, the sensitivity, specificity, positive prediction rate, negative prediction rate and accuracy were assessed by comparing the diagnostic efficacy of the two techniques. The results revealed TRICKS MRA to have superior image quality to elliptical centric MRA. Regarding each disease, TRICKS MRA showed higher diagnostic efficacy for artery venous malformation (AVM) and middle cerebral artery (MCA) bypass patients whereas elliptical centric MRA was more suitable for patients with brain tumors, cerebral infarction, cerebral stenosis or sinus mass.

  7. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences between cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems level.
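
    The within-class versus across-class similarity idea can be sketched numerically. The toy Python snippet below scores one hypothetical pathway by comparing each sample's mean genotype similarity to its own class against its similarity to the other class; it is not the authors' exact PoDA statistic, and all data are simulated.

        import numpy as np

        rng = np.random.default_rng(0)
        # Rows = samples, columns = SNPs in one pathway, entries = minor-allele
        # counts (0, 1, 2); first 20 samples are cases, last 20 are controls.
        G = rng.integers(0, 3, size=(40, 12))
        y = np.array([1] * 20 + [0] * 20)

        def mean_similarity(i, group):
            """Mean fraction of matching genotypes between sample i and a group."""
            others = [j for j in np.flatnonzero(group) if j != i]
            return np.mean([(G[i] == G[j]).mean() for j in others])

        # Positive scores suggest the pathway's SNPs separate cases from controls.
        score = np.mean([
            mean_similarity(i, y == y[i]) - mean_similarity(i, y != y[i])
            for i in range(len(y))
        ])
        print(f"pathway distinction score: {score:+.4f}")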

  8. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... of the offeror's cost trends, on the basis of current and historical cost or pricing data; (C... required. (2) Price analysis shall be used when certified cost or pricing data are not required (see paragraph (b) of this subsection and 15.404-3). (3) Cost analysis shall be used to evaluate...

  9. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

    Full Text Available Sentiment analysis is concerned with the analysis of emotions and opinions in text; it is also referred to as opinion mining. Sentiment analysis identifies and justifies the sentiment of a person with respect to a given source of content. Social media contain a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this largely generated data is very useful for expressing the opinion of the mass. Twitter sentiment analysis is tricky compared to broad sentiment analysis because of slang words, misspellings and repeated characters, and the maximum length of each tweet in Twitter is 140 characters, so it is very important to identify the correct sentiment of each word. In our project we propose a highly accurate model for sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of a feature vector and classifiers such as support vector machines and Naïve Bayes, we correctly classify these tweets as positive, negative or neutral to give the sentiment of each tweet.
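
    A minimal sketch of the classification step, using scikit-learn in place of whatever implementation the authors used; the example tweets and labels are invented, and a real model would be trained on a large labelled corpus.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        # Hypothetical labelled tweets about a movie
        tweets = [
            "Loved the trailer, cannot wait for this movie!",
            "Absolutely brilliant acting and story",
            "Worst script ever, total waste of time",
            "The movie was okay, nothing special",
            "Terrible pacing, I walked out halfway",
            "What a masterpiece, stunning visuals",
        ]
        labels = ["positive", "positive", "negative", "neutral", "negative", "positive"]

        # Feature vectors (word and bigram TF-IDF) feeding a Naive Bayes classifier
        model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), MultinomialNB())
        model.fit(tweets, labels)
        print(model.predict(["the visuals were stunning but the script was terrible"]))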

  10. Empirical Analysis of Data Mining Techniques for Social Network Websites

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2014-02-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of unprocessed raw data records; by analyzing this data, new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques may not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the data content available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  11. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2015-11-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of unprocessed raw data records; by analyzing this data, new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques may not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the data content available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  12. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software is comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we review the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes the best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. The study also highlights the importance of security requirements: though they are part of the non-functional requirements, they are naturally considered fundamental to secure software development.

  13. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent, present naturally in groundwater due to some minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may occur from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water by using high-tech instruments like the atomic absorption spectrometer (hydride generation). Because arsenic concentrations as low as 1 ppb cannot easily be determined with a simple spectrophotometric technique, the spectrophotometric method using silver diethyldithiocarbamate was modified to achieve better results, down to arsenic concentrations of 1 ppb.

  14. Analysis of kidney stones by PIXE and RBS techniques

    International Nuclear Information System (INIS)

    Human kidney stones were analyzed by PIXE and RBS techniques using a 2 MeV He++ beam. The stones were found to contain the elements C, N, O, F, Na, Mg, Si, P, S, Cl, K, Ca, Fe and Br. Results obtained by PIXE agree with the results obtained by RBS within experimental errors. A mechanism for the formation of the kidney stones is suggested. 3 figs., 1 tab

  15. Document analysis by means of data mining techniques

    OpenAIRE

    Jabeen, Saima

    2014-01-01

    The huge amount of textual data produced every day by scientists, journalists and Web users allows investigating many different aspects of the information stored in published documents. Data mining and information retrieval techniques are exploited to manage and extract information from huge amounts of unstructured textual data. Text mining, also known as text data mining, is the process of extracting high-quality information (focusing on relevance, novelty and interestingness) from text by iden...

  16. Analysis of deployment techniques for webbased applications in SMEs

    OpenAIRE

    Browne, Cathal

    2011-01-01

    The Internet is no longer just a source for accessing information; it has become a valuable medium for social networking and software services. Web browsers can now access entire software systems available online to provide the user with a range of services. The concept of software as a service (SAAS) was born out of this. The number of development techniques and frameworks for such web applications has grown rapidly, and much research and development has been carried out on adva...

  17. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    OpenAIRE

    S.G.S. Fernando; S.N. Perera

    2015-01-01

    Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. The social networks contain millions of unprocessed raw data. By analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured traditional data mining techniques will not be appropriate. Web data mining is an interesting field with vast amount of applications. With...

  18. Empirical Analysis of Data Mining Techniques for Social Network Websites

    OpenAIRE

    S.G.S. Fernando; MdGaparMdJohar; S.N. Perera

    2014-01-01

    Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. The social networks contain millions of unprocessed raw data. By analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured traditional data mining techniques will not be appropriate. Web data mining is an interesting field with vast amount of ap...

  19. Reduced-Order Blade Mistuning Analysis Techniques Developed for the Robust Design of Engine Rotors

    Science.gov (United States)

    Min, James B.

    2004-01-01

    The primary objective of this research program is to develop vibration analysis tools, design tools, and design strategies to significantly improve the safety and robustness of turbine engine rotors. Bladed disks in turbine engines always feature small, random blade-to-blade differences, or mistuning. Mistuning can lead to a dramatic increase in blade forced-response amplitudes and stresses. Ultimately, this results in high-cycle fatigue, which is a major safety and cost concern. In this research program, the necessary steps will be taken to transform a state-of-the-art vibration analysis tool, the Turbo-Reduce forced-response prediction code, into an effective design tool by enhancing and extending the underlying modeling and analysis methods. Furthermore, novel techniques will be developed to assess the safety of a given design. In particular, a procedure will be established for using eigenfrequency curve veerings to identify "danger zones" in the operating conditions--ranges of rotational speeds and engine orders in which there is a great risk that the rotor blades will suffer high stresses. This work also will aid statistical studies of the forced response by reducing the necessary number of simulations. Finally, new strategies for improving the design of rotors will be pursued. Several methods will be investigated, including the use of intentional mistuning patterns to mitigate the harmful effects of random mistuning, and the modification of disk stiffness to avoid reaching critical values of interblade coupling in the desired operating range. Recent research progress is summarized in the following paragraphs. First, significant progress was made in the development of the component mode mistuning (CMM) and static mode compensation (SMC) methods for reduced-order modeling of mistuned bladed disks (see the following figure). The CMM method has been formalized and extended to allow a general treatment of mistuning. In addition, CMM allows individual mode

  20. Analysis of Acoustic Emission Signals using Wavelet Transformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Full Text Available Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to find the occurrence of any crack growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to the above phenomena. With various filtering/thresholding techniques, it was found that the original signals were filtered out along with the noise. The wavelet transformation technique is found to be more appropriate for analysing AE signals under such situations, and is used to de-noise the AE data. The de-noised signal is classified to identify a signature based on the type of phenomenon. Defence Science Journal, 2008, 58(4), pp. 559-564, DOI: http://dx.doi.org/10.14429/dsj.58.1677
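
    The de-noising step can be sketched with PyWavelets: decompose the signal, soft-threshold the detail coefficients, and reconstruct. The waveform below is a synthetic stand-in for an AE burst, and the wavelet, level and threshold rule are assumptions rather than the authors' settings.

        import numpy as np
        import pywt

        rng = np.random.default_rng(1)
        t = np.linspace(0, 1, 1024)
        burst = np.exp(-40 * t) * np.sin(2 * np.pi * 150 * t)   # decaying AE-like burst
        signal = burst + 0.2 * rng.standard_normal(t.size)      # add background noise

        coeffs = pywt.wavedec(signal, "db4", level=5)
        sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # robust noise estimate
        thresh = sigma * np.sqrt(2 * np.log(signal.size))       # universal threshold
        coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
        denoised = pywt.waverec(coeffs, "db4")[:signal.size]

        print(f"residual noise std: {np.std(signal - burst):.3f} -> "
              f"{np.std(denoised - burst):.3f}")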

  1. An ASIC Low Power Primer Analysis, Techniques and Specification

    CERN Document Server

    Chadha, Rakesh

    2013-01-01

    This book provides an invaluable primer on the techniques utilized in the design of low power digital semiconductor devices. Readers will benefit from the hands-on approach, which starts from the ground up, explaining with basic examples what power is, how it is measured and how it impacts the design process of application-specific integrated circuits (ASICs). The authors use both the Unified Power Format (UPF) and Common Power Format (CPF) to describe in detail the power intent for an ASIC, and then guide readers through a variety of architectural and implementation techniques that will help meet the power intent. From analyzing system power consumption, to techniques that can be employed in a low power design, to a detailed description of two alternate standards for capturing the power directives at various phases of the design, this book is filled with information that will give ASIC designers a competitive edge in low-power design. Starts from the ground up and explains what power is, how it is measur...

  2. Annual progress report 1981

    International Nuclear Information System (INIS)

    This annual progress report of the CEA Protection and Nuclear Safety Institute briefly describes the progress made in each section of the Institute. Research activities of the Protection department include radiation effects on man, radioecology and environmental radioprotection techniques. Research activities of the Nuclear Safety department include reactor safety analysis, fuel cycle facilities safety analysis and safety research programs. The third section deals with nuclear material security, including security of facilities, security of nuclear material transport and monitoring of nuclear material management

  3. Recent Progresses in Analysis of Tongue Manifestation for Traditional Chinese Medicine

    Institute of Scientific and Technical Information of China (English)

    WEI Bao-guo; CAI Yi-heng; ZHANG Xin-feng; SHEN Lan-sun

    2005-01-01

    Tongue diagnosis is one of the most precious and widely used diagnostic methods in Traditional Chinese Medicine (TCM). However, due to its subjective, qualitative and experience-dependent nature, studies on tongue characterization have been widely emphasized. This paper surveys recent progress in the analysis of tongue manifestation. These new developments include cross-network and cross-media color reproduction of the tongue image, knowledge-based automatic segmentation of the tongue body, automatic analysis of the curdiness and griminess of the tongue fur, and automatic analysis of the plumpness, wryness and dot-thorn of the tongue body. Clinical experiments verify the validity of these new methods.

  4. Assessing Progress towards Public Health, Human Rights, and International Development Goals Using Frontier Analysis.

    Science.gov (United States)

    Luh, Jeanne; Cronk, Ryan; Bartram, Jamie

    2016-01-01

    Indicators to measure progress towards achieving public health, human rights, and international development targets, such as 100% access to improved drinking water or zero maternal mortality ratio, generally focus on status (i.e., level of attainment or coverage) or trends in status (i.e., rates of change). However, these indicators do not account for the different levels of development that countries experience, making it difficult to compare progress between countries. We describe a recently developed use of frontier analysis and apply this method to calculate country performance indices in three areas: maternal mortality ratio, poverty headcount ratio, and primary school completion rate. Frontier analysis is used to identify the maximum achievable rates of change, defined by the historically best-performing countries, as a function of coverage level. Performance indices are calculated by comparing a country's rate of change against the maximum achievable rate at the same coverage level. A country's performance can be positive or negative, corresponding to progression or regression, respectively. The calculated performance indices allow countries to be compared against each other regardless of whether they have only begun to make progress or whether they have almost achieved the target. This paper is the first to use frontier analysis to determine the maximum achievable rates as a function of coverage level and to calculate performance indices for public health, human rights, and international development indicators. The method can be applied to multiple fields and settings, for example health targets such as cessation of smoking or specific vaccine immunizations, and offers both a new approach to analyze existing data and a new data source for consideration when assessing progress achieved.
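
    A toy numerical sketch of the performance-index idea: take each country's observed rate of change, estimate the frontier as the best observed rate at a similar coverage level, and report the ratio. The data, binning and index definition below are invented simplifications of the frontier analysis described above.

        import numpy as np

        # Hypothetical (coverage %, annual rate of change) pairs for eight countries
        coverage = np.array([20, 35, 50, 55, 70, 82, 90, 95], dtype=float)
        rate = np.array([3.0, 4.5, 2.0, 5.0, 1.5, 2.5, 0.8, 0.4])

        # Empirical frontier: best observed rate within each coverage bin
        bins = np.array([0, 25, 50, 75, 100], dtype=float)
        idx = np.digitize(coverage, bins) - 1
        frontier = np.array([rate[idx == b].max() for b in range(len(bins) - 1)])

        # Performance index: observed rate relative to the maximum achievable
        # rate at the same coverage level
        performance = rate / frontier[idx]
        for c, r, p in zip(coverage, rate, performance):
            print(f"coverage {c:5.1f}%  rate {r:+.1f}  performance {p:.2f}")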

  6. Statistical analysis of heartbeat data with wavelet techniques

    Science.gov (United States)

    Pazsit, Imre

    2004-05-01

    The purpose of this paper is to demonstrate the use of some methods of signal analysis, performed on ECG and in some cases blood pressure signals, for classifying the health status of the hearts of mice and rats. Spectral and wavelet analyses were performed on the raw signals. FFT-based coherence and phase were also calculated between the blood pressure and raw ECG signals. Finally, RR intervals were deduced from the ECG signals and an analysis of the fractal dimensions was performed. A correlation was found between the health status of the mice and rats and some of the statistical descriptors, most notably the phase of the cross-spectra between ECG and blood pressure, and the fractal properties and dimensions of the interbeat series (RR-interval fluctuations).
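
    As an illustration of the fractal-dimension part of such an analysis, the sketch below applies Higuchi's method to a simulated RR-interval series; the series is synthetic, and kmax and the RR model are arbitrary choices, not the paper's settings.

        import numpy as np

        def higuchi_fd(x, kmax=8):
            """Estimate the fractal dimension of a 1-D series (Higuchi's method)."""
            n = len(x)
            lk = []
            for k in range(1, kmax + 1):
                lengths = []
                for m in range(k):
                    idx = np.arange(m, n, k)
                    dist = np.abs(np.diff(x[idx])).sum()
                    # normalise curve length for this offset and scale
                    lengths.append(dist * (n - 1) / ((len(idx) - 1) * k) / k)
                lk.append(np.mean(lengths))
            k = np.arange(1, kmax + 1)
            slope, _ = np.polyfit(np.log(1.0 / k), np.log(lk), 1)
            return slope

        # Synthetic RR-interval series (seconds), standing in for intervals
        # derived from R-peak detection on a real ECG trace
        rng = np.random.default_rng(2)
        rr = 0.8 + 0.005 * np.cumsum(rng.standard_normal(500))
        print(f"Higuchi fractal dimension of RR series: {higuchi_fd(rr):.2f}")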

  7. Automated image analysis techniques for cardiovascular magnetic resonance imaging

    NARCIS (Netherlands)

    Geest, Robertus Jacobus van der

    2011-01-01

    The introductory chapter provides an overview of various aspects related to quantitative analysis of cardiovascular MR (CMR) imaging studies. Subsequently, the thesis describes several automated methods for quantitative assessment of left ventricular function from CMR imaging studies. Several novel

  8. Statistical Performance Analysis and Modeling Techniques for Nanometer VLSI Designs

    CERN Document Server

    Shen, Ruijing; Yu, Hao

    2012-01-01

    Since process variation and chip performance uncertainties have become more pronounced as technologies scale down into the nanometer regime, accurate and efficient modeling or characterization of variations from the device to the architecture level have  become imperative for the successful design of VLSI chips. This book provides readers with tools for variation-aware design methodologies and computer-aided design (CAD) of VLSI systems, in the presence of process variations at the nanometer scale. It presents the latest developments for modeling and analysis, with a focus on statistical interconnect modeling, statistical parasitic extractions, statistical full-chip leakage and dynamic power analysis considering spatial correlations, statistical analysis and modeling for large global interconnects and analog/mixed-signal circuits.  Provides readers with timely, systematic and comprehensive treatments of statistical modeling and analysis of VLSI systems with a focus on interconnects, on-chip power grids and ...

  9. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made in the last years in the field of electrophysiological data analysis. Most of the work was done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) ...

  10. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente;

    2005-01-01

    Aqueous selenium standards were separated within 1.2 min on a 1.00 id x 50 mm reversed-phase column in an ion-pair chromatographic system using a flow rate of 200 µL min(-1). Hence, analysis times could be reduced to 1/10 compared with ordinary HPLC for aqueous standards. The precision and detection ... the use of short columns. Hence, analysis times could be halved without loss of separation efficiency in this biological sample...

  11. General Approach To Materials Classification Using Neutron Analysis Techniques

    International Nuclear Information System (INIS)

    The 'neutron in, gamma out' method of elemental analysis has been known and used in many applications as an elemental analysis tool. This method is non-intrusive, non-destructive, fast and precise. This set of advantages makes neutron analysis attractive for an even wider variety of uses beyond simple elemental analysis. The question addressed in this study is under what conditions neutron analysis can be used to differentiate materials of interest from a group or class of materials, given that what is truly of interest is the molecular content of any sample under interrogation. The purpose of the study was to develop a neutron-based scanner for rapid differentiation of classes of materials sealed in small bottles. The developed scanner employs a D-T neutron generator as the neutron source and HPGe gamma detectors. Materials can be placed into classes by many different properties; however, the neutron analysis method can use only a few of them, such as elemental content, stoichiometric ratios and the density of the scanned material. The set of parameters obtainable through neutron analysis serves as the basis for a hyperspace, in which each point corresponds to a certain scanned material and sub-volumes correspond to different classes of materials. Among the most important properties of the materials are the stoichiometric ratios of the elements comprising them. Constructing an algorithm for converting the observed gamma-ray counts into quantities of the elements in the scanned sample is a crucial part of the analysis. Gamma rays produced in both fast inelastic scatterings and neutron captures are considered. The presence of certain elements in materials, such as hydrogen and chlorine, can significantly change the neutron dynamics within the sample and, in turn, the development of the characteristic gamma lines. These effects have been studied and corresponding algorithms have been developed to account for them
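
    The count-to-quantity step has the flavour of linear unmixing: observed window counts are modelled as a non-negative combination of per-element library responses. The sketch below solves such a system with non-negative least squares; the library matrix and quantities are entirely hypothetical, and the real algorithm must also handle the neutron-transport effects mentioned above.

        import numpy as np
        from scipy.optimize import nnls

        # Rows = gamma energy windows, columns = elements (H, Cl, C, O);
        # entries = counts per unit quantity of each element (hypothetical)
        library = np.array([
            [0.90, 0.05, 0.02, 0.01],
            [0.05, 0.80, 0.05, 0.02],
            [0.02, 0.05, 0.85, 0.10],
            [0.01, 0.05, 0.05, 0.80],
            [0.02, 0.05, 0.03, 0.07],
        ])
        true_q = np.array([2.0, 0.5, 1.0, 3.0])
        counts = library @ true_q + 0.01 * np.random.default_rng(5).standard_normal(5)

        q, _ = nnls(library, counts)   # non-negative element quantities
        print("estimated element quantities:", np.round(q, 2))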

  12. Analysis of the changes in keratoplasty indications and preferred techniques.

    Directory of Open Access Journals (Sweden)

    Stefan J Lang

    Full Text Available Recently, novel techniques introduced to the field of corneal surgery, e.g. Descemet membrane endothelial keratoplasty (DMEK) and corneal crosslinking, have extended the therapeutic options. Additionally, contact lens fitting has developed new alternatives. We herein investigated whether these techniques have affected the volume and spectrum of indications for keratoplasty, both in a center more specialized in treating Fuchs' dystrophy (center 1) and in a second center more specialized in treating keratoconus (center 2). We retrospectively reviewed the waiting lists at both centers for indication, transplantation technique and the patients' travel distances to the hospital. We reviewed a total of 3778 procedures. Fuchs' dystrophy increased from 17% (42) to 44% (150) at center 1 and from 13% (27) to 23% (62) at center 2. In center 1, DMEK increased from zero percent in 2010 to 51% in 2013. In center 2, DMEK was not performed until 2013. The percentage of patients with keratoconus slightly decreased from 15% (36) in 2009 to 12% (40) in 2013 in center 1; the respective percentages in center 2 were 28% (57) and 19% (51). In both centers, the patients' travel distances increased. The results from center 1 suggest that DMEK might increase the total number of keratoplasties. The increase in travel distance suggests that this cannot be fully attributed to recruiting less advanced patients from the hospital's proximity; the increase is rather due to more referrals from other regions. The decrease in keratoconus patients at both centers is surprising and may be attributed to optimized contact lens fitting or even to the effect of the corneal crosslinking procedure.

  13. Analysis and RHBD technique of single event transients in PLLs

    International Nuclear Information System (INIS)

    The single-event transient (SET) susceptibility of phase-locked loops has been investigated. The charge pump is the component of the PLL most sensitive to SETs, and it is hard to mitigate this effect at the transistor level. A test circuit was designed on a 65 nm process using a new system-level radiation-hardening-by-design technique. Heavy-ion testing was used to evaluate the radiation hardness. Analyses and a discussion of the feasibility of this method are also presented. (paper)

  14. Analysis of non-linearity in differential wavefront sensing technique.

    Science.gov (United States)

    Duan, Hui-Zong; Liang, Yu-Rong; Yeh, Hsien-Chi

    2016-03-01

    An analytical model of the differential wavefront sensing (DWS) technique based on Gaussian beam propagation has been derived. The analytical model has been verified against the interference signals detected by a quadrant photodiode, calculated using a numerical method. Both the analytical model and the numerical simulation show a milliradian-level non-linearity effect in DWS detection. In addition, beam clipping has a strong influence on the non-linearity of DWS: the larger the beam clipping, the smaller the non-linearity. The beam walking effect, however, hardly influences DWS and can thus be ignored in laser interferometers. PMID:26974079

  15. Comparative Analysis of Different Fabric Defects Detection Techniques

    Directory of Open Access Journals (Sweden)

    Ali Javed

    2013-01-01

    Full Text Available In the last few years, textile companies have aimed to produce quality fabrics. A major loss for any textile-oriented company occurs due to defective fabrics, so the detection of faulty fabrics plays an important role in a company's success. Until now, most inspection has been done by human visual inspection, which is too time-consuming, cumbersome and prone to human error. In the past, many advances were made in developing automated and computerized systems to reduce cost and time while increasing the efficiency of the process. This paper aims at comparing some of these techniques on the basis of classification methods and accuracy.

  16. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  17. Improved X-ray diagnosis of stomach by progress in the development of contrast media and examination techniques

    International Nuclear Information System (INIS)

    Three factors have been responsible for the advances during the past few years in X-ray examination of the stomach: improvement of the contrast media used; introduction of the rare-earth foils; and examination techniques imaging all sections of the stomach and of the duodenal bulb under hypotension in double-contrast technique and in complete filling, and imaging the accessible sections by means of proper compression. An interesting technique employs a combination of two different barium sulphate suspensions used at the same time, e.g. Bubbly Barium or some other barium sulphate preparation with a small amount of High-Density Barium, yielding an excellent image of the gastric mucosa (technique with two contrast media). (orig.)

  18. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, and the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  19. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  20. Techniques of EMG signal analysis: detection, processing, classification and applications

    Science.gov (United States)

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis and to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers with a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694

  1. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F.; Rovani, Jr., Joseph F.

    2011-01-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing ability to dissolve. Applications include, but are by no means limited to estimation of a coking onset and solution (e.g., oil) fractionating.

  2. Cost-variance analysis by DRGs; a technique for clinical budget analysis.

    Science.gov (United States)

    Voss, G B; Limpens, P G; Brans-Brabant, L J; van Ooij, A

    1997-02-01

    In this article it is shown how a cost accounting system based on DRGs can be valuable in determining changes in clinical practice and explaining alterations in expenditure patterns from one period to another. A cost-variance analysis is performed using data from the orthopedic department from the fiscal years 1993 and 1994. Differences between predicted and observed cost for medical care, such as diagnostic procedures, therapeutic procedures and nursing care are analyzed into different components: changes in patient volume, case-mix differences, changes in resource use and variations in cost per procedure. Using a DRG cost accounting system proved to be a useful technique for clinical budget analysis. Results may stimulate discussions between hospital managers and medical professionals to explain cost variations integrating medical and economic aspects of clinical health care. PMID:10165044
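
    A toy decomposition in the spirit of the technique: the change in total cost between two years is split into volume, case-mix and cost-per-case components that sum exactly to the total variance. The DRG names and figures below are hypothetical.

        # (cases, cost per case) per DRG for the base and comparison years
        base = {"DRG209": (100, 5000.0), "DRG210": (50, 3000.0)}
        new = {"DRG209": (120, 5200.0), "DRG210": (40, 3300.0)}

        base_n = sum(n for n, _ in base.values())
        new_n = sum(n for n, _ in new.values())
        base_avg = sum(n * c for n, c in base.values()) / base_n

        # Volume: extra patients at the old average cost per case
        volume_var = (new_n - base_n) * base_avg
        # Case mix: shift in the DRG mix at old per-case costs
        casemix_var = sum((new[d][0] - base[d][0] * new_n / base_n) * base[d][1]
                          for d in base)
        # Price/resource use: change in cost per case at new volumes
        price_var = sum(new[d][0] * (new[d][1] - base[d][1]) for d in base)

        total = (sum(n * c for n, c in new.values())
                 - sum(n * c for n, c in base.values()))
        print(f"volume {volume_var:+.0f}, case mix {casemix_var:+.0f}, "
              f"price {price_var:+.0f} (total {total:+.0f})")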

  3. A New Computer-Aided Technique for Qualitative Document Analysis

    Science.gov (United States)

    Morris, David; Ecclesfield, Nigel

    2011-01-01

    The ever-increasing production of digital textual data in a wide variety of forms presents both opportunities and challenges to researchers. The opportunities derive from the rich availability of secondary data with which to work. The challenges are the familiar ones of lack of time and resources to undertake the analysis. The qualitative…

  4. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...

  5. Spectral analysis and filtering techniques in digital spatial data processing

    Science.gov (United States)

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capabilities to perform high-pass, band-pass, low-pass, and wedge filtering techniques. These filters are applied for analyzing satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and the digital elevation models (DEM) data. -from Author
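
    A compact sketch of the Gaussian notch idea on synthetic data: transform to the frequency domain, multiply by inverted Gaussians centred on the unwanted frequency and its mirror, and transform back. The image, notch position and width are invented; the toolbox's own filters are doubtless more elaborate.

        import numpy as np

        n = 256
        y, x = np.mgrid[0:n, 0:n]
        rng = np.random.default_rng(3)
        # Synthetic image: horizontal striping at frequency 20 plus mild noise
        image = np.sin(2 * np.pi * 20 * x / n) + 0.1 * rng.standard_normal((n, n))

        F = np.fft.fftshift(np.fft.fft2(image))
        u, v = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]

        def notch(u0, v0, sigma=3.0):
            """Gaussian notch: ~0 at (u0, v0), ~1 elsewhere."""
            return 1.0 - np.exp(-((u - u0) ** 2 + (v - v0) ** 2) / (2 * sigma ** 2))

        H = notch(0, 20) * notch(0, -20)        # suppress the peak and its mirror
        filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
        print(f"image std before {image.std():.3f}, after {filtered.std():.3f}")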

  6. Radio & Optical Interferometry: Basic Observing Techniques and Data Analysis

    CERN Document Server

    Monnier, John D

    2012-01-01

    Astronomers usually need the highest angular resolution possible, but the blurring effect of diffraction imposes a fundamental limit on the image quality from any single telescope. Interferometry allows light collected at widely-separated telescopes to be combined in order to synthesize an aperture much larger than an individual telescope thereby improving angular resolution by orders of magnitude. Radio and millimeter wave astronomers depend on interferometry to achieve image quality on par with conventional visible and infrared telescopes. Interferometers at visible and infrared wavelengths extend angular resolution below the milli-arcsecond level to open up unique research areas in imaging stellar surfaces and circumstellar environments. In this chapter the basic principles of interferometry are reviewed with an emphasis on the common features for radio and optical observing. While many techniques are common to interferometers of all wavelengths, crucial differences are identified that will help new practi...

  7. Practice patterns in FNA technique: A survey analysis

    Institute of Scientific and Technical Information of China (English)

    Christopher J. DiMaio; Jonathan M. Buscaglia; Seth A. Gross; Harry R. Aslanian; Adam J. Goodman; Sammy Ho; Michelle K. Kim; Shireen Pais; Felice Schnoll-Sussman; Amrita Sethi; Uzma D. Siddiqui; David H. Robbins; Douglas G. Adler; Satish Nagula

    2014-01-01

    AIM: To ascertain fine needle aspiration (FNA) techniques used by endosonographers with varying levels of experience and in varying environments. METHODS: A survey study was performed on United States-based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers and how these relate to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents identified themselves as high-volume endoscopic ultrasound (EUS) performers (> 150 EUS/year; 77.1%) and high-volume FNA performers (> 75 FNA/year; 73.3%). If final cytology was non-diagnostic, high-volume EUS physicians were more likely than low-volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low-volume physicians were more likely to refer patients for either surgical or percutaneous biopsy (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment.

  8. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    Science.gov (United States)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways for drift measurement depends on instrumentation based measurements, e.g. using an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift on the received signal, where the main cause of Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world or even Europe is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to an ionosonde, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two dimensional TEC map is constructed by using the IONOLAB-MAP tool which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real situation electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically with a period of as low as 30 seconds. By processing two consequent snapshots together and calculating the propagation paths, we estimate the drift measurements over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.

  9. Correlation analysis of energy indicators for sustainable development using multivariate statistical techniques

    International Nuclear Information System (INIS)

    Energy is an essential input for social development and economic growth. The production and use of energy cause environmental degradation at the local, regional and global levels: combustion of fossil fuels causes air pollution; hydropower often causes environmental damage due to the submergence of large areas of land; and global climate change is associated with the increasing concentration of greenhouse gases in the atmosphere. As mentioned in chapter 9 of Agenda 21, energy is essential to economic and social development and improved quality of life. Much of the world's energy, however, is currently produced and consumed in ways that could not be sustained if technologies were to remain constant and if overall quantities were to increase substantially. All energy sources will need to be used in ways that respect the atmosphere, human health, and the environment as a whole. Energy in the context of sustainable development needs a set of quantifiable parameters, called indicators, to measure and monitor important changes and significant progress towards the achievement of the objectives of sustainable development policies. The indicators are divided into four dimensions: social, economic, environmental and institutional. This paper presents a methodology of analysis using multivariate statistical techniques, which provide the ability to analyse complex sets of data. The main goal of this study is to explore the correlations among the indicators. The data used in this research work are an excerpt of IBGE (Instituto Brasileiro de Geografia e Estatistica) census data. The core indicators used in this study follow the IAEA (International Atomic Energy Agency) framework: Energy Indicators for Sustainable Development. (author)
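
    The correlation screen itself is straightforward to sketch; the snippet below computes pairwise correlations and the leading principal component for a few hypothetical indicator columns, standing in for the IBGE excerpt (whose actual variables are not reproduced here).

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(6)
        df = pd.DataFrame({
            "energy_use_per_capita": rng.normal(100, 20, 30),
            "income_per_capita": rng.normal(50, 10, 30),
            "co2_per_capita": rng.normal(5, 1, 30),
            "electrification_rate": rng.uniform(0.7, 1.0, 30),
        })

        # Pairwise Pearson correlations among the indicators
        print(df.corr().round(2))

        # Leading principal component: the dominant joint pattern
        z = (df - df.mean()) / df.std()
        eigvals, eigvecs = np.linalg.eigh(z.cov())
        print("PC1 loadings:", dict(zip(df.columns, eigvecs[:, -1].round(2))))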

  10. Myocardial perfusion scintigraphy using a new technique - the mesh chamber. Progress report, May 1, 1982-November 1, 1983

    International Nuclear Information System (INIS)

    Work since the last progress report has concentrated on three major areas: development and construction of a multimodule slice-type detector system and associated electronics at the Massachusetts Institute of Technology (MIT); software development for data acquisition and reconstruction algorithms as implemented on the PDP-11/34 system at Brigham and Women's Hospital (BWH); and testing of a system of 10 cm x 10 cm area cameras at BWH. The software development includes a method which we believe represents a unique approach to image reconstruction. We report on this work in more detail in subsequent sections of this report.

  11. A study of atriphos (ATP) action on muscular circulation in progressive muscular dystrophy by the radioactive xenon clearance technique

    International Nuclear Information System (INIS)

    The effect of intramuscularly and intravenously administered atriphos on the muscular circulation was studied with radioactive xenon in 12 children with progressive muscular dystrophy. After combined local intramuscular injection of ATP (atriphos) with the radioactive marker, a 12-fold increase in muscular circulation ensues, lasting about 15 minutes. No vasodilating effect on the muscular flow was observed after intravenous injection of 20-40 mg of atriphos. It is believed that intramuscular administration of atriphos produced dilatation of the capillaries and of the venous part of the muscular circulation. (author)

  12. Research Progress of Hyphenated Chromatographic Techniques

    Institute of Scientific and Technical Information of China (English)

    杨洁; 索习东

    2012-01-01

    This article summarizes the development of hyphenated chromatographic techniques and their applications and recent advances in various fields of research. It also points out that, with the continuous development of hyphenated chromatographic techniques, new types of chromatographic techniques will have ever greater room for development and broader application prospects in modern research.

  13. Transient analysis techniques in performing impact and crash dynamic studies

    Science.gov (United States)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that grew out of the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  14. PERFORMANCE ANALYSIS OF SOFT COMPUTING TECHNIQUES FOR CLASSIFYING CARDIAC ARRHYTHMIA

    Directory of Open Access Journals (Sweden)

    R GANESH KUMAR

    2014-01-01

    Full Text Available Cardiovascular diseases kill more people than any other disease. Arrhythmia is a common term for a cardiac rhythm deviating from normal sinus rhythm. Many heart diseases are detected through electrocardiogram (ECG) analysis. Manual analysis of ECG is time consuming and error prone. Thus, an automated system for detecting arrhythmia in ECG signals gains importance. Features are extracted from the time-series ECG data with the Discrete Cosine Transform (DCT), computing the distance between R waves; the feature used is the RR interval extracted for each beat. The frequency-domain features are classified using Classification and Regression Tree (CART), Radial Basis Function (RBF), Support Vector Machine (SVM) and Multilayer Perceptron Neural Network (MLP-NN) classifiers. Experiments were conducted on the MIT-BIH arrhythmia database.
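
    To make the processing chain concrete, here is a hedged sketch of DCT features over RR-interval windows fed to an SVM, one of the four classifiers compared in the paper. The RR windows and labels below are random placeholders; in the actual study they would come from the MIT-BIH annotations, and the window length and number of coefficients are assumptions.

        import numpy as np
        from scipy.fft import dct
        from sklearn.svm import SVC
        from sklearn.model_selection import train_test_split

        def dct_features(rr_windows, n_coeff=8):
            """Low-order DCT coefficients of each RR-interval window."""
            return np.array([dct(w, norm="ortho")[:n_coeff] for w in rr_windows])

        rng = np.random.default_rng(0)
        rr_windows = rng.uniform(0.6, 1.0, size=(200, 16))  # placeholder RR data
        labels = rng.integers(0, 2, size=200)               # placeholder classes

        X = dct_features(rr_windows)
        X_tr, X_te, y_tr, y_te = train_test_split(X, labels, random_state=0)
        clf = SVC(kernel="rbf").fit(X_tr, y_tr)
        print("test accuracy:", clf.score(X_te, y_te))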

  15. Biomechanical analysis technique choreographic movements (for example, "grand battman jete"

    Directory of Open Access Journals (Sweden)

    Batieieva N.P.

    2015-04-01

    Full Text Available Purpose: biomechanical analysis of the execution of the choreographic movement "grand battman jete". Material: the study involved students (n = 7) of the department of classical choreography, faculty of choreography. Results: a biomechanical analysis of the choreographic movement "grand battman jete" (a classical exercise) was carried out, and the kinematic characteristics (path, velocity, acceleration, force) of the centre of mass (CM) of the performer's body segments (foot, shin, thigh) were obtained. A phase-structured biokinematic model was built. The energy characteristics (mechanical work and kinetic energy) of the leg segments during the execution of the movement were determined. Conclusions: it was found that the ability of an athlete and coach-choreographer to analyze the biomechanics of a movement has a positive effect on the improvement of choreographic training of qualified athletes in gymnastics (sport, art), figure skating and dance sports.
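
    The kinematic quantities named in the abstract can be recovered from a digitized trajectory by finite differences. The sketch below is a generic illustration, assuming the centre-of-mass positions of a segment have already been digitized from video at a known frame rate; it is not the authors' own processing code.

        import numpy as np

        def kinematics(xy, fps):
            """Velocity, acceleration and path length of a digitized
            centre-of-mass trajectory xy of shape (n_frames, 2)."""
            dt = 1.0 / fps
            vel = np.gradient(xy, dt, axis=0)   # m/s
            acc = np.gradient(vel, dt, axis=0)  # m/s^2
            path = np.sum(np.linalg.norm(np.diff(xy, axis=0), axis=1))
            return vel, acc, path

        # Kinetic energy of a segment of mass m (kg) at each frame:
        # ke = 0.5 * m * np.sum(vel**2, axis=1)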

  16. Advanced Techniques in Pulmonary Function Test Analysis Interpretation and Diagnosis

    OpenAIRE

    Gildea, T.J.; Bell, C. William

    1980-01-01

    The Pulmonary Functions Analysis and Diagnostic System is an advanced clinical processing system developed for use at the Pulmonary Division, Department of Medicine at the University of Nebraska Medical Center. The system generates comparative results and diagnostic impressions for a variety of routine and specialized pulmonary functions test data. Routine evaluation deals with static lung volumes, breathing mechanics, diffusing capacity, and blood gases while specialized tests include lung c...

  17. Instrumental Neutron Activation Analysis Technique using Subsecond Radionuclides

    DEFF Research Database (Denmark)

    Nielsen, H.K.; Schmidt, J.O.

    1987-01-01

    The fast irradiation facility Mach-1 installed at the Danish DR 3 reactor has been used for boron determinations by means of Instrumental Neutron Activation Analysis using 12B with its 20-ms half-life. The performance characteristics of the system are presented, and boron determinations of NBS standard reference materials as well as fertilizer materials are compared with literature values and spectrophotometric measurements, respectively. In both cases good agreement is obtained.

  18. Applications of string mining techniques in text analysis

    OpenAIRE

    Horațiu Mocian

    2012-01-01

    The focus of this project is on the algorithms and data structures used in string mining and their applications in bioinformatics, text mining and information retrieval. More specifically, it studies the use of suffix trees and suffix arrays for biological sequence analysis, and the algorithms used for approximate string matching, both general ones and specialized ones used in bioinformatics, like the BLAST algorithm and the PAM substitution matrix. Also, an attempt is made to apply these structures ...
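
    As a small, self-contained example of the string-mining structures the project studies, the following sketch builds a naive suffix array and uses binary search to find all occurrences of a pattern. Production implementations use linear-time constructions; this is only meant to show the idea.

        def suffix_array(s):
            """Naive suffix array: starting indices of suffixes, sorted."""
            return sorted(range(len(s)), key=lambda i: s[i:])

        def find_occurrences(s, sa, pat):
            """All start positions of pat in s via the suffix array."""
            lo, hi = 0, len(sa)
            while lo < hi:                      # leftmost suffix >= pat
                mid = (lo + hi) // 2
                if s[sa[mid]:] < pat:
                    lo = mid + 1
                else:
                    hi = mid
            out = []
            while lo < len(sa) and s[sa[lo]:sa[lo] + len(pat)] == pat:
                out.append(sa[lo])
                lo += 1
            return sorted(out)

        text = "banana"
        print(find_occurrences(text, suffix_array(text), "ana"))  # [1, 3]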

  19. Methods and techniques for bio-system's materials behaviour analysis

    OpenAIRE

    MITU, LEONARD GABRIEL

    2014-01-01

    In the context of the rapid development of research on biosystem structure materials, a representative direction is the analysis of the behavior of these materials. This direction of research requires various means and methods of theoretical and experimental measurement and evaluation. The PhD thesis "Methods and means for analyzing the behavior of biosystems structure materials" takes precedence in this area of research, aimed at studying the behavior of poly...

  20. A genetic analysis of Adh1 regulation. Progress report, June 1991--February 1992

    Energy Technology Data Exchange (ETDEWEB)

    Freeling, M.

    1992-03-01

    The overall goal of our research proposal is to understand the meaning of the various cis-acting sites responsible for Adh1 expression in the entire maize plant. Progress is reported in the following areas: studies on the TATA box and analysis of revertants of the Adh1-3F1124 allele; screening for more different mutants that affect Adh1 expression differentially; studies on cis-acting sequences required for root-specific Adh1 expression; refinement of the use of the particle gun; and functional analysis of a non-glycolytic anaerobic protein.

  1. Recent Progress in the Application of UV-Vis Spectrophotometric Techniques

    Institute of Scientific and Technical Information of China (English)

    王海军; 宁新霞

    2012-01-01

    An introduction to the recent progress in the application of UV-Vis spectrophotometric techniques is presented, covering mainly the last several years and relating especially to instrument components, the determination of multi-component systems, the application of new technology, and hyphenation with other techniques (33 references cited).

  2. Sensitivity analysis and performance estimation of refractivity from clutter techniques

    Science.gov (United States)

    Yardim, Caglar; Gerstoft, Peter; Hodgkiss, William S.

    2009-02-01

    Refractivity from clutter (RFC) refers to techniques that estimate the atmospheric refractivity profile from radar clutter returns. An RFC algorithm works by finding the environment whose simulated clutter pattern matches the radar-measured one. This paper introduces a procedure to compute RFC estimator performance. It addresses the major factors such as the radar parameters, the sea surface characteristics, and the environment (region, time of day, season) that affect the estimator performance, and formalizes an error metric combining all of these. This is important for applications such as calculating the optimal radar parameters, selecting the best RFC inversion algorithm under a set of conditions, and creating a regional performance map of an RFC system. The performance metric is used to compute the RFC performance of a non-Bayesian evaporation duct estimator. A Bayesian estimator that incorporates meteorological statistics in the inversion is introduced and compared to the non-Bayesian estimator. The performance metric is used to determine the optimal radar parameters of the evaporation duct estimator for six scenarios. An evaporation duct inversion performance map for an S-band radar is created for the larger Mediterranean/Arabian Sea region.

  3. Chromatographic finger print analysis of Naringi crenulata by HPTLC technique

    Institute of Scientific and Technical Information of China (English)

    Subramanian Sampathkumar; Ramakrishnan N

    2011-01-01

    Objective: To establish the fingerprint profile of Naringi crenulata (N. crenulata) (Roxb.) Nicols. using the high performance thin layer chromatography (HPTLC) technique. Methods: Preliminary phytochemical screening was done and HPTLC studies were carried out. A CAMAG HPTLC system equipped with a Linomat V applicator, TLC scanner 3, Reprostar 3 and WIN CATS-4 software was used. Results: The results of the preliminary phytochemical studies confirmed the presence of protein, lipid, carbohydrate, reducing sugar, phenol, tannin, flavonoid, saponin, triterpenoid, alkaloid, anthraquinone and quinone. HPTLC fingerprinting of the ethanolic extract of stem revealed 10 spots with Rf values in the range of 0.08 to 0.65; bark showed 8 peaks with Rf values in the range of 0.07 to 0.63; and the ethanol extract of leaf revealed 8 peaks with Rf values in the range of 0.09 to 0.49, respectively. The purity of the sample was confirmed by comparing the absorption spectra at the start, middle and end positions of the band. Conclusions: It can be concluded that HPTLC fingerprinting of N. crenulata may be useful in differentiating the species from adulterants and act as a biochemical marker for this medicinally important plant in the pharmaceutical industry and plant systematic studies.

  4. Comparative Analysis of Automatic Vehicle Classification Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Kanwal Yousaf

    2012-09-01

    Full Text Available Vehicle classification has emerged as a significant field of study because of its importance in a variety of applications like surveillance, security systems, traffic congestion avoidance and accident prevention. So far numerous algorithms have been implemented for classifying vehicles, each following a different procedure for detecting vehicles in videos. By evaluating some of the commonly used techniques we highlight the most beneficial methodology for classifying vehicles. In this paper we describe the working of several video-based vehicle classification algorithms and compare these algorithms on the basis of different performance metrics such as classifiers, classification methodology or principles, and vehicle detection ratio. After comparing these parameters we conclude that the Hybrid Dynamic Bayesian Network (HDBN) classification algorithm is far better than the other algorithms due to its ability to estimate the simplest features of vehicles from different videos. HDBN detects vehicles by following the important stages of feature extraction, selection and classification. It extracts the rear-view information of vehicles rather than other information such as the distance between the wheels and the height of the wheels.

  5. Analysis of a digital technique for frequency transposition of speech

    Science.gov (United States)

    Digirolamo, V.

    1985-09-01

    Frequency transposition is the process of raising or lowering the frequency content (pitch) of an audio signal. The hearing-impaired community has the greatest interest in applications of frequency transposition. Though several analog and digital frequency-transposing hearing aid systems have been built and tested, this thesis investigates a possible digital processing alternative. Pole shifting, in the z-domain, of an autoregressive (all-pole) model of speech was shown to be a viable approach for changing frequency content. Since linear predictive coding (LPC) techniques are used to code, analyze and synthesize speech, with the resulting LPC coefficients related to the coefficients of an equivalent autoregressive model, a linear relationship between LPC coefficients and frequency transposition is explored. This theoretical relationship is first established using a pure sine wave and then extended to processing speech. The resulting speech synthesis experiments failed to substantiate the conjectures of this thesis. However, future research avenues are suggested that may lead toward a viable approach to transposing speech.
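
    The pole-shifting idea can be sketched in a few lines: factor the AR polynomial, scale the pole angles (which carry the formant frequencies) while keeping the radii (bandwidths), and rebuild the polynomial. This is a minimal illustration of the principle, not the thesis's actual processing chain.

        import numpy as np

        def shift_poles(lpc_coeffs, factor):
            """Scale the pole angles of an all-pole model A(z) = 1 + a1 z^-1 +
            ... + ap z^-p by `factor`, keeping the pole radii unchanged."""
            poles = np.roots(lpc_coeffs)
            shifted = np.abs(poles) * np.exp(1j * np.angle(poles) * factor)
            return np.real(np.poly(shifted))

        # Example: one resonance at 0.2*pi rad/sample, transposed up by 20%.
        p = 0.95 * np.exp(1j * 0.2 * np.pi)
        a = np.real(np.poly([p, np.conj(p)]))
        print(np.angle(np.roots(shift_poles(a, 1.2))))  # angles scaled by 1.2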

  6. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1992-01-01

    This report covers the last quarter of the last year of the three-year grant period. In the final project year, we concentrated on the pyrolysis and oxidative pyrolysis of large hydrocarbons and mixtures of large and small hydrocarbons in order to develop the VUV-MS technique for compounds more representative of those in coal pyrolysis applications. Special focus was directed at the pyrolysis and oxidative pyrolysis of benzene and benzene-acetylene mixtures. The acetylene/benzene mixtures were used to gain a better understanding of the mechanisms of molecular growth in such systems, specifically to look at the kinetics of aryl-aryl reactions as opposed to small-molecule addition to phenyl radicals. Sarofim and coworkers at MIT have recently demonstrated the importance of these reactions in coal processing environments. In the past, the growth mechanism for the formation of midsized PAH has been postulated to involve primarily successive acetylene additions to phenyl-type radicals; our work confirms this as an important mechanism, especially for smaller PAH, but also investigates conditions where biaryl formation can play an important role in higher hydrocarbon formation.

  7. Development of reconstitution technique of irradiated specimen. Progress report for FY1993 on cooperated research between JAERI and IHI

    International Nuclear Information System (INIS)

    Regulatory codes require surveillance tests to evaluate the irradiation embrittlement of reactor pressure vessel steel during operation. However, it is anticipated that the number of surveillance specimens will be insufficient if plant life is extended. Reconstitution techniques by electron beam welding, laser welding and arc stud welding, as well as surface-activated joining (SAJ), have been investigated for the reuse of undeformed parts of tested Charpy impact specimens. The important requirements for a reconstitution technique are to reduce the width of the heat-affected zone in order to maximize the material available, and to lower the maximum temperature of the specimen during the joining process in order to preclude the recovery of radiation damage. SAJ is achieved by removing surface contamination, rotating one side of the specimen in vacuum while applying a modest friction force. The SAJ method is therefore expected to be suitable for specimen reconstitution in view of material heating and melting. This paper describes a preliminary study to develop a Charpy specimen reconstitution technique for reactor pressure vessel steel, A533B-1, by the SAJ method. Test results showed that the SAJ method was capable of keeping the joining-affected zone to less than 1.5 mm in half width, and the over-temperature region (above the reactor operating temperature during joining) to less than 3 mm in half width. It was also found that the transition temperature could be evaluated from reconstituted Charpy specimens. It can be concluded from these results that the SAJ method is an attractive technique for reconstituting irradiated surveillance specimens. (author)

  8. Radiative neutron capture as a counting technique at pulsed spallation neutron sources: a review of current progress

    Science.gov (United States)

    Schooneveld, E. M.; Pietropaolo, A.; Andreani, C.; Perelli Cippo, E.; Rhodes, N. J.; Senesi, R.; Tardocchi, M.; Gorini, G.

    2016-09-01

    Neutron scattering techniques are attracting increasing interest from scientists in various research fields, ranging from physics and chemistry to biology and archaeometry. The success of these neutron scattering applications is stimulated by the development of higher performance instrumentation. The development of new techniques and concepts, including radiative capture based neutron detection, is therefore a key issue to be addressed. Radiative capture based neutron detectors utilize the emission of prompt gamma rays after neutron absorption in a suitable isotope and the detection of those gammas by a photon counter. They can be used as simple counters in the thermal region and (simultaneously) as energy selectors and counters for neutrons in the eV energy region. Several years of extensive development have made eV neutron spectrometers operating in the so-called resonance detector spectrometer (RDS) configuration outperform their conventional counterparts. In fact, the VESUVIO spectrometer, a flagship instrument at ISIS serving a continuous user programme for eV inelastic neutron spectroscopy measurements, has operated in the RDS configuration since 2007. In this review, we discuss the physical mechanism underlying the RDS configuration and the development of associated instrumentation. A few successful neutron scattering experiments that utilize the radiative capture counting techniques are presented, together with the potential of this technique for thermal neutron diffraction measurements. We also outline possible improvements and future perspectives for radiative capture based neutron detectors in neutron scattering applications at pulsed neutron sources.

  9. A Psychophysiological Comparison of the Effects of Three Relaxation Techniques: Respiratory Manipulation Training, Progressive Muscle Relaxation, and Pleasant Imagery.

    Science.gov (United States)

    Longo, David J.

    A within-subjects, three-condition design was employed to examine the effects of three relaxation techniques on blood pressure, pulse rate, and self-report measures of relaxation for 12 college students. Respiratory Manipulation Training incorporated instructions to exhale and not to inhale for as long as possible. When breathing could no longer…

  11. Data mining techniques for performance analysis of onshore wind farms

    International Nuclear Information System (INIS)

    Highlights: • Indicators are formulated for monitoring the quality of wind turbine performance. • State dynamics are processed for the formulation of two Malfunctioning Indexes. • Power curve analysis is revisited. • A novel definition of polar efficiency is formulated and its consistency is checked. • Mechanical effects of wakes are analyzed as nacelle stationarity and misalignment. - Abstract: Wind turbines are energy conversion systems with a low density on the territory, and therefore need accurate condition monitoring in the operative phase. Supervisory Control And Data Acquisition (SCADA) control systems have become ubiquitous in wind energy technology, and they pose the challenge of extracting from their measurements simple and explanatory information on the goodness of operation and performance. In the present work, post-processing methods are applied to the SCADA measurements of two onshore wind farms sited in southern Italy. Innovative and meaningful indicators of goodness of performance are formulated. The analysis proceeds with increasing granularity: first, Malfunctioning Indexes are proposed, which quantify the goodness of the merely operational behavior of the machine, irrespective of the quality of the output. Subsequently the focus is shifted to the analysis of the farms in the productive phase: the dependency of farm efficiency on wind direction is investigated through the polar plot, which is revisited in a novel way in order to make it consistent for onshore wind farms. Finally, the inability of the nacelle to optimally follow meandering wind due to wakes is analysed through a Stationarity Index and a Misalignment Index, which are shown to capture the relation between the mechanical behavior of the turbine and the degradation of the power output.
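
    Of the indicators discussed, the power-curve level can be sketched compactly: bin the SCADA wind/power samples into an empirical power curve and score each sample against its bin-expected power. This is a generic method-of-bins illustration with assumed variable names, not the paper's Malfunctioning Index or polar efficiency definitions.

        import numpy as np

        def binned_power_curve(wind, power, bin_width=0.5):
            """Method-of-bins power curve from SCADA samples."""
            bins = np.arange(0, wind.max() + bin_width, bin_width)
            idx = np.digitize(wind, bins)
            centres, expected = [], []
            for b in range(1, len(bins)):
                sel = idx == b
                if sel.any():
                    centres.append(bins[b - 1] + bin_width / 2)
                    expected.append(power[sel].mean())
            return np.array(centres), np.array(expected)

        def performance_index(wind, power, centres, expected):
            """Mean relative deviation of measured power from the bin-expected
            value; persistently negative values flag under-performance."""
            ref = np.interp(wind, centres, expected)
            ok = ref > 0
            return np.mean((power[ok] - ref[ok]) / ref[ok])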

  12. Technique for particle flow measuring in activation analysis

    International Nuclear Information System (INIS)

    The invention relates to methods of measuring particle flow in nuclear-physical methods of monitoring substance composition. The purpose of the invention is to simplify the process of particle flux measurement and improve the accuracy of analysis. To this end, a ''clean'' foil is placed behind the monitor when irradiating a ''thin'' monitor located in front of the sample, and the induced radioactivity of the radionuclide produced from the monitor's basic element is measured. The value of the particle flow is assessed from the activity of the radionuclide nuclei that are driven from the monitor into the foil by the energy of the nuclear transformation. The monitor thickness should exceed the maximal path of the radionuclide nuclei in the monitor substance. 1 fig

  13. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems... On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described...

  14. Digital radiographic techniques in the analysis of paintings

    International Nuclear Information System (INIS)

    In this chapter the authors use the term digital radiography to mean any method of radiographic image production in which the silver halide-based film is replaced by an electronic sensor for production of an image. There are essentially three types of digital radiographic systems available at present, but others will be developed. These differ primarily in the method of image production and the rapidity with which images can be produced. The three methods discussed are digital fluoroscopy, scanned projection radiography, and the scanned point source radiography. Each has certain characteristics which, if properly utilized, will allow improved x-ray analysis of paintings

  15. Preliminary Analysis of ULPC Light Curves Using Fourier Decomposition Technique

    CERN Document Server

    Ngeow, Chow-Choong; Kanbur, Shashi; Barrett, Brittany; Lin, Bin

    2013-01-01

    Recent work on Ultra Long Period Cepheids (ULPCs) has suggested their usefulness as a distance indicator, but has not commented on their relationship to other types of variable stars. In this work, we use Fourier analysis to quantify the structure of ULPC light curves and compare them to Classical Cepheids and Mira variables. Our preliminary results suggest that the low-order Fourier parameters of ULPCs follow the continuous trend defined by Classical Cepheids beyond the resonance around 10 days. However, their Fourier parameters also overlap with those of Miras, which makes the classification of long period variable stars difficult on the basis of light-curve information alone.
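
    For reference, low-order Fourier parameters of the kind compared in this work can be obtained by a linear least-squares fit of a truncated Fourier series to the phased light curve. The sketch below follows the usual amplitude-ratio/phase-difference convention; it is a generic illustration, not the authors' pipeline.

        import numpy as np

        def fourier_decompose(phase, mag, order=4):
            """Fit m(phi) = A0 + sum_k [a_k cos(2 pi k phi) + b_k sin(2 pi k phi)]
            and return amplitude ratios R_k1 and phase differences phi_k1."""
            cols = [np.ones_like(phase)]
            for k in range(1, order + 1):
                cols += [np.cos(2 * np.pi * k * phase),
                         np.sin(2 * np.pi * k * phase)]
            coef, *_ = np.linalg.lstsq(np.column_stack(cols), mag, rcond=None)
            a, b = coef[1::2], coef[2::2]
            amp, phi = np.hypot(a, b), np.arctan2(-b, a)
            R = amp[1:] / amp[0]                                  # R21, R31, ...
            dphi = (phi[1:] - np.arange(2, order + 1) * phi[0]) % (2 * np.pi)
            return R, dphi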

  16. HPLC-MS technique for radiopharmaceuticals analysis and quality control

    Science.gov (United States)

    Macášek, F.; Búriová, E.; Brúder, P.; Vera-Ruiz, H.

    2003-01-01

    Potentialities of liquid chromatography with a mass spectrometric detector (MSD) were investigated with the objective of quality control of radiopharmaceuticals, 2-deoxy-2-[18F]fluoro-D-glucose (FDG) being an example. A screening of suitable MSD analytical lines is presented. Mass-spectrometric monitoring of the acetonitrile-aqueous ammonium formate eluant via negatively charged FDG.HCO2- ions enables isotope analysis (specific activity) of the radiopharmaceutical at m/z 227 and 226. Kryptofix® 222 provides an intense MSD signal of the positive ion associated with NH4+ at m/z 394. Expired FDG injection samples contain decomposition products, of which at least one is labelled with 18F and characterised by a signal of negative ions at m/z 207; it does not correspond to FDG fragments but to C5 decomposition products. A glucose chromatographic peak, characterised by the m/z 225 negative ion, is accompanied by a tail of a component giving a signal at m/z 227, which can belong to [18O]glucose; isobaric sorbitol signals were excluded, but FDG-glucose association occurs in the co-elution of model mixtures during separation. The latter can actually lead to a convoluted chromatographic peak, but the absence of 18F makes this inconsistent. Quantification and validation of the FDG component analysis is under way.

  17. Analysis of compressive fracture in rock using statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.

  18. Study on data analysis techniques in gravitational wave detectors

    International Nuclear Information System (INIS)

    This work initially investigates the possibility of using an innovative time-frequency transform, known as the S transform, for the data analysis of the gravitational wave detector ALLEGRO. It is verified that its utility for this kind of detector is limited due to the detector's narrow bandwidth. However, it is argued that the S transform may be useful for interferometric detectors. Then a robust data analysis method is presented based on a hypothesis test known as the Neyman-Pearson criterion, which allows the determination of candidate burst events. The method consists in the construction of probability distribution functions for the weighted average energy of the data blocks registered by the detector, both in the case of noise alone and in the case of signal mixed with noise. Based on these distributions it is possible to determine the probability that the data block in which a candidate event is present does not coincide with a noise block. This way of searching for candidate signals immersed in noise agrees with another method in the literature. One concludes that this is a promising method, since it does not demand a more refined search for candidate events, thus reducing computational processing time. (author)
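
    The Neyman-Pearson step can be illustrated directly: fix a false-alarm probability, take the corresponding quantile of the noise-only block-energy distribution as the threshold, and read off the detection probability on the signal-plus-noise distribution. The distributions below are placeholders, not ALLEGRO data.

        import numpy as np

        def np_threshold(noise_energy, p_fa=0.01):
            """Energy threshold giving false-alarm probability p_fa under the
            empirical noise-only distribution."""
            return np.quantile(noise_energy, 1.0 - p_fa)

        rng = np.random.default_rng(1)
        noise = rng.chisquare(df=4, size=100_000)           # noise-only blocks
        signal = rng.chisquare(df=4, size=100_000) + 6.0    # signal + noise

        thr = np_threshold(noise, p_fa=0.01)
        print("threshold:", thr)
        print("detection probability:", np.mean(signal > thr))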

  19. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.;

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant analysis (DA) are applied to the evaluation matrix data set obtained by simulation of several control strategies applied to the plant-wide IWA Benchmark Simulation Model No 2 (BSM2). These techniques allow i) to determine natural groups or clusters of control strategies with a similar behaviour, ii) to find and interpret hidden, complex and causal relation features in the data set and iii) to identify important discriminant variables within the groups found by the cluster analysis. This study illustrates the usefulness of multivariable statistical techniques for both analysis and interpretation...
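
    A compact sketch of two of the steps (PCA followed by clustering of the control-strategy evaluation matrix) is given below with scikit-learn. The matrix entries are placeholders; the real matrix comes from the BSM2 simulation runs, and discriminant analysis would follow on the groups found.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        # Rows: control strategies; columns: evaluation criteria (placeholders).
        X = np.array([[1.2, 0.8, 3.1], [1.1, 0.9, 3.0],
                      [2.5, 1.9, 1.2], [2.6, 2.0, 1.1],
                      [1.8, 1.4, 2.2]])

        Z = StandardScaler().fit_transform(X)
        scores = PCA(n_components=2).fit_transform(Z)                  # PCA step
        groups = KMeans(n_clusters=2, n_init=10).fit_predict(scores)  # CA step
        print(groups)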

  20. Structural analysis of irradiated crotoxin by spectroscopic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina C. de; Fucase, Tamara M.; Silva, Ed Carlos S. e; Chagas, Bruno B.; Buchi, Alisson T.; Viala, Vincent L.; Spencer, Patrick J.; Nascimento, Nanci do, E-mail: kcorleto@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Biotecnologia

    2013-07-01

    Snake bites are a serious public health problem, especially in subtropical countries. In Brazil, the serum, the only effective treatment in case of snake bites, is produced in horses which, despite their large size, have a reduced lifespan due to the high toxicity of the antigen. Ionizing radiation has been successfully employed to attenuate the biological activity of animal toxins. Crotoxin, the main toxic compound from Crotalus durissus terrificus (Cdt), is a heterodimeric protein composed of two subunits: crotapotin and phospholipase A{sub 2}. Previous data indicated that this protein undergoes unfolding and/or aggregation following irradiation, resulting in a much less toxic antigen. The exact mechanisms and structural modifications involved in the aggregation process are not yet clear. This work investigates the effects of ionizing radiation on crotoxin employing Infrared Spectroscopy, Circular Dichroism and Dynamic Light Scattering techniques. The infrared spectrum of lyophilized crotoxin showed peaks corresponding to the vibrational spectra of the secondary structure of crotoxin, including β-sheet, random coil, α-helix and β-turns. We calculated the area of these spectral regions after baseline adjustment and normalization using the amide I band (1590-1700 cm{sup -1}), obtaining the variation of the secondary structures of the toxin following irradiation. The Circular Dichroism spectra of native and irradiated crotoxin suggest a conformational change within the molecule after the irradiation process, apparently from an ordered conformation towards a random coil. The light scattering analyses indicated that the irradiated crotoxin formed multimers with an average molecular radius 100-fold larger than that of the native toxin. (author)

  1. Impact of HIV type 1 DNA levels on spontaneous disease progression: a meta-analysis.

    Science.gov (United States)

    Tsiara, Chrissa G; Nikolopoulos, Georgios K; Bagos, Pantelis G; Goujard, Cecile; Katzenstein, Terese L; Minga, Albert K; Rouzioux, Christine; Hatzakis, Angelos

    2012-04-01

    Several studies have reported on the prognostic strength of HIV-1 DNA, with variable results however. The aims of the current study were to estimate more accurately the ability of HIV-1 DNA to predict progression of HIV-1 disease toward acquired immunodeficiency syndrome (AIDS) or death, and to compare the prognostic information obtained from HIV-1 DNA with that derived from plasma HIV-1 RNA. Eligible articles were identified through a comprehensive search of Medline, ISI Web of Science, Scopus, and Google Scholar. The analysis included univariate and bivariate random-effects models. The univariate meta-analysis of six studies involving 1074 participants showed that HIV-1 DNA was a strong predictive marker of AIDS [relative risk (RR): 3.01, 95% confidence interval (CI): 1.88-4.82] and of all-cause mortality (RR: 3.49, 95% CI: 2.06-5.89). The bivariate model using the crude estimates of the primary studies indicated that HIV-1 DNA was a significantly better predictor than HIV-1 RNA of either AIDS alone (ratio of RRs = 1.47, 95% CI: 1.05-2.07) or of combined (AIDS or death) progression outcomes (ratio of RRs = 1.51, 95% CI: 1.11-2.05). HIV-1 DNA is a strong predictor of HIV-1 disease progression. Moreover, there is some evidence that HIV-1 DNA might have better predictive value than plasma HIV-1 RNA.
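
    For readers unfamiliar with the pooling step, here is a hedged sketch of an inverse-variance random-effects meta-analysis (DerSimonian-Laird) on study-level relative risks with 95% CIs. The three input studies are hypothetical, and the paper's exact model may differ in detail.

        import numpy as np

        def dersimonian_laird(rr, ci_low, ci_high):
            """Pooled RR (with 95% CI) by DerSimonian-Laird random effects."""
            y = np.log(rr)
            se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
            w = 1 / se**2
            yb = np.sum(w * y) / np.sum(w)
            q = np.sum(w * (y - yb) ** 2)
            tau2 = max(0.0, (q - (len(y) - 1))
                       / (np.sum(w) - np.sum(w**2) / np.sum(w)))
            ws = 1 / (se**2 + tau2)
            mu = np.sum(ws * y) / np.sum(ws)
            se_mu = np.sqrt(1 / np.sum(ws))
            return tuple(np.exp([mu, mu - 1.96 * se_mu, mu + 1.96 * se_mu]))

        # Three hypothetical studies:
        print(dersimonian_laird(np.array([3.0, 2.5, 3.8]),
                                np.array([1.8, 1.4, 2.1]),
                                np.array([5.0, 4.5, 6.9])))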

  2. New spectroscopic techniques for wine analysis and characterization

    International Nuclear Information System (INIS)

    The objective of the presented thesis was the development of new, rapid tools for wine analysis based on Fourier transform infrared (FTIR) and ultraviolet/visible (UV/Vis) spectroscopy. The results of this thesis are presented in the form of five publications. In publication I a sensor for assessing the main sensory property of red wine polyphenols (tannins), namely astringency, was developed on the basis of the underlying chemical reaction between the tannins and the proline-rich proteins in the saliva. The interaction of polyphenols (tannins) with proline-rich proteins (gelatin) was studied using an automated flow injection system with FTIR detection. In publication II FTIR spectroscopy of polyphenolic wine extracts combined with multivariate data analysis was applied to the varietal discrimination of Austrian red wines. By hierarchical clustering it could be shown that the mid-infrared spectra of the dry extracts contain information on the varietal origin of wines. The classification of the wines was successfully performed by soft independent modeling of class analogies (SIMCA). Publication III describes the determination of carbohydrates, alcohols and organic acids in red wine by ion-exchange high performance liquid chromatography hyphenated with FTIR detection, where a diamond attenuated total reflectance (ATR) element was employed for the design of a rugged detector. Partly or completely co-eluting peaks were chemometrically resolved by multivariate curve resolution - alternating least squares (MCR-ALS). Publication IV reports the first application of a mid-infrared quantum cascade laser (QCL) for molecule-specific laser detection in liquid chromatography. Using a laser wavelength of 9.3721 μm, glucose and fructose could be specifically detected and quantified in red wine in spite of the presence of organic acids. Publication V presents the development of an automated method for measuring the primary amino acid concentration in wines and musts by

  3. Multidimensional Analysis of Quenching: Comparison of Inverse Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, K.J.

    1998-11-18

    Understanding the surface heat transfer during quenching can be beneficial. Analysis to estimate the surface heat transfer from internal temperature measurements is referred to as the inverse heat conduction problem (IHCP). Function specification and gradient adjoint methods, which use a gradient search method coupled with an adjoint operator, are widely used methods to solve the IHCP. In this paper the two methods are presented for the multidimensional case. The focus is not a rigorous comparison of numerical results. Instead, after formulating the multidimensional solutions, issues associated with the numerical implementation and practical application of the methods are discussed. In addition, an experiment that measured the surface heat flux and temperatures for a transient experimental case is analyzed. Transient temperatures are used to estimate the surface heat flux, which is compared to the measured values. The estimated surface fluxes are comparable for the two methods.

  4. Analysis of myocardial infarction signals using optical technique.

    Science.gov (United States)

    Mahri, Nurhafizah; Gan, Kok Beng; Mohd Ali, Mohd Alauddin; Jaafar, Mohd Hasni; Meswari, Rusna

    2016-01-01

    The risk of heart attack, or myocardial infarction (MI), may lead to serious consequences in mortality and morbidity. Current MI management in the triage includes non-invasive heart monitoring using an electrocardiogram (ECG) and the cardiac biomarker test. This study is designed to explore the potential of photoplethysmography (PPG) as a simple non-invasive device, as an alternative method to screen MI subjects. The study emphasises the use of second-derivative photoplethysmography (SDPPG) intervals as the extracted features to classify MI subjects. The statistical analysis shows the potential of the "a-c" interval and the corrected "a-cC" interval to classify the subjects. The sensitivity of the predicted model using "a-c" and "a-cC" is 90.6% and 81.2%, and the specificity is 87.5% and 84.4%, respectively. PMID:27010162

  5. A Generalized Lanczos-QR Technique for Structural Analysis

    DEFF Research Database (Denmark)

    Vissing, S.

    Within the field of solid mechanics, such as structural dynamics and linearized as well as non-linear stability, the eigenvalue problem plays an important role. In the class of finite element and finite difference discretized problems, these engineering problems are characterized by large matrix systems with very special properties. Due to the finite discretization the matrices are sparse, and a relatively large number of problems also have real and symmetric matrices. The matrix equation for an undamped vibration contains two matrices describing tangent stiffness and mass distributions. Alternatively, in a stability analysis, tangent stiffness and geometric stiffness matrices are introduced into an eigenvalue problem used to determine possible bifurcation points. The common basis for these types of problems is that the matrix equation describing the problem contains two real, symmetric matrices.
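
    The problem class described here is the generalized eigenvalue problem K x = lambda M x with sparse, real, symmetric K and M. The sketch below solves a toy pencil with SciPy's Lanczos-based eigsh in shift-invert mode; the matrices are placeholders for real finite element stiffness and mass matrices.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import eigsh

        n = 200
        K = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        M = diags([1 / 6, 4 / 6, 1 / 6], [-1, 0, 1], shape=(n, n), format="csc")

        # Lowest eigenpairs of the pencil (K, M); shift-invert about sigma=0
        # targets the smallest eigenvalues, as a Lanczos-type solver would.
        vals, vecs = eigsh(K, k=4, M=M, sigma=0.0, which="LM")
        print(np.sqrt(vals))  # natural angular frequencies (up to scaling)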

  6. Denial of Service Attack Techniques: Analysis, Implementation and Comparison

    Directory of Open Access Journals (Sweden)

    Khaled Elleithy

    2005-02-01

    Full Text Available A denial of service attack (DOS) is any type of attack on a networking structure intended to disable a server from servicing its clients. Attacks range from sending millions of requests to a server in an attempt to slow it down, to flooding a server with large packets of invalid data, to sending requests with an invalid or spoofed IP address. In this paper we show the implementation and analysis of three main types of attack: Ping of Death, TCP SYN Flood, and Distributed DOS. The Ping of Death attack will be simulated against a Microsoft Windows 95 computer. The TCP SYN Flood attack will be simulated against a Microsoft Windows 2000 IIS FTP Server. Distributed DOS will be demonstrated by simulating a distributed zombie program that carries out the Ping of Death attack. This paper will demonstrate the potential damage from DOS attacks and analyze the ramifications of the damage.

  8. Compartmental analysis, imaging techniques and population pharmacokinetic. Experiences at CENTIS

    International Nuclear Information System (INIS)

    Introduction: In pharmacokinetic evaluation, small rodents are used to a large extent. Traditional pharmacokinetic evaluation by the two-step approach can be replaced by the sparse-data design, which may however represent a complicated situation to evaluate satisfactorily from the statistical point of view. In this presentation, different situations of sparse data sampling are analyzed based on practical considerations. A nonlinear mixed-effects model was selected in order to estimate pharmacokinetic parameters in data simulated from real experimental results, using blood sampling and imaging procedures. Materials and methods: Different scenarios representing several experimental designs with incomplete individual profiles were evaluated. Data sets were simulated based on real data from previous experiments. In all cases three to five blood samples were considered per time point. A combination of compartmental analysis with tumor uptake obtained by gammagraphy of radiolabeled drugs is also evaluated. All pharmacokinetic profiles were analyzed by means of the MONOLIX software, version 4.2.3. Results: All sampling schedules yield the same results when computed using the MONOLIX software and the SAEM algorithm. Population and individual pharmacokinetic parameters were accurately estimated with three or five determinations per sampling point. Given the methodology and software tool used, this can be an expected result, but demonstrating the method's performance in such situations allows us to select a more flexible design using a very small number of animals in preclinical research. The combination with imaging procedures also allows us to construct a completely structured compartmental analysis. Results of real experiments are presented, demonstrating the versatility of the methodology in different evaluations. The same sampling approach can be considered in phase I or II clinical trials. (author)

  9. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems; with risk analysis methods and from 8 years, as an engineer-on-shift at a research reactor, the author developed a method to elicit necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately. Included, as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  10. Progressive failure analysis of slope with strain-softening behaviour based on strength reduction method

    Institute of Scientific and Technical Information of China (English)

    Ke ZHANG; Ping CAO; Rui BAO

    2013-01-01

    Based on the strength reduction method and a strain-softening model, a method for progressive failure analysis of strain-softening slopes was presented in this paper. The abrupt change is more pronounced in strain-softening analysis, and the abrupt change of the displacement at the slope crest was taken as the critical failure criterion. An engineering example was provided to demonstrate the validity of the present method, which was applied to a cut slope at an industrial site. The results are as follows: (1) The factor of safety and the critical slip surface obtained by the present method lie between those obtained with peak and residual strength. Analysis with peak strength would lead to non-conservative results, but analysis with residual strength tends to be overly conservative. (2) The thickness of the shear zone considering strain-softening behaviour is narrower than that from non-softening analysis. (3) The failure of the slope is a process of initiation, propagation and connection of the potential failure surface. The strength parameters are mobilized to a non-uniform degree while progressive failure occurs in the slope. (4) The factor of safety increases with increasing residual shear strain threshold and elastic modulus. The failure mode of the slope changes from shallow slip to deep slip. Poisson's ratio and the dilation angle have little effect on the results.

  11. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system which is under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few techniques which are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, the combinational method, and the simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The analysis results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.
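
    As a minimal illustration of the first technique on the list, the snippet below evaluates the top-event probability of a small fault tree with independent basic events. The tree and the failure probabilities are hypothetical, invented only to show the AND/OR gate arithmetic.

        from functools import reduce

        def p_and(probs):
            """AND gate: all independent basic events occur."""
            return reduce(lambda a, b: a * b, probs, 1.0)

        def p_or(probs):
            """OR gate: at least one independent basic event occurs."""
            return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

        # Hypothetical tree: inspection fails if (drive fails OR camera fails)
        # AND the backup retrieval also fails.
        p_drive, p_camera, p_retrieval = 1e-3, 5e-4, 1e-2
        p_top = p_and([p_or([p_drive, p_camera]), p_retrieval])
        print(f"top event probability: {p_top:.2e}")  # ~1.5e-05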

  13. Examination, characterisation and analysis techniques for the knowledge and the conservation / restoration of cultural heritage - importance of ionising radiation techniques

    International Nuclear Information System (INIS)

    For the examination, characterisation and analysis of cultural heritage artefacts or art objects and their component materials, the conservation scientist needs a palette of non-destructive and non-invasive techniques, in order to improve our knowledge concerning their elaboration, their evolution and/or degradation over time, and to give a rational basis for their restoration and conservation. A general survey and illustrations showing the usefulness of these techniques will be presented. Among these methods, many are based on the use of ionising radiation: 1. radiography (using X-rays, gamma rays, beta particles, secondary electrons, neutrons), electron emission radiography, tomodensimetry; 2. scanning electron microscopy associated with X-ray spectrometry; 3. X-ray diffraction; 4. synchrotron radiation characterisation; 5. X-ray fluorescence analysis; 6. activation analysis; 7. ion beam analysis (PIXE, PIGE, RBS, secondary X-ray fluorescence); 8. thermoluminescence dating; 9. carbon-14 dating. These methods are used alone or in connection with other analytical methods. Any kind of material can be encountered, for instance: i. stones, gems, ceramics, terracotta, enamels, glasses; ii. wood, paper, textile, bone, ivory; iii. metals, jewellery; iv. paint layers, canvas and wooden backings, pigments, dyes, oils, binding media, varnishes, glues. Some examples will be taken, among recent work done at the Centre of Research and Restoration of the Museums of France (C2RMF), from various geographical origins, various ages and different art disciplines. This will illustrate the kind of assistance that science and technology can provide toward a better knowledge of mankind's cultural heritage and also toward the establishment of a rational basis for its better conservation for future generations. (Author)

  14. New techniques for analysis of organic pollutants in drinking water

    Energy Technology Data Exchange (ETDEWEB)

    Kissinger, L.D.

    1979-01-01

    An abstractor packing prepared by coating Chromosorb G AW/DMCS with copper(II) chloride was effective for the removal of amines from gas-chromatographic streams, but it did not affect the chromatographic behavior of non-amine compounds. By using pre-columns packed with the abstractor packing, solventless chromatograms were obtained for samples in pyridine. A method was developed for determining haloforms in drinking water by sorption of the haloforms on columns packed with acetylated XAD-2. A pre-column of the abstractor packing was used to remove the pyridine solvent from the samples containing the haloforms concentrated from the waters. Detection limits for the four chloro- and bromo-haloforms in a 100-ml water sample using an electron capture detector were below 1 ppb. Addition of ascorbic acid to chlorinated waters was effective for stopping the production of haloforms. The design of the inlet allowed samples to be introduced to the capillary column in a Tracor model 550 gas chromatograph with or without splitting of the carrier-gas stream. An exit splitter was implemented that carried the effluent from the capillary column to two detectors. The capillary-column system was applied to the analysis of trace components in complex mixtures. Small columns packed with Florisil were used to fractionate mixtures of organic compounds by gravity-flow liquid chromatography. Three fractions of organic compounds were collected from the Florisil columns. The recovery and elution behavior of many organic compounds was investigated. Organic compounds from fifteen waters were fractionated on Florisil.

  15. Development of dispersive mode analysis technique for bent pipes

    International Nuclear Information System (INIS)

    In this study, the detection of flaws in bent feeder pipes using ultrasonic guided waves was investigated by means of 3-D FEM and 2-D FFT. Specifically: 1) the transient responses of the bent pipe were calculated using a general-purpose finite element program; 2) the displacements were extracted at a series of sequential points as a function of spatial position and time, u(x,t); 3) a 2-D FFT of u(x,t) was then performed to determine U(k,w). From this relationship between the wavenumber (k) and angular frequency (w), the phase velocity and group velocity were calculated. The accuracy of this method was verified by comparing the predicted modes for a straight pipe with the theoretical solution. Further verification was made by mode identification using the wavelet transform; the modes identified by the two methods agree very well. In addition, ultrasonic guided wave inspection of bent pipes was investigated. In the case of longitudinal mode analysis for a bent pipe without a crack, the generated modes were similar to those of the straight pipe. However, for a bent pipe with a crack, the signals received at the end of the pipe changed due to the presence of the flaw. Thus, it could be determined whether a flaw exists in the feeder pipe by using torsional-mode guided waves.
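
    As a rough illustration of steps 2) and 3), the sketch below (Python with NumPy, not the tooling used in the study) extracts dispersion curves from a sampled displacement field u(x,t); the crude ridge picking and the uniform-grid assumption are simplifications, and all names are illustrative.

```python
import numpy as np

def dispersion_curves(u, dx, dt):
    """Estimate phase and group velocities from displacements u(x, t).

    u  : 2-D array, rows = equally spaced measurement points, cols = time samples
    dx : spacing between measurement points [m]
    dt : sampling interval [s]
    """
    nx, nt = u.shape
    U = np.fft.fft2(u)                         # u(x, t) -> U(k, w)
    k = 2 * np.pi * np.fft.fftfreq(nx, d=dx)   # wavenumber axis [rad/m]
    w = 2 * np.pi * np.fft.fftfreq(nt, d=dt)   # angular frequency axis [rad/s]

    # Crude ridge extraction: for each positive frequency, keep the
    # wavenumber carrying the most energy (real data would need the
    # propagating modes separated first).
    half = nt // 2
    ridge_k = np.abs(k[np.argmax(np.abs(U[:, :half]), axis=0)])
    keep = ridge_k > 0
    kk, first = np.unique(ridge_k[keep], return_index=True)   # sorted, unique k
    ww = w[:half][keep][first]

    phase_velocity = ww / kk                   # v_p = w / k
    group_velocity = np.gradient(ww, kk)       # v_g = dw / dk
    return ww, phase_velocity, group_velocity
```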

  16. Cinematographical analysis of javelin throwing techniques of decathletes.

    Science.gov (United States)

    Kunz, H.; Kaufmann, D. A.

    1983-01-01

    The purpose of this study was to analyse by correlational methods the biomechanical factors involved in achieving the maximal distance thrown in the javelin event. Twelve Swiss decathletes and two world class javelin specialists were filmed by a high speed (102 fps) 16 mm camera throwing a total of 20 trials. The co-ordinates of the resulting cyclograms were processed by a computer programme and the results submitted to correlational analysis. The highest correlation was 0.76 between velocity at release and distance thrown. Negative correlations were found between distance thrown and the angle of the javelin with the horizontal (-0.52), and between distance thrown and the throwing-hand to contralateral-foot distance during the last strides (-0.67). Javelin specialists, who had longer throws than the decathletes (mean = 79.03 m versus 54.29 m), had a smaller difference between the angle of attack and the angle of release. The results suggest that in order to attain maximal distance the javelin thrower should achieve positive acceleration during the running approach, thrust effectively with the right leg on the penultimate stride, and carry the javelin during the last strides at the optimal angle of release (32 to 36 degrees). PMID:6652405

  17. Elastic/Plastic Drop Analysis Using Finite Element Techniques

    International Nuclear Information System (INIS)

    A Spent Nuclear Fuel (SNF) can, called the High Integrity Can (HIC), is being designed at the Idaho National Engineering and Environmental Laboratory (INEEL). Its intended use is to contain SNF sent to the Idaho Nuclear Technology and Engineering Center (INTEC). INTEC will then do the final work with the HIC to send it to the repository at Yucca Mountain, Nevada, for long-term storage. One portion of the analysis required for the HIC covered accidental drop scenarios. This consisted of 19 simulated drops from a height of 30 feet with impact on a flat rigid surface. Elastic/plastic analyses were performed for the simulated drops. Additionally, two elastic/plastic analyses were performed for drops from a height of 17 feet with impact on a rigid surface having a narrow raised portion across its center. The purpose of the analyses was to determine whether any breach occurred that opened a crack wider than 0.05 inches in these drop scenarios. Plastic deformations were also needed from certain drop scenarios to support the Criticality Safety documentation. The analytical results for the simulated drop scenarios showed that, though the seal in the lid may be broken, no 0.05-inch breach occurred. The deformations for the Criticality Safety documentation were also calculated and are shown on the applicable output plots.

  18. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    Science.gov (United States)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft. Described here is that difficulty: the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precise outputs are quite sensitive to errors of approximation in their input data. The use of emulation techniques for pseudo-testing systems, to evaluate bounds on the parameter values needed for the analytical techniques, is then discussed. Finally, several examples of the application of emulation techniques are described.

  19. Theoretical analysis of highly linear tunable filters using Switched-Resistor techniques

    NARCIS (Netherlands)

    Jiraseree-amornkun, Amorn; Worapishet, Apisak; Klumperink, Eric A.M.; Nauta, Bram; Surakampontorn, Wanlop

    2008-01-01

    In this paper, an in-depth analysis of switched-resistor (S-R) techniques for implementing low-voltage low-distortion tunable active-RC filters is presented. The S-R techniques make use of switch(es) with duty-cycle-controlled clock(s) to achieve tunability of the effective resistance and, hence, th

  20. Application of optimal estimation techniques to FFTF decay heat removal analysis

    International Nuclear Information System (INIS)

    The verification and adjustment of plant models for decay heat removal analysis using a mix of engineering judgment and formal techniques from control theory are discussed. The formal techniques facilitate dealing with typical test data which are noisy, redundant and do not measure all of the plant model state variables directly. Two pretest examples are presented. 5 refs
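
    The abstract does not name the specific estimator; the classic formal tool from control theory for reconciling a plant model with noisy, redundant measurements is the Kalman filter, so a minimal linear Kalman step is sketched below (Python/NumPy; all matrices are illustrative placeholders, not FFTF data).

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.

    x, P : prior state estimate and its covariance
    z    : measurement vector (may be redundant: more rows than states)
    F, H : state-transition and measurement matrices
    Q, R : process- and measurement-noise covariances
    """
    # Predict the state forward with the plant model
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update: optimally weight the model prediction against noisy data
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```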

  1. Cross Validation and Discriminant Analysis Techniques in a College Student Attrition Application.

    Science.gov (United States)

    Smith, Alan D.

    1982-01-01

    Used a current attrition study to show the usefulness of discriminant analysis and a cross-validation technique applied to student nonpersister questionnaire respondents and nonrespondents. Results of the techniques allowed delineation of several areas of sample under-representation and established the instability of the regression weights…
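
    As a generic illustration of pairing discriminant analysis with cross-validation to check the stability of fitted weights, the sketch below uses scikit-learn on synthetic stand-in data (the original study's questionnaire variables are not reproduced here).

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Hypothetical questionnaire scores: persisters (1) vs nonpersisters (0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 5)),
               rng.normal(0.8, 1.0, (100, 5))])
y = np.array([0] * 100 + [1] * 100)

lda = LinearDiscriminantAnalysis()
# Cross-validation guards against weights that fit only one sample split
scores = cross_val_score(lda, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```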

  2. Simultaneous multielement analysis of rock samples by inductively coupled plasma mass spectrometry using discrete microsampling technique

    International Nuclear Information System (INIS)

    Simultaneous multielement analysis of geological standard rock samples (JG-1 and JB-2) has been successfully performed by inductively coupled plasma mass spectrometry using a discrete microsampling technique. In this technique, only 100 μl of sample solution was used for the simultaneous determination of 5-10 elements. (author)

  3. Analysis of Parametric & Non Parametric Classifiers for Classification Technique using WEKA

    Directory of Open Access Journals (Sweden)

    Yugal kumar

    2012-07-01

    Full Text Available In the field of machine learning and data mining, a great deal of work has been done to construct new classification techniques/classifiers, and much research is ongoing to construct further new classifiers with the help of nature-inspired techniques such as Genetic Algorithms, Ant Colony Optimization, Bee Colony Optimization, Neural Networks, Particle Swarm Optimization, etc. Many researchers have provided comparative studies/analyses of classification techniques. This paper, however, deals with another form of analysis of classification techniques, i.e., parametric and non-parametric classifier analysis. The paper identifies parametric and non-parametric classifiers that are used in the classification process and provides a tree representation of these classifiers. For the analysis, four classifiers are used, of which two are parametric and the rest non-parametric in nature.
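
    The study itself works in WEKA; purely to illustrate the parametric versus non-parametric distinction, the sketch below runs scikit-learn stand-ins on a stock data set (the library, data, and classifier choices are substitutions, not the paper's setup).

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB            # parametric: Gaussian class densities
from sklearn.linear_model import LogisticRegression   # parametric: fixed functional form
from sklearn.tree import DecisionTreeClassifier       # non-parametric
from sklearn.neighbors import KNeighborsClassifier    # non-parametric

X, y = load_iris(return_X_y=True)
classifiers = {
    "naive Bayes (parametric)": GaussianNB(),
    "logistic regression (parametric)": LogisticRegression(max_iter=1000),
    "decision tree (non-parametric)": DecisionTreeClassifier(random_state=0),
    "k-NN (non-parametric)": KNeighborsClassifier(),
}
for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=10).mean()   # 10-fold accuracy
    print(f"{name:35s} {acc:.3f}")
```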

  4. A comparative study on change vector analysis based change detection techniques

    Indian Academy of Sciences (India)

    Sartajvir Singh; Rajneesh Talwar

    2014-12-01

    Detection of Earth surface changes is essential for monitoring regional climate, snow avalanche hazards and energy balance variations that arise from air temperature irregularities. Geographic Information Systems (GIS) enable such research activities to be carried out through change detection analysis. From this viewpoint, different change detection algorithms have been developed for land-use land-cover (LULC) regions. Among the different change detection algorithms, change vector analysis (CVA) has a well-balanced capability of extracting maximum information in terms of the overall magnitude of change and the direction of change between multispectral bands from multi-temporal satellite data sets. Over the past two to three decades, many effective CVA-based change detection techniques, e.g., improved change vector analysis (ICVA), modified change vector analysis (MCVA) and change vector analysis in posterior-probability space (CVAPS), have been developed to overcome difficulties that exist in traditional change vector analysis (CVA). Moreover, many integrated techniques have been proposed, such as cross-correlogram spectral matching (CCSM) based CVA, CVA using enhanced principal component analysis (PCA) and the inverse triangular (IT) function, hyper-spherical direction cosine (HSDC), and median CVA (m-CVA), as effective LULC change detection tools. This paper comprises a comparative analysis of CVA-based change detection techniques, namely CVA, MCVA, ICVA and CVAPS. It also summarizes the relevant integrated CVA techniques along with their characteristics, features and shortcomings. Based on experimental outcomes, the CVAPS technique was evaluated as having greater potential than the other CVA techniques to evaluate the overall transformed information over three different MODerate resolution Imaging Spectroradiometer (MODIS) satellite data sets of different regions. Results of this study are expected to be potentially useful for more accurate analysis of LULC changes which will, in turn
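
    A minimal sketch of the core CVA computation (change magnitude and direction from two co-registered multispectral images) is given below; the thresholding rule and the posterior-probability refinements of ICVA/MCVA/CVAPS are beyond this fragment, and all names are illustrative.

```python
import numpy as np

def change_vector_analysis(img_t1, img_t2, threshold):
    """Basic CVA on two co-registered multispectral images.

    img_t1, img_t2 : arrays of shape (rows, cols, bands)
    Returns per-pixel change magnitude, change direction (angle in the
    plane of the first two bands), and a boolean change mask.
    """
    delta = img_t2.astype(float) - img_t1.astype(float)
    # Overall magnitude of change across all bands
    magnitude = np.sqrt((delta ** 2).sum(axis=-1))
    # Direction of change, here reduced to the first two bands
    direction = np.arctan2(delta[..., 1], delta[..., 0])
    # Threshold selection is the hard part in practice (fixed here)
    change_mask = magnitude > threshold
    return magnitude, direction, change_mask
```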

  5. DESIGN & ANALYSIS TOOLS AND TECHNIQUES FOR AEROSPACE STRUCTURES IN A 3D VISUAL ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Radu BISCA

    2009-09-01

    Full Text Available The main objective of this project is to develop a set of tools and to integrate techniques in a software package built on structural analysis applications, based on Romanian engineers' experience in designing and analysing aerospace structures and consolidated with the most recent methods and techniques. The applications automate the structure design and analysis processes and facilitate the exchange of technical information between the partners involved in a complex aerospace project without limiting the domain.

  6. Using Link Analysis Technique with a Modified Shortest-Path Algorithm to Fight Money Laundering

    Institute of Scientific and Technical Information of China (English)

    CHEN Yunkai; MAI Quanwen; LU Zhengding

    2006-01-01

    Effective link analysis techniques are needed to help law enforcement and intelligence agencies fight money laundering. This paper presents a link analysis technique that uses a modified shortest-path algorithm to identify the strongest association paths between entities in a money laundering network. A modified algorithm based on the two-tree Dijkstra and Priority-First-Search (PFS) algorithms is presented. To apply the algorithm, a network representation transformation is made first.
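
    To convey the idea, the sketch below maximises the product of link association strengths by running a plain Dijkstra search on -log-transformed weights (the paper's two-tree/PFS refinements are omitted); the toy network is hypothetical.

```python
import heapq
import math

def strongest_path(graph, source, target):
    """Strongest association path in a weighted network.

    graph : {node: [(neighbour, strength), ...]} with strengths in (0, 1].
    Multiplicative strengths become additive costs via -log, so an
    ordinary shortest-path search maximises the product of link strengths.
    Assumes the target is reachable from the source.
    """
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            break
        if d > dist.get(node, math.inf):
            continue                      # stale heap entry
        for nbr, strength in graph.get(node, []):
            nd = d - math.log(strength)   # -log turns products into sums
            if nd < dist.get(nbr, math.inf):
                dist[nbr], prev[nbr] = nd, node
                heapq.heappush(heap, (nd, nbr))
    path, node = [target], target
    while node != source:                 # reconstruct the path backwards
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[target])

accounts = {"A": [("B", 0.9), ("C", 0.4)], "B": [("D", 0.8)],
            "C": [("D", 0.9)], "D": []}
print(strongest_path(accounts, "A", "D"))  # (['A', 'B', 'D'], 0.72)
```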

  7. The Delphi Technique in nursing research - Part 3: Data Analysis and Reporting

    OpenAIRE

    Dimitrios Kosmidis; Sotiria Koutsouki; Dimitrios Theofanidis

    2013-01-01

    The Delphi technique is a research method with a multitude of literature regarding its application, yet there is limited guidance on methods of analysis and the presentation of results. Aim: To describe and critically analyze the main methods of the qualitative and quantitative data analysis in studies using the Delphi Technique. Materials and methods: The literature search included research and review articles of nursing interest within the following databases: IATROTEK, Medline, Cinahl and ...

  8. Progress as Compositional Lock-Freedom

    DEFF Research Database (Denmark)

    Carbone, Marco; Dardha, Ornela; Montesi, Fabrizio

    2014-01-01

    …such definition to capture a more intuitive notion of context adequacy for checking progress. Interestingly, our new catalysers lead to a novel characterisation of progress in terms of the standard notion of lock-freedom. Guided by this discovery, we also develop a conservative extension of catalysers that does not depend on types, generalising the notion of progress to untyped session-based processes. We combine our results with existing techniques for lock-freedom, obtaining a new methodology for proving progress. Our methodology captures new processes with respect to previous progress analyses based on session types.

  9. Multi-omics Frontiers in Algal Research: Techniques and Progress to Explore Biofuels in the Postgenomics World.

    Science.gov (United States)

    Rai, Vineeta; Karthikaichamy, Anbarasu; Das, Debasish; Noronha, Santosh; Wangikar, Pramod P; Srivastava, Sanjeeva

    2016-07-01

    Current momentum of microalgal research rests extensively on tapping the potential of multi-omics methodologies with regard to sustainable biofuels. Microalgal biomass is fermented to bioethanol, while lipids, particularly triacylglycerides (TAGs), are transesterified to biodiesel. Biodiesel has emerged as an ideal biofuel candidate; hence, its commercialization and use are increasingly being emphasized. Abiotic stresses exaggerate TAG accumulation, but the precise mechanisms are yet to be known. More recently, comprehensive multi-omics studies in microalgae have emerged from the biofuel perspective. Genomics and transcriptomics of microalgae have provided crucial leads and a basic understanding of lipid biosynthesis. Proteomics and metabolomics are now complementing "algal omics" and offer precise functional insights into the attendant static and dynamic physiological contexts. Indeed, the field has progressed from shotgun to targeted approaches. Notably, targeted proteomics studies in microalgae have not yet been reported. Several multi-omics tools and technologies that may be used to dig deeper into microalgal physiology are examined and highlighted in this review. The article therefore aims both to introduce the various available high-throughput biotechnologies and applications of "omics" in microalgae, and to provide a compendium of the emerging cutting-edge literature. We suggest that a strategic and thoughtful combination of data streams from different omics platforms can provide a system-wide overview. Algal omics warrants closer attention in the future, with a view to the technical, economic, and societal impacts that are anticipated in the current postgenomics era. PMID:27315140

  10. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting the recent advances in the field of Big Data analysis as well as the techniques and tools used to analyze it. The book includes 10 distinct chapters providing a concise introduction to Big Data analysis and to recent techniques and environments for Big Data analysis. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in big data analysis by adopting parallel, grid, and cloud computing environments.

  11. Publishing nutrition research: a review of multivariate techniques--part 2: analysis of variance.

    Science.gov (United States)

    Harris, Jeffrey E; Sheean, Patricia M; Gleason, Philip M; Bruemmer, Barbara; Boushey, Carol

    2012-01-01

    This article is the eighth in a series exploring the importance of research design, statistical analysis, and epidemiology in nutrition and dietetics research, and the second in a series focused on multivariate statistical analytical techniques. The purpose of this review is to examine the statistical technique, analysis of variance (ANOVA), from its simplest to multivariate applications. Many dietetics practitioners are familiar with basic ANOVA, but are less informed about multivariate applications such as multiway ANOVA, repeated-measures ANOVA, analysis of covariance, multiple ANOVA, and multiple analysis of covariance. The article addresses all these applications and includes hypothetical and real examples from the field of dietetics.
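
    For readers who want to connect the discussion to practice, a basic one-way ANOVA takes only a few lines in Python with SciPy; the group data here are invented for illustration.

```python
from scipy import stats

# Hypothetical energy intake (kcal/day) for three diet groups
group_a = [1850, 1920, 2010, 1880, 1950]
group_b = [2100, 2250, 2180, 2300, 2220]
group_c = [1990, 2050, 2110, 2000, 2080]

# One-way ANOVA: do the three group means differ?
f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # small p => means differ
```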

  12. Progress in study of Prespa Lake using nuclear and related techniques (IAEA Regional Project RER/8/008)

    International Nuclear Information System (INIS)

    One of the main objectives of the IAEA Regional Project RER/8/008, entitled Study of Prespa Lake Using Nuclear and Related Techniques, was to provide a scientific basis for sustainable environmental management of Lake Prespa (three lakes, Ohrid, Big Prespa and Small Prespa, lie on the borders between Albania, the Republic of Macedonia and Greece, separated by the Mali i Thate and Galichica mountains, which are mostly karstified), see Fig. 1. In this sense, investigations connected with the hydrogeology, water quality (physico-chemical, biological and radiological characteristics) and water balance determination were initiated through regional cooperation (scientists from Albania, Greece and the Republic of Macedonia participated in the implementation of the Project) during one hydrological year, by application of the distribution of environmental isotopes (i.e. H, D, T, O-18, etc.), artificial water tracers and other relevant analytical techniques such as AAS, HPLC, total α- and β-activity, and α- and γ-spectrometry, as well as ultrasonic measurements (defining the lake bottom profile), and valuable results were obtained, a part of which is presented in this report. This cooperation was the only way to provide the data necessary for a better understanding, among other things, of the water quality of Lake Prespa and of its hydrological relationship to Lake Ohrid, together representing a unique regional hydro system in the world. (Author)

  13. Research Progress of Freezing Technique in Foods

    Institute of Scientific and Technical Information of China (English)

    张钟; 江潮

    2014-01-01

    Freezing technology has developed very rapidly and is being applied ever more widely in the food industry; applying it more effectively to foods has become a topic of considerable research interest. This article reviews theoretical research on the freezing process and its applications in the food industry, and briefly describes the current status and development trends of food freezing technology at home and abroad in recent years.

  14. Progressive Collapse Analysis of Steel Framed Structures with I-Beams and Truss Beams using Linear Static Procedure

    OpenAIRE

    Fadaei, Sepideh

    2012-01-01

    ABSTRACT: Progressive collapse starts with local damage or the loss of some members of the structure, leading to failure of large parts of the structure. In light of recent disastrous events such as the World Trade Center collapse in the USA, taking measures to reduce the potential for progressive collapse (PC) of structures during the analysis and design stages is becoming a necessity. A number of computational analysis programs, such as ETABS, SAP2000 and ABAQUS, can be used to simulate the structure...

  15. Pseudo-progression after stereotactic radiotherapy of brain metastases: lesion analysis using MRI cine-loops.

    Science.gov (United States)

    Wiggenraad, Ruud; Bos, Petra; Verbeek-de Kanter, Antoinette; Lycklama À Nijeholt, Geert; van Santvoort, Jan; Taphoorn, Martin; Struikmans, Henk

    2014-09-01

    Stereotactic radiotherapy (SRT) of brain metastases can lead to lesion growth caused by radiation toxicity. The pathophysiology of this so-called pseudo-progression is poorly understood. The purpose of this study was to evaluate the use of MRI cine-loops for describing the consecutive events in this radiation-induced lesion growth. Ten patients who had received SRT of brain metastases, had lesion growth caused by pseudo-progression, and had at least five follow-up MRI scans were selected from our department's database. Pre- and post-SRT MRI scans were co-registered and cine-loops were made using post-gadolinium 3D T1 axial slices. The ten cine-loops were discussed in a joint meeting of the authors. The use of cine-loops was superior to evaluation of separate MRI scans for interpretation of events after SRT. There was a typical lesion evolution pattern in all patients, with varying time course: initially, regression of the metastases was observed, followed by an enlarging area of new contrast enhancement in the surrounding brain tissue. Analysis of consecutive MRIs using cine-loops may improve understanding of pseudo-progression. It probably represents a radiation effect in brain tissue surrounding the irradiated metastasis and not enlargement of the metastasis itself.

  16. Some failure modes and analysis techniques for terrestrial solar cell modules

    Science.gov (United States)

    Shumka, A.; Stern, K. H.

    1978-01-01

    Analysis data are presented on failed/defective silicon solar cell modules of various types produced by different manufacturers. The failure modes (e.g., internal short and open circuits, output power degradation, isolation resistance degradation, etc.) are discussed in detail and in many cases related to the type of technology used in the manufacture of the modules; wherever applicable, appropriate corrective actions are recommended. Consideration is also given to some failure analysis techniques that are applicable to such modules, including X-ray radiography, capacitance measurement, cell shunt resistance measurement by the shadowing technique, a steady-state illumination test station for module performance evaluation, laser scanning techniques, and the SEM.

  17. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Matthew W. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  18. Simulation and Aerodynamic Analysis of the Flow Around the Sailplane Using CFD Techniques

    Directory of Open Access Journals (Sweden)

    Sebastian Marian ZAHARIA

    2015-12-01

    Full Text Available This paper describes the analysis and simulation process using CFD techniques and the phenomena encountered in aerospace engineering practice, directing the simulation studies to the air flow around a sailplane. Analysis and aerodynamic simulation using Computational Fluid Dynamics (CFD) techniques are well established as instruments in the development process of an aeronautical product. Fluid flow simulation techniques help engineers understand the physical phenomena that take place in the product design from its prototype phase onward, and at the same time allow the optimization of aeronautical products' performance with respect to certain design criteria.

  19. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular the skill development focused on includes: being able to recognise ethical challenges and formulate coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating ethical decisions made, to name a few.

  20. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation for the analysis, available data, complexity of the process being analyzed, expertise available in hazard analysis, and the initial perception of the risks involved. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the factors mentioned. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost
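
    The paper's actual selection methodology is not reproduced in this record; the toy weighted-scoring sketch below merely illustrates how the listed factors could rank candidate techniques, with all weights and scores invented for illustration.

```python
# Hypothetical weighted-scoring sketch: rate each technique against the
# selection factors, weight the factors for the facility at hand, rank.
factors = {"data availability": 0.3, "process complexity": 0.3,
           "team expertise": 0.2, "perceived risk": 0.2}
scores = {  # 1 (poor fit) .. 5 (good fit), one score per factor above
    "HAZOP":   [4, 5, 3, 5],
    "FMEA":    [4, 3, 4, 3],
    "What-If": [5, 2, 5, 2],
    "FTA":     [2, 5, 2, 5],
}
weights = list(factors.values())
ranking = sorted(scores,
                 key=lambda t: -sum(w * s for w, s in zip(weights, scores[t])))
for tech in ranking:
    total = sum(w * s for w, s in zip(weights, scores[tech]))
    print(f"{tech:8s} weighted score {total:.2f}")
```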

  1. Progress in X-CT Imaging Technique

    Institute of Scientific and Technical Information of China (English)

    王艳芹; 王秀丽

    2015-01-01

    This article mainly introduces the application of X-ray CT techniques in fast scan imaging, volume scan imaging, functional imaging and low-dose scan imaging, and describes the development course of several of these imaging technologies. Finally, the paper points out that cone-beam CT and energy-spectrum CT are important directions for CT development, with good application prospects.

  2. Development of synchrotron x-ray micro-spectroscopic techniques and application to problems in low temperature geochemistry. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-10-01

    The focus of the technical development effort has been the development of apparatus and techniques for the utilization of X-ray Fluorescence (XRF), Extended X-ray Absorption Fine Structure (EXAFS) and X-ray Absorption Near Edge Structure (XANES) spectroscopies in a microprobe mode. The present XRM uses white synchrotron radiation (3 to 30 keV) from a bending magnet for trace element analyses using the x-ray fluorescence technique. Two significant improvements to this device have recently been implemented. Focusing mirror: an 8:1 ellipsoidal mirror was installed in the X26A beamline to focus the incident synchrotron radiation and thereby increase the flux on the sample by about a factor of 30. Incident beam monochromator: the monochromator has been successfully installed and commissioned in the X26A beamline upstream of the mirror to permit analyses with focused monochromatic radiation. The monochromator consists of a channel-cut silicon (111) crystal driven by a Klinger stepping-motor translator. We have demonstrated that the operating range of this instrument is 4 to 20 keV with 0.01 eV steps, producing a beam with an energy bandwidth of approximately 10^-4. The primary purpose of the monochromator is for x-ray absorption spectroscopy (XAS) measurements, but it is also used for selective excitation in trace element microanalysis. To date, we have conducted XANES studies on Ti, Cr, Fe, Ce and U, spanning the entire accessible energy range and including both K and L edge spectra. Practical detection limits for microXANES are 10-100 ppm for 100 μm spots.

  3. Multivariate analysis of progressive thermal desorption coupled gas chromatography-mass spectrometry.

    Energy Technology Data Exchange (ETDEWEB)

    Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel; Borek, Theodore Thaddeus, III

    2010-09-01

    Thermal decomposition of polydimethylsiloxane compounds, Sylgard® 184 and 186, was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. Thermal desorption coupled with gas chromatography-mass spectrometry (TD/GC-MS) is a powerful analytical technique for analyzing chemical mixtures. It has great potential in numerous analytic areas including materials analysis, sports medicine, the detection of designer drugs, and biological research for metabolomics. Data analysis is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a step-wise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options. It also has demonstrated potential for success in finding and enabling identification of trace compounds. Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief
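
    As a schematic of the multivariate step, the sketch below applies PCA to a matrix of stepped-desorption chromatograms; random numbers stand in for real TD/GC-MS data, and scikit-learn replaces the in-house tools mentioned above.

```python
import numpy as np
from sklearn.decomposition import PCA

# Hypothetical multiway data: one flattened chromatogram row per
# desorption temperature step (real rows would come from TD/GC-MS).
rng = np.random.default_rng(1)
n_steps, n_bins = 8, 500
chromatograms = rng.random((n_steps, n_bins))

pca = PCA(n_components=3)
scores = pca.fit_transform(chromatograms)   # how each step loads on components
print(pca.explained_variance_ratio_)        # variance captured per component
# pca.components_ (loadings) indicates which retention-time bins drive
# each component, i.e. which compound fractions vary with temperature.
```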

  4. Message Structures: a modelling technique for information systems analysis and design

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2011-01-01

    Despite the increasing maturity of model-driven software development (MDD), some research challenges remain open in the field of information systems (IS). For instance, there is a need to improve modelling techniques so that they cover several development stages in an integrated way and facilitate the transition from analysis to design. This paper presents Message Structures, a technique for the specification of communicative interactions between the IS and organisational actors. This technique can be used both in the analysis stage and in the design stage. During analysis, it allows abstracting from the technology that will support the IS and complementing business process diagramming techniques with the specification of the communicational needs of the organisation. During design, Message Structures serves two purposes: (i) it allows one to systematically derive a specification of the IS memory (e.g. a UML class diagram), and (ii) it allows one to reason about the user interface design using abstract patterns. Thi...

  5. Insight to Nanoparticle Size Analysis—Novel and Convenient Image Analysis Method Versus Conventional Techniques

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-03-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator," developed particularly for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program for analyzing nanoparticles, and at the same time compares it to more conventional nanoparticle analysis techniques. The techniques on which we concentrate here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved to be a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient and offers more detailed morphological information on the particles than the manual technique. However, particle shapes that deviate strongly from spherical proved to be problematic for the novel program as well. Compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the average data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample.
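
    A minimal sketch of the conventional thresholding route that such programs improve upon, using scikit-image on a grayscale TEM frame; the colour convention (dark particles) and the nm-per-pixel scale are assumptions, and touching particles are merged by the simple labelling, which is precisely the agglomerate problem noted above.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def particle_diameters(tem_image, nm_per_pixel):
    """Equivalent circular diameters of dark particles in a TEM image.

    tem_image : 2-D grayscale array (particles assumed darker than background)
    Returns diameters in nm for all regions larger than a small noise cutoff.
    """
    mask = tem_image < threshold_otsu(tem_image)   # segment dark particles
    labels = label(mask)                           # connected components
    return np.array([r.equivalent_diameter * nm_per_pixel
                     for r in regionprops(labels) if r.area > 10])
```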

  6. 500 MW demonstration of advanced wall-fired combustion techniques for the reduction of nitrogen oxide (NOx) emissions from coal-fired boilers. Technical progress report, second quarter 1994, April 1994 - June 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    This quarterly report discusses the technical progress of an Innovative Clean Coal Technology (ICCT) demonstration of advanced wall-fired combustion techniques for the reduction of nitrogen oxide (NOx) emissions from coal-fired boilers. The project is being conducted at Georgia Power Company's Plant Hammond Unit 4 located near Rome, Georgia. The primary goal of this project is the characterization of the low-NOx combustion equipment through the collection and analysis of long-term emissions data. A target of achieving fifty percent NOx reduction using combustion modifications has been established for the project. The project provides a stepwise retrofit of an advanced overfire air (AOFA) system followed by low-NOx burners (LNB). During each test phase of the project, diagnostic, performance, long-term, and verification testing will be performed. These tests are used to quantify the NOx reductions of each technology and evaluate the effects of those reductions on other combustion parameters. Results are described.

  7. Progress in the techniques of sulfur recovery processes

    Institute of Scientific and Technical Information of China (English)

    张文革; 黄丽月; 李军

    2011-01-01

    With the issuance and implementation of the Integrated Emission Standard of Air Pollutants, China has imposed increasingly strict control of environmental pollution, and many sulfur recovery plants have been constructed in the industries concerned. Reviewing the process technologies and production capacities of domestic sulfur recovery plants, this paper introduces and evaluates the sulfur recovery techniques used in different industries, with emphasis on the Claus process, providing a reference for the revamping and expansion of sulfur recovery plants.

  8. Auditing Information Structures in Organizations: A Review of Data Collection Techniques for Network Analysis

    NARCIS (Netherlands)

    Zwijze-Koning, Karen H.; Jong, de Menno D.T.

    2005-01-01

    Network analysis is one of the current techniques for investigating organizational communication. Despite the amount of how-to literature about using network analysis to assess information flows and relationships in organizations, little is known about the methodological strengths and weaknesses of

  9. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
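
    Only PCA of the three reduction methods is shown below, and a public digits data set stands in for object-code features; the sketch simply reproduces the accuracy-versus-dimensions sweep described above under those substitutions.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

# Stand-in data: the paper classifies object-code features; the digits
# set merely illustrates accuracy as a function of retained dimensions.
X, y = load_digits(return_X_y=True)
for n_dims in (2, 5, 10, 20, 40):
    model = make_pipeline(PCA(n_components=n_dims), GaussianNB())
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{n_dims:3d} dimensions -> accuracy {acc:.3f}")
```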

  10. Teaching Tip: Using Activity Diagrams to Model Systems Analysis Techniques: Teaching What We Preach

    Science.gov (United States)

    Lending, Diane; May, Jeffrey

    2013-01-01

    Activity diagrams are used in Systems Analysis and Design classes as a visual tool to model the business processes of "as-is" and "to-be" systems. This paper presents the idea of using these same activity diagrams in the classroom to model the actual processes (practices and techniques) of Systems Analysis and Design. This tip…

  11. Use of fuzzy techniques for analysis of dynamic loads in power systems

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Presents the use of fuzzy techniques for the analysis of dynamic load characteristics of power systems to identify the voltage stability (collapse) of a weak bus, and concludes from the consistent results obtained that this is a useful tool for the analysis of load characteristics of sophisticated power systems and their components.

  12. Comparison of various procedures for progressive collapse analysis of cable-stayed bridges

    Institute of Scientific and Technical Information of China (English)

    Jian-guo CAI; Yi-xiang XU; Li-ping ZHUANG; Jian FENG; Jin ZHANG

    2012-01-01

    The alternate path (AP) method is the most widely used method for progressive collapse analysis, and its application to frame structures is well proven. However, the application of the AP method to other structures, especially cable-stayed structures, needs further development. The four analytical procedures, i.e., linear static, nonlinear static, linear dynamic, and nonlinear dynamic, were first improved by taking into account the initial state. Then a cable-stayed structure was studied using the four improved methods, and the losses of one cable and of two cables were discussed. The results show that for static and dynamic analyses of cable-stayed bridges, there is a large difference between the results obtained from simulations starting with either a deformed or a non-deformed configuration at the time of cable loss. The static results are conservative in the vicinity of the ruptured cable, but the dynamic effect of the cable loss in the area farther away from the lost cable cannot be considered. Moreover, a dynamic amplification factor of 2.0 is found to be a good estimate for static analysis procedures, since linear static and linear dynamic procedures yield approximately the same maximum vertical deflection. The results of the comprehensive evaluation of cable failure show that the tendency toward progressive failure of cable-stayed bridges decreases when the location of the failed cables is closer to the pylon.

  13. High beta and second region stability analysis and ICRF edge modeling: Progress report

    International Nuclear Information System (INIS)

    This report describes the tasks accomplished under Department of Energy contract No. FG02-86ER53236 in modeling the edge plasma-antenna interaction that occurs during Ion Cyclotron Range of Frequency (ICRF) heating. This work has resulted in the development of several codes which determine kinetic and fluid modifications to the edge plasma. When used in combination, these codes predict the level of impurity generation observed in experiments on the Princeton Large Torus. In addition, these models suggest improvements to the design of ICRF antennas. Also described is progress made on high beta and second region analysis. Code development for a comprehensive infernal mode analysis code is nearing completion. A method has been developed for parameterizing the second region of stability and is applied to circular cross section tokamaks. Various studies for high beta experimental devices such as PBX-M and DIII-D have been carried out and are reported on. 19 refs., 8 figs., 1 tab

  14. Progressive failure analysis of composite structure based on micro- and macro-mechanics models

    Institute of Scientific and Technical Information of China (English)

    孙志刚; 阮绍明; 陈磊; 宋迎东

    2015-01-01

    Based on a parameter design language, a program for progressive failure analysis of composite structures is proposed. In this program, the relationship between macro- and micro-mechanics is established, and the macro stress distribution of the composite structure is calculated with commercial finite element software. According to the macro stress, the damaged point is found, and the micro stress distribution of the representative volume element is calculated by finite-volume direct averaging micromechanics (FVDAM). Compared with the results calculated by failure criteria based on the macro-stress field (the maximum stress criteria and Hashin criteria) and the micro-stress field (Huang model), it is shown that failure analysis based on macro- and micro-mechanics models is feasible and efficient.

  15. Damage analysis and fundamental studies. Quarterly progress report, July--September 1978

    Energy Technology Data Exchange (ETDEWEB)

    Zwilsky, Klaus M.

    1979-05-01

    This report is the third in a series of Quarterly Technical Progress Reports on Damage Analysis and Fundamental Studies (DAFS) which is one element of the Fusion Reactor Materials Program, conducted in support of the Magnetic Fusion Energy Program. This report is organized along topical lines in parallel to Section II, Damage Analysis and Fundamental Studies (DOE/ET-0032/2), of the Fusion Reactor Materials Program Plan so that activities and accomplishments may be followed readily relative to that Program Plan. Thus, the work of a given laboratory may appear throughout the report. Chapters 1 and 2 report topics which are generic to all of the DAFS Program: DAFS Task Group Activities and Irradiation Test Facilities, respectively. Chapters 3, 4, and 5 report the work that is specific to each of the subtasks around which the program is structured: A) Environmental Characterization, B) Damage Production, and C) Damage Microstructure Evolution and Mechanical Behavior.

  16. Technical progress and energy substitutions in the transport sector

    Energy Technology Data Exchange (ETDEWEB)

    Florane, Philippe

    2002-11-15

    Alternative motorization technologies have been proposed in order to achieve energy diversification and a reduction in pollutant emissions. Fuel cell vehicles are, among others, at the centre of research carried out by car manufacturers and oil companies. The use of fuel cell vehicles could contribute, first, to a less stringent long-term energy dependence of oil-importing countries and, second, to pollutant reduction in the transport sector. First of all, we propose a definition of 'innovation' and its treatment within the framework of mainstream economic theories. Then we proceed to a retrospective analysis of the diesel motorization of the car market. In the second part of our work, we conduct a survey among French households aiming to obtain up-to-date information about their degree of acceptance of fuel cell technology. We are concerned with highlighting the determining factors of fuel cell vehicle adoption by consumers. For this, we set up a discrete choice model linking the individual decision to the whole set of technical and socio-economic factors and characteristics. Finally, we develop patterns of fuel cell equipment of passenger cars which differ according to the type of vehicle and possible purchase assistance. These patterns lead us to the analysis of long-term fuel cell vehicle development on the French car market. (author)

  17. Application of a sensitivity analysis technique to high-order digital flight control systems

    Science.gov (United States)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to accomplish the singular-value sensitivity analysis techniques. Thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. The SVA is a fully public domain program, running on the NASA/Dryden Elxsi computer.
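
    A toy version of the underlying computation, assuming a made-up 2x2 loop transfer matrix: the smallest singular value of the return difference matrix I + L(jw) is scanned over frequency as a relative stability measure, and its sensitivity to a gain parameter is approximated by finite differences (the SVA program computes gradients analytically instead).

```python
import numpy as np

def min_singular_value(gain, omega):
    """Smallest singular value of the return difference matrix I + L(jw)
    for a toy 2x2 loop transfer L(s) = gain * [[1/(s+1), 0.1], [0, 1/(s+2)]]."""
    s = 1j * omega
    L = gain * np.array([[1.0 / (s + 1.0), 0.1],
                         [0.0, 1.0 / (s + 2.0)]])
    return np.linalg.svd(np.eye(2) + L, compute_uv=False).min()

omega_grid = np.logspace(-2, 2, 200)
sigma = np.array([min_singular_value(2.0, w) for w in omega_grid])
print(f"relative stability measure: {sigma.min():.3f}")

# Sensitivity: finite-difference gradient of the measure w.r.t. the gain
eps = 1e-6
grad = (min(min_singular_value(2.0 + eps, w) for w in omega_grid)
        - sigma.min()) / eps
print(f"d(measure)/d(gain) ~ {grad:.3f}")
```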

  18. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    International Nuclear Information System (INIS)

    The THERP methodology is classified as a first generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step for the development of HRA techniques in industry. Because it is a first generation technique, THERP's human error quantification tables are based on a taxonomy that does not take human error mechanisms into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule level, the technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without treating any error mechanisms. Another deficiency is the fact that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, besides the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, an HRA technique of the first generation, with CREAM and ATHEANA, which are HRA techniques of the second generation. (author)

  19. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)], e-mail: bayout@cnen.gov.br, e-mail: rfonseca@cnen.gov.br

    2009-07-01

    The THERP methodology is classified as a first generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step for the development of HRA techniques in industry. Because it is a first generation technique, THERP's human error quantification tables are based on a taxonomy that does not take human error mechanisms into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule level, the technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without treating any error mechanisms. Another deficiency is the fact that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probability) standard scenarios, besides the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, an HRA technique of the first generation, with CREAM and ATHEANA, which are HRA techniques of the second generation. (author)

  20. Recent Technical Trends and Progress in Quantum Metrology for Electricity

    Institute of Scientific and Technical Information of China (English)

    黄晓钉; 蔡建臻; 王路

    2012-01-01

    Recent progress in the quantum-based traceability of DC voltage and DC resistance in electrical metrology is introduced, together with new breakthroughs in the quantum-based traceability of AC voltage and AC resistance, and a new method for closing and cross-checking the electrical quantum metrology triangle. The role of electrical quantum metrology techniques in the redefinition of the unit of mass and the determination of the Planck constant is also described.

  1. Developments in sanitary techniques 2011-2012. Important progress through studies in 2011

    Energy Technology Data Exchange (ETDEWEB)

    Scheffer, W.

    2011-12-15

    In 2011, new laws and regulations were the main theme in sanitary techniques (ST). Reference libraries have been updated in the areas of tap water installations and sewer systems of buildings. Important progress was made in the framework of several preliminary ST studies conducted by TVVL and Uneto-VNI. Still, the start-up of new ST studies and projects in 2012 is lagging behind compared to previous years.

  2. Progress of Polymer Reactive Extrusion Technique and its Applications

    Institute of Scientific and Technical Information of China (English)

    何明; 尹国强

    2012-01-01

    Polymer reactive extrusion (REX) is an emerging technology that combines polymerization with extrusion molding. The general principle, advantages, and disadvantages of the technique are briefly introduced, and the latest progress in its applications in bulk polymerization, grafting, reactive blending, and controlled degradation is reviewed.

  3. Choosing a DIVA: a comparison of emerging digital imagery vegetation analysis techniques

    Science.gov (United States)

    Jorgensen, Christopher F.; Stutzman, Ryan J.; Anderson, Lars C.; Decker, Suzanne E.; Powell, Larkin A.; Schacht, Walter H.; Fontaine, Joseph J.

    2013-01-01

    Question: What is the precision of five methods of measuring vegetation structure using ground-based digital imagery and processing techniques? Location: Lincoln, Nebraska, USA Methods: Vertical herbaceous cover was recorded using digital imagery techniques at two distinct locations in a mixed-grass prairie. The precision of five ground-based digital imagery vegetation analysis (DIVA) methods for measuring vegetation structure was tested using a split-split plot analysis of covariance. Variability within each DIVA technique was estimated using coefficient of variation of mean percentage cover. Results: Vertical herbaceous cover estimates differed among DIVA techniques. Additionally, environmental conditions affected the vertical vegetation obstruction estimates for certain digital imagery methods, while other techniques were more adept at handling various conditions. Overall, percentage vegetation cover values differed among techniques, but the precision of four of the five techniques was consistently high. Conclusions: DIVA procedures are sufficient for measuring various heights and densities of standing herbaceous cover. Moreover, digital imagery techniques can reduce measurement error associated with multiple observers' standing herbaceous cover estimates, allowing greater opportunity to detect patterns associated with vegetation structure.
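
    The five DIVA methods are not specified in this record; the sketch below shows one plausible flavour, estimating percent vertical cover from a photo of vegetation in front of a red cover board, with the colour convention and threshold as assumptions.

```python
import numpy as np

def percent_vegetation_cover(rgb_image, board_threshold=0.5):
    """Percent vertical vegetation cover from a cover-board photograph.

    rgb_image : float array (rows, cols, 3) with values in [0, 1], showing
    green vegetation in front of a red background board (one plausible
    DIVA setup, not necessarily any of the five methods compared).
    Pixels where the red board shows through count as 'not obstructed'.
    """
    red, green = rgb_image[..., 0], rgb_image[..., 1]
    board_visible = (red - green) > board_threshold
    return 100.0 * (1.0 - board_visible.mean())
```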

  4. Technique for continuous high-resolution analysis of trace substances in firn and ice cores

    Energy Technology Data Exchange (ETDEWEB)

    Roethlisberger, R.; Bigler, M.; Hutterli, M.; Sommer, S.; Stauffer, B.; Junghans, H.G.; Wagenbach, D.

    2000-01-15

    The very successful application of a CFA (continuous flow analysis) system in the GRIP project (Greenland Ice Core Project) for high-resolution ammonium, calcium, hydrogen peroxide, and formaldehyde measurements along a deep ice core led to further development of this analysis technique. The authors added methods for the continuous analysis of sodium, nitrate, sulfate, and electrolytic conductivity, while the existing methods have been improved. The melting device has been optimized to allow the simultaneous analysis of eight components. Furthermore, a new melter was developed for analyzing firn cores. The system has been used in the frame of the European Project for Ice Coring in Antarctica (EPICA) for in-situ analysis of several firn cores from Dronning Maud Land, Antarctica, and for the new ice core drilled at Dome C, Antarctica.

  5. The Analysis of Surrounding Structure Effect on the Core Degradation Progress with COMPASS Code

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Jun Ho; Son, Dong Gun; Kim, Jong Tae; Park, Rae Jun; Kim, Dong Ha [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    In line with the importance of severe accident analysis after the Fukushima accident, the development of an integrated severe accident code has been launched by the collaboration of three institutes in Korea. KAERI is responsible for developing the modules related to in-vessel phenomena, while the other institutes cover the containment and the severe accident mitigation facilities, respectively. In the first phase, the individual severe accident modules have been developed, and the construction of the integrated analysis code is planned for the second phase. The basic strategy is to extend the design-basis analysis codes SPACE and CAP, which are being validated in Korea, to severe accident analysis. In the first phase, KAERI targeted the development of the framework of the severe accident code COMPASS (COre Meltdown Progression Accident Simulation Software), covering severe accident progression in a vessel from core heat-up to vessel failure in a stand-alone fashion. In order to analyze the effect of the surrounding structure, the melt progression has been compared between the central zone and the outermost zone under the condition of a constant radial power peaking factor. Figures 2 and 3 show the fuel element temperature and the clad mass at the central zone, respectively. Due to the axial power peaking factor, axial node No. 3 has the highest temperature, while the top and bottom nodes have the lowest. When the clad temperature reaches the Zr melting temperature (2129.15 K), the Zr starts to melt. Axial node No. 2 reaches the fuel melting temperature at about 5000 s, and the molten fuel relocates to node No. 1, which results in the blockage of the flow area in node No. 1. The blocked flow area reopens at about 6100 s due to the relocation of the molten ZrO{sub 2} mass to the core support plate. Figures 4 and 5 show the fuel element temperature and the clad mass at the outermost zone, respectively. It is shown that the fuel temperature increases more slowly
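
    The melt-progression logic described above (clad melting at a threshold temperature, downward relocation of molten fuel, flow-area blockage) can be caricatured in a few lines. The sketch below is a toy illustration under an invented power shape and heat-up rate, not the COMPASS implementation.

```python
# Toy sketch (not the COMPASS code): node-wise clad melting and downward
# fuel relocation of the kind the abstract describes.
import numpy as np

T_ZR_MELT = 2129.15   # Zr melting temperature [K], as quoted in the abstract
T_FUEL_MELT = 3100.0  # assumed UO2 melting temperature [K]

n_nodes = 5
T = np.full(n_nodes, 600.0)                  # node temperature [K]
peak = np.array([0.6, 1.2, 1.4, 1.0, 0.5])   # assumed axial power peaking factors
clad = np.ones(n_nodes)                      # normalized clad mass per node
blocked = np.zeros(n_nodes, dtype=bool)      # flow-area blockage flags

dt, q = 1.0, 0.5                             # time step [s], nominal heat-up rate [K/s]
for step in range(10000):
    T += q * peak * dt                       # adiabatic heat-up, scaled by peaking
    clad[T >= T_ZR_MELT] = 0.0               # clad melts at the Zr melting point
    for i in range(1, n_nodes):
        if T[i] >= T_FUEL_MELT and not blocked[i - 1]:
            blocked[i - 1] = True            # molten fuel relocates down, blocks flow

print("flow blocked below nodes:", np.where(blocked)[0])
```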

  6. An Efficient Auto Redundant Technique for Analysis of Single Layer Grid with Curved Members

    Directory of Open Access Journals (Sweden)

    Ashwin Hansora

    2010-07-01

    Full Text Available This paper presents an Auto Redundant Technique for the analysis of grids with curved members. The technique is based on the force method, but the choice of redundants is completely eliminated. The analysis technique is found to be very effective, accurate, and programmable. A comprehensive C++ program has been developed to compute the internal forces at the ends of each member of the grid for different load cases and their combinations. In this paper, the analysis of a grid with fixed supports is carried out for concentrated point loads, twisting moments, bending moments, and full or partial uniformly distributed and uniformly varying loads. In this technique, any number of load cases can be accommodated without creating any additional node(s) on the member. The power of the analysis procedure is effectively demonstrated through the solution of one benchmark problem. The results obtained through the program for complementary load cases are compared with the results from analysis software and are found to match.
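
    For readers unfamiliar with the force (flexibility) method that the Auto Redundant Technique automates, the following minimal Python sketch shows the underlying compatibility solve; the flexibility coefficients and load displacements are invented numbers, not values from the paper.

```python
# Minimal sketch of the force method: compatibility f @ X = -d0 is solved
# for the redundant forces X. The numbers are illustrative placeholders.
import numpy as np

f = np.array([[2.0, 0.5],    # flexibility coefficients f_ij: displacement at
              [0.5, 1.5]])   # released redundant i due to a unit redundant j
d0 = np.array([3.0, 1.2])    # displacements at the releases due to applied loads

X = np.linalg.solve(f, -d0)  # redundant forces restoring compatibility
print("redundant forces:", X)
```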

  7. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified by searches conducted on various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique used in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  8. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage, an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work, several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators, and an FPGA-based emulator. From NoC experiments with sizes from 9 to 36 functional units and various traffic patterns, the characteristics of these techniques concerning accuracy, complexity, and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.
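
    A static timing-model analysis of the kind assigned to early design stages can be as simple as the following sketch; the router and link delay parameters and the XY-routing assumption are illustrative, not values from the paper.

```python
# Rough sketch of an early-design-stage static performance analysis for a
# 2-D mesh NoC: a closed-form latency model. All delay parameters are
# assumptions for illustration.
def xy_hops(src, dst):
    """Manhattan hop count under XY routing in a 2-D mesh."""
    return abs(src[0] - dst[0]) + abs(src[1] - dst[1])

def packet_latency(src, dst, flits, t_router=3, t_link=1, width_cycles=1):
    hops = xy_hops(src, dst) + 1              # number of routers traversed
    head = hops * (t_router + t_link)         # head-flit pipeline delay [cycles]
    serialization = (flits - 1) * width_cycles
    return head + serialization

# 6x6 mesh (36 functional units, the largest size used in the experiments)
print(packet_latency(src=(0, 0), dst=(5, 5), flits=8), "cycles")
```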

  9. Progress report on neutron activation analysis at Dalat Nuclear Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Tuan, Nguyen Ngoc [Nuclear Research Institute, Dalat (Viet Nam)

    2003-03-01

    Neutron Activation Analysis (NAA) is one of the most powerful techniques for simultaneous multi-element analysis. This technique has been studied and applied to the analysis of major, minor, and trace elements in geological, biological, and environmental samples at the Dalat Nuclear Research Reactor. At the sixth Workshop, February 8-11, 1999, Yogyakarta, Indonesia, we reported on the current status of neutron activation analysis using the Dalat Nuclear Research Reactor. Another report on NAA at the Dalat Nuclear Research Reactor was presented at the seventh Workshop in Taejon, Korea, November 20-24, 2000. In this report, we present the results obtained from the application of NAA at NRI over one year, as follows: (1) determination of the concentrations of noble, rare earth, uranium, thorium, and other elements in geological samples at the request of clients, particularly geologists prospecting for mineral resources; (2) analysis of the concentrations of radionuclides and nutrient elements in foodstuffs for the program on the Asian Reference Man; (3) evaluation of the contents of trace elements in crude oil and basement rock samples to determine the original source of the oil; (4) determination of the elemental composition of airborne particles in Ho Chi Minh City for studying air pollution. Analytical data for standard reference materials and for toxic elements and natural radionuclides in seawater are also presented. (author)

  10. Seismic response analysis of RCC structure with yielding dampers using linearization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Parulekar, Y.M., E-mail: yogitap@barc.gov.i [Bhabha Atomic Research Centre, Trombay, Mumbai 400085, Maharashtra (India); Reddy, G.R.; Vaze, K.K.; Ghosh, A.K.; Kushwaha, H.S. [Bhabha Atomic Research Centre, Trombay, Mumbai 400085, Maharashtra (India); Ramesh Babu, R. [Central Power Research Institute, Bangalore (India)

    2009-12-15

    Passive energy-dissipating devices like elasto-plastic dampers (EPDs) can be used for the retrofitting of structures subjected to seismic loads. A model of a reinforced concrete structure was tested on a shake table with and without EPDs attached in its frames. Using a finite element model of the structure, linear and nonlinear time history analyses were carried out using Newmark's time integration technique. However, the most viable approach used by designers is the response spectrum approach. Hence, equivalent linearization techniques are used to address the nonlinearity of the dampers, and an iterative response spectrum method is used for evaluating the response of the structure using equivalent damping and stiffness. The analytical maximum storey response of the structure is compared with the experimental values and the time history analysis values. It is concluded that the iterative response spectrum technique using equivalent linearization is simple and results in a reasonably acceptable response for structures retrofitted with energy dissipators.
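
    The iterative response-spectrum procedure can be sketched schematically as follows; the spectrum shape, damper parameters, and the equivalent-damping expression are assumptions for illustration, not the paper's models.

```python
# Schematic sketch of an iterative response-spectrum analysis with
# equivalent linearization for an SDOF system with a yielding damper.
import numpy as np

def Sa(T, zeta):
    """Assumed pseudo-acceleration spectrum [m/s^2] with a simple
    damping-correction factor (illustrative shape only)."""
    eta = np.sqrt(0.10 / (0.05 + zeta))      # assumed damping modification
    return eta * (10.0 if T < 0.5 else 5.0 / T)

m, k_el = 1.0e4, 4.0e6    # mass [kg], elastic stiffness [N/m] (assumed)
dy = 0.01                 # damper yield displacement [m] (assumed)

d = dy                    # initial displacement guess
for _ in range(50):
    mu = max(d / dy, 1.0)                              # ductility demand
    k_eq = k_el * (1.0 + 0.05 * (mu - 1)) / mu         # secant stiffness (assumed bilinear)
    zeta_eq = 0.05 + (2 / np.pi) * (1 - 1 / mu) * 0.5  # hysteretic damping (assumed form)
    T = 2 * np.pi * np.sqrt(m / k_eq)                  # equivalent period
    d_new = Sa(T, zeta_eq) * m / k_eq                  # Sd = Sa / omega^2
    if abs(d_new - d) < 1e-6:
        break
    d = d_new
print(f"converged displacement: {d * 1000:.1f} mm")
```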

  11. THE ‘HYBRID’ TECHNIQUE FOR RISK ANALYSIS OF SOME DISEASES

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on data obtained from a survey recently conducted in Shanghai, this paper presents a hybrid technique for the risk analysis and evaluation of some diseases. After determining the main risk factors of these diseases by analysis of variance, the authors introduce a new concept, the ‘Illness Fuzzy Set’, and use fuzzy comprehensive evaluation to evaluate residents' risk of suffering from a disease. An optimization technique is used to determine the weights w_i in the fuzzy comprehensive evaluation, and a new method, ‘Improved Information Distribution’, is also introduced for the treatment of small-sample problems. It is shown that the results obtained by using the hybrid technique are better than those obtained by using a single fuzzy technique or a single statistical method.
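
    A minimal sketch of the fuzzy comprehensive evaluation step, with hypothetical memberships and fixed weights (the paper determines the weights by an optimization technique), might look like this:

```python
# Minimal sketch of fuzzy comprehensive evaluation: a membership matrix R
# (risk factors x risk grades) is composed with a weight vector W to give
# a grade vector B. All numbers are hypothetical.
import numpy as np

# Rows: risk factors (e.g. age, blood pressure, smoking); columns: risk
# grades (low, medium, high). Entries are memberships of a resident's
# factor values in each grade.
R = np.array([[0.6, 0.3, 0.1],
              [0.2, 0.5, 0.3],
              [0.1, 0.3, 0.6]])
W = np.array([0.5, 0.3, 0.2])   # factor weights (fixed here for illustration)

B = W @ R                       # weighted-average composition M(., +)
print("risk grade memberships:", B / B.sum())
```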

  12. Progressive closure technique in laparostomy and decompressive management of abdomen

    Directory of Open Access Journals (Sweden)

    Edna Delabio Ferraz

    2000-08-01

    Full Text Available A technique for controlling evisceration, employed in cases with a formal indication for laparostomy, is presented. Through the combination of a double temporary prosthesis (polypropylene mesh and polyamide sheet), we perform a "progressive closure", which causes minimal parietal damage and allows delayed primary closure of the abdominal wall without the risk of enteric fistula. The results of the first series of 23 patients (1990-1994) are compared with another group (other laparostomy techniques) from the same hospital. Overall mortality was 39.1% in our series versus 55.9% with the other techniques (p = 0.003), with no difference between the prognostic indices (APACHE II) of the two groups. In all cases in which progressive closure was completed (22 patients), delayed primary closure of the abdominal wall was possible, even after long periods of laparostomy. The use of the technique did not increase the length of hospital stay. Planned reoperation and control of septic foci were more effective, with a mean of 6.8 cavity revisions versus 1.8 in the other group. No fistula and no incisional hernia resulted from the use of the technique. The recent use of the technique in abdominal compartment syndrome has shown great benefit for these critical patients. New concepts regarding laparostomy have brought important changes to its indication criteria and new demands for its technical refinement.

  13. A Comparative Analysis of Techniques for PAPR Reduction of OFDM Signals

    Directory of Open Access Journals (Sweden)

    M. Janjić

    2014-06-01

    Full Text Available In this paper the problem of high Peak-to-Average Power Ratio (PAPR) in Orthogonal Frequency-Division Multiplexing (OFDM) signals is studied. Besides describing three techniques for PAPR reduction, SeLective Mapping (SLM), Partial Transmit Sequence (PTS) and Interleaving, a detailed analysis of the performance of these techniques for various values of the relevant parameters (number of phase sequences, number of interleavers, number of phase factors, number of subblocks, depending on the applied technique) is carried out. Simulation of these techniques is run in Matlab software. Results are presented in the form of Complementary Cumulative Distribution Function (CCDF) curves for the PAPR of 30000 randomly generated OFDM symbols. Simulations are performed for OFDM signals with 32 and 256 subcarriers, oversampled by a factor of 4. A detailed comparison of these techniques is made based on the Matlab simulation results.
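
    A sketch of this kind of experiment (here in Python rather than Matlab) is given below: CCDF points for the PAPR of random OFDM symbols with and without SLM. The oversampling factor of 4 follows the abstract; the subcarrier count, symbol count, and number of phase sequences are reduced placeholders for a quick run.

```python
# Sketch: CCDF of OFDM PAPR, plain vs. SeLective Mapping (SLM).
import numpy as np

rng = np.random.default_rng(1)
N, L, U, n_sym = 64, 4, 8, 3000   # subcarriers, oversampling, SLM sequences, symbols

def papr_db(X):
    """PAPR of one frequency-domain symbol, with L-times oversampling
    via zero-padding in the middle of the spectrum."""
    x = np.fft.ifft(np.concatenate([X[:N // 2],
                                    np.zeros((L - 1) * N),
                                    X[N // 2:]]))
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

qpsk = np.exp(1j * np.pi / 2 * rng.integers(0, 4, size=(n_sym, N)))
plain = np.array([papr_db(X) for X in qpsk])

phases = np.exp(1j * 2 * np.pi * rng.random((U, N)))   # SLM candidate rotations
slm = np.array([min(papr_db(X * p) for p in phases) for X in qpsk])

for papr0 in (6, 8, 10):                               # CCDF = P(PAPR > papr0)
    print(f"P(PAPR > {papr0} dB): plain {np.mean(plain > papr0):.3f}, "
          f"SLM {np.mean(slm > papr0):.3f}")
```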

  14. A new computational technique for the stability analysis of slender rods

    Science.gov (United States)

    Sinha, S. C.; Liu, Tai-Sheng; Senthilnathan, N. R.

    1992-07-01

    A new computational technique for the stability analysis of slender rods with variable cross-sections under general loading conditions is presented. In this approach, the dependent variable and the variable coefficients appearing in the governing equations are expanded in a finite series of Chebyshev polynomials. The main feature of this technique is that the original boundary value problem associated with the differential equation is reduced to an algebraic eigenvalue problem. The proposed technique is applied to study the static buckling of an Euler column and the flutter behavior of a cantilever column subjected to uniformly distributed tangential loading. The numerical results from the suggested technique are found to be extremely accurate when compared to other techniques available in the literature. It is shown that this approach can also be employed in a symbolic form. The merits of the present method in comparison to standard solution procedures like the finite difference and Galerkin methods are discussed.
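
    To illustrate the reduction of a boundary value problem to an algebraic eigenvalue problem, the sketch below solves the Euler buckling problem y'' + λy = 0 with pinned ends by Chebyshev collocation (the standard differentiation-matrix construction, cf. Trefethen's cheb.m); this is an assumed stand-in for the authors' Chebyshev-expansion formulation, not their code.

```python
# Chebyshev collocation sketch: -y'' = lambda * y on [-1, 1], y(+-1) = 0,
# reduced to an algebraic eigenvalue problem.
import numpy as np

def cheb(N):
    """Chebyshev differentiation matrix D and nodes x (Trefethen)."""
    x = np.cos(np.pi * np.arange(N + 1) / N)
    c = np.hstack([2.0, np.ones(N - 1), 2.0]) * (-1.0) ** np.arange(N + 1)
    X = np.tile(x, (N + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(N + 1))
    D -= np.diag(D.sum(axis=1))        # diagonal via negative row sums
    return D, x

N = 24
D, x = cheb(N)
D2 = (D @ D)[1:-1, 1:-1]               # impose y = 0 at both ends by deletion
lam = np.sort(np.linalg.eigvals(-D2).real)
print(lam[:3])                          # exact values: (k*pi/2)^2 = 2.467, 9.870, 22.207
```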

  15. Fault detection in digital and analog circuits using an i(DD) temporal analysis technique

    Science.gov (United States)

    Beasley, J.; Magallanes, D.; Vridhagiri, A.; Ramamurthy, Hema; Deyong, Mark

    1993-01-01

    An i(DD) temporal analysis technique, which is used to detect defects (faults) and fabrication variations in both digital and analog ICs by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents, is presented. A simple bias voltage is required for all the inputs to excite the defects. Data from hardware tests supporting this technique are presented.
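
    Conceptually, the pass/fail decision reduces to comparing a device's transient rail-current signature against a known-good reference; the sketch below uses synthetic waveforms and an assumed RMS-residual threshold purely for illustration.

```python
# Conceptual sketch of an i(DD) temporal-analysis decision rule: compare
# the DUT's transient rail current against a fault-free ("golden")
# signature. Waveforms are synthetic placeholders, not measured data.
import numpy as np

t = np.linspace(0, 1e-6, 500)                     # 1 us capture window
golden = np.exp(-t / 1e-7)                        # fault-free transient (synthetic)
dut = golden + 0.05 * np.sin(2 * np.pi * 5e6 * t) # DUT with added ringing defect

residual = np.sqrt(np.mean((dut - golden) ** 2))  # RMS deviation from signature
THRESHOLD = 0.01                                  # assumed pass/fail limit
print("FAIL" if residual > THRESHOLD else "PASS", f"(residual = {residual:.4f})")
```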

  16. Virtual Reality Analysis in Tennis Serve Technique Stability for Junior Masters

    OpenAIRE

    Du Chuan Jia; Zhou Ji He; Wang Shuai

    2016-01-01

    This study presents a virtual reality analysis of tennis serve technique, constructed based on two male professional players from the 2016 Chengdu ITF Junior Masters: Casper RUUD and Miomir Kecmanovic. The purpose of the study is to find an effective way of building serve stability for junior players. This study will provide considerable data for coaches and players for improving the stability and quality of the serve technique. Results of the study show 5 main points of view: (1) RUU...

  17. THE RESEARCH TECHNIQUES FOR ANALYSIS OF MECHANICAL AND TRIBOLOGICAL PROPERTIES OF COATING-SUBSTRATE SYSTEMS

    OpenAIRE

    Kinga CHRONOWSKA-PRZYWARA; Marcin KOT; Sławomir ZIMOWSKI

    2014-01-01

    The article presents research techniques for the analysis of both the mechanical and tribological properties of thin coatings applied to highly loaded machine elements. At the Institute of Machine Design and Exploitation, AGH University of Science and Technology, second-cycle Mechanical Engineering students study tribology in laboratory classes. Students learn techniques for the mechanical and tribological testing of thin, hard coatings deposited by PVD and CVD technologies. The prog...

  18. Analysis and simulation of brain signal data by EEG signal processing technique using MATLAB

    OpenAIRE

    Sasikumar Gurumurthy; Vudi Sai Mahit; Rittwika Ghosh

    2013-01-01

    EEG is a brain signal processing technique that allows one to gain an understanding of the complex inner mechanisms of the brain; abnormal brain waves have been shown to be associated with particular brain disorders. The analysis of brain waves plays an important role in the diagnosis of different brain disorders. MATLAB provides an interactive graphical user interface (GUI) allowing users to flexibly and interactively process their high-density EEG datasets and other brain signal data with different techniques ...
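
    As an illustration of the kind of processing the article performs in MATLAB, the following Python sketch extracts conventional band powers from a synthetic EEG channel via Welch's method; the signal, sampling rate, and band limits are placeholders, not the article's data.

```python
# Sketch: band-power extraction from a synthetic EEG channel.
import numpy as np
from scipy.signal import welch

fs = 256                                    # sampling rate [Hz]
t = np.arange(0, 10, 1 / fs)
eeg = (50 * np.sin(2 * np.pi * 10 * t)      # dominant alpha-band component
       + 20 * np.random.default_rng(0).standard_normal(t.size))

f, psd = welch(eeg, fs=fs, nperseg=2 * fs)  # power spectral density
bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
for name, (lo, hi) in bands.items():
    mask = (f >= lo) & (f < hi)
    power = np.trapz(psd[mask], f[mask])    # integrate PSD over the band
    print(f"{name:>5}: {power:8.1f} uV^2")
```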

  19. [The applying and foreground of quantifying DNA content by image analysis technique in determining postmortem interval].

    Science.gov (United States)

    Wang, Cheng-yi; Liu, Liang

    2002-02-01

    Image Analysis Technique (IAT) was developed in the 1950s; it quantifies changes in every part of an image by sampling, processing, quantifying, computing, and analyzing the image information. It has now become a standard quantification technique in biological and medical research. In the present paper, we briefly review the principles of quantifying DNA content by IAT, the pattern of DNA degradation in the nucleus, and the prospects of this method for determining the postmortem interval (PMI) in forensic pathology.
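
    The quantification step can be illustrated as follows: the integrated optical density (IOD) of each segmented nucleus is taken as a proxy for relative DNA content. The image below is synthetic and the segmentation threshold is an assumption; this is not the authors' protocol.

```python
# Illustrative sketch: relative DNA content per nucleus as integrated
# optical density (IOD) of a stained 8-bit grayscale image.
import numpy as np
from scipy import ndimage

img = np.full((128, 128), 240, dtype=float)          # bright background
yy, xx = np.mgrid[:128, :128]
for cy, cx in [(32, 40), (90, 80)]:                   # two synthetic nuclei
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 100] = 80   # darker stained regions

od = -np.log10(np.clip(img, 1, 255) / 255.0)          # optical density per pixel
nuclei, n = ndimage.label(od > 0.2)                   # segment stained nuclei
for i in range(1, n + 1):
    iod = od[nuclei == i].sum()                       # integrated optical density
    print(f"nucleus {i}: IOD = {iod:.1f}")
```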

  20. Application of slip-band visualization technique to tensile analysis of laser-welded aluminum alloy

    OpenAIRE

    Muchiar, Ir.; Yoshida, S.; Widiastuti, R.; Kusnovo, A.; Takahashi, K; Sato, S.

    1996-01-01

    Recently, we have developed a new optical interferometric technique capable of visualizing slip bands occurring in a deforming solid-state object. In this work, we applied this technique to a tensile analysis of laser-welded aluminum plate samples and successfully revealed stress concentrations that show strong relationships with the tensile strength and the fracture mechanism. We believe that this method is a new, convenient way to analyze the deformation characteristics of welded objects and ...