WorldWideScience

Sample records for analysis techniques progress

  1. Research Progress on Pesticide Residue Analysis Techniques in Agro-products

    Directory of Open Access Journals (Sweden)

    HE Ze-ying

    2016-07-01

    There are constant occurrences of acute pesticide poisoning among consumers and of pesticide residue violations in agro-product import/export trade. Pesticide residue analysis is an important way to protect food safety and the interests of import/export enterprises, and analysis techniques have developed rapidly in recent years. This review discusses the research progress of the past five years in sample preparation and instrumental determination, covering the application, modification and development of the QuEChERS method in sample preparation and the application of tandem mass spectrometry and high-resolution mass spectrometry. Implications for the future of the field are also discussed.

  2. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1978-July 31, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Finston, H. L.; Williams, E. T.

    1979-07-01

    Studies of homogeneous liquid-liquid extraction have been extended to include (1) a detailed determination of the phase diagram of the propylene carbonate-water system, (2) the extraction of a large variety of both monodentate and bidentate iron complexes, (3) the solvent extraction characteristics of analogues of propylene carbonate, (4) the behavior under pressure of the propylene carbonate-water system, and (5) the extraction behavior of alkaline earth-TTA chelates. One consequence of these studies was the observation that the addition of ethanol to propylene carbonate-water or to isobutylene carbonate-water yields a single homogeneous phase; subsequent evaporation of the ethanol restores the two immiscible phases. Fast neutron activation analysis has been attempted for the heavy elements Pb, Bi, and Tl at the Brookhaven HFBR (in- or near-core position) and at the Brookhaven CLIF facility. The latter appears more promising, and we have initiated a collaborative program to use the CLIF facility. A milking system that can provide ca. 16 μCi of carrier-free ²¹²Pb was developed for use in an isotope dilution technique for lead. Collaboration will be proposed with laboratories already determining trace lead by flameless atomic absorption, or by concentration by electrodeposition into a hanging drop followed by anodic stripping. The proton-induced X-ray emission (PIXE) system has undergone marked improvement with the acquisition of a new high-resolution Si(Li) detector and a new multichannel analyzer system. Various techniques have been explored to dissolve and prepare samples for PIXE analysis and for verification by atomic absorption analysis.

  3. Progress of neutron induced prompt gamma analysis technique in 1988–2003

    Institute of Scientific and Technical Information of China (English)

    JING Shi-Wei; LIU Yu-Ren; CHI Yan-Tao; TIAN Yu-Bing; CAO Xi-Zheng; ZHAO Xin-Hui; REN Wan-Bin; LIU Lin-Mao

    2004-01-01

    This paper describes new developments in neutron induced prompt gamma-ray analysis (NIPGA) technology from 1988 to 2003. The pulsed fast-thermal neutron activation analysis method, which jointly utilizes the inelastic-scattering and capture reactions, was employed to measure elemental contents more efficiently. The lifetime of the neutron generator now exceeds 10 000 h, and the performance of detectors and multichannel analyzers has reached a high level. At the same time, the Monte Carlo library least-squares (MCLLS) method was used to solve the nonlinearity problem in NIPGA.
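    The library least-squares idea mentioned above can be sketched as fitting a measured gamma spectrum as a linear combination of single-element library spectra. The element names and channel counts below are illustrative placeholders, not data from the paper:

```python
import numpy as np

# Hypothetical single-element library spectra: counts per unit content
# in each of 5 energy channels (made-up numbers for illustration).
library = {
    "H":  np.array([0.9, 0.1, 0.0, 0.0, 0.0]),
    "C":  np.array([0.1, 0.8, 0.3, 0.0, 0.0]),
    "Fe": np.array([0.0, 0.2, 0.7, 0.5, 0.1]),
}
true_content = {"H": 2.0, "C": 1.5, "Fe": 0.5}

# Simulated measured spectrum: a linear mix of the library spectra.
A = np.column_stack([library[e] for e in library])
x_true = np.array([true_content[e] for e in library])
measured = A @ x_true

# Library least-squares: solve for the elemental contents.
x_fit, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(dict(zip(library, x_fit)))
```

    In practice the measured spectrum also contains counting noise, so the fitted contents carry uncertainties rather than reproducing the true values exactly.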

  4. Progress in automation, robotics and measuring techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2015-01-01

    This book presents recent progress in control, automation, robotics, and measuring techniques. It includes contributions from top experts in these fields, focused on both theory and industrial practice. Each chapter presents a deep analysis of a specific technical problem, in general followed by a numerical analysis, simulation, and results of an implementation for the solution of a real-world problem. The theoretical results, practical solutions and guidelines presented will be useful both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  6. [Idiopathic Progressive Subglottic Stenosis: Surgical Techniques].

    Science.gov (United States)

    Hoetzenecker, K; Schweiger, T; Klepetko, W

    2016-09-01

    Idiopathic subglottic stenosis is a disease characterized by slow, progressive scarring and constriction of the subglottic airway. It almost always occurs in females between the 3rd and 5th decades of life. Symptoms are frequently misinterpreted as asthma, and patients are referred for endoscopic evaluation only when asthma medications fail to alleviate their symptoms. Treatment options can be divided into endoscopic and open surgical techniques. Microlaryngoscopic scar reduction by laser followed by balloon dilation usually delivers good short-term results; however, the majority of patients will experience restenosis within a short period of time. Open surgical correction techniques are based on a complete removal of the affected airway segment, which must be combined with various extended resection techniques in patients with advanced stenosis. Depending on the extent and severity of the stenosis, the following surgical techniques are required: standard cricotracheal resection (Grillo's technique), cricoplasty with dorsal and lateral mucosaplasty, or a combination of resection and enlargement techniques using rib cartilage grafts. In experienced centres, success rates of over 95% are reported, with good functional outcomes of voice and deglutition.

  7. Progress in application of CFD techniques

    Institute of Scientific and Technical Information of China (English)

    CHEN ZuoBin; JIANG Xiong; ZHOU Zhu; XIAO HanShan; HUANG Yong; MOU Bin; XIAO ZhongYun; LIU Gang; WANG YunTao

    2008-01-01

    Computational Fluid Dynamics (CFD) is an important branch of fluid mechanics and will continue to play a great role in the design of aerospace vehicles and the exploration of new concept vehicles and new aerodynamic technology. This paper presents the recent progress of CFD at CARDC from the point of view of engineering application, including software integration, grid techniques, convergence acceleration and unsteady flow computation, and gives some engineering application examples of CFD at CARDC.

  8. Granulation techniques and technologies: recent progresses.

    Science.gov (United States)

    Shanmugam, Srinivasan

    2015-01-01

    Granulation, the process of particle enlargement by agglomeration, is one of the most significant unit operations in the production of pharmaceutical dosage forms, mostly tablets and capsules. The granulation process transforms fine powders into free-flowing, dust-free granules that are easy to compress. Nevertheless, granulation poses numerous challenges due to the high quality requirements of the formed granules in terms of content uniformity and physicochemical properties such as granule size, bulk density, porosity, hardness, moisture and compressibility, together with the physical and chemical stability of the drug. Granulation processes can be divided into two types: wet granulation, which utilizes a liquid, and dry granulation, which requires no liquid. Selecting the type of process requires thorough knowledge of the physicochemical properties of the drug and excipients and of the required flow and release properties, to name a few. Among currently available technologies, spray drying, roller compaction, high-shear mixing, and fluid bed granulation are worthy of note. Like any other scientific field, pharmaceutical granulation technology also continues to change, and the arrival of novel and innovative technologies is inevitable. This review focuses on recent progress in granulation techniques and technologies such as pneumatic dry granulation, reverse wet granulation, steam granulation, moisture-activated dry granulation, thermal adhesion granulation, freeze granulation, and foamed binder or foam granulation, giving a short description of each development along with its significance and limitations.

  9. [Progress in transgenic fish techniques and application].

    Science.gov (United States)

    Ye, Xing; Tian, Yuan-Yuan; Gao, Feng-Ying

    2011-05-01

    Transgenic techniques provide a new way for fish breeding. Stable lines of growth hormone (GH) gene transgenic carp, salmon and tilapia, as well as fluorescent protein gene transgenic zebrafish and white cloud mountain minnow, have been produced. The fast-growth characteristic of GH transgenic fish will be of great importance in promoting aquaculture production and economic efficiency. This paper summarizes progress in transgenic fish research and ecological assessment. Microinjection is still the most commonly used method, but it often results in multi-site and multi-copy integration. Co-injection of transposon or meganuclease can greatly improve the efficiency of gene transfer and integration. "All-fish" genes or "auto genes" should be considered for producing transgenic fish, in order to eliminate misgivings about food safety and to benefit expression of the transferred gene. Environmental risk is the biggest obstacle to commercial application of transgenic fish. Data indicate that transgenic fish have inferior fitness compared with traditional domestic fish. However, because of genotype-by-environment effects, it is difficult to extrapolate from simple phenotypes to the complex ecological interactions that occur in nature based on the ecological consequences of transgenic fish determined in the laboratory. It is critical to establish highly naturalized environments for acquiring reliable data that can be used to evaluate environmental risk. Efficacious physical and biological containment strategies remain crucial approaches to ensure the safe application of transgenic fish technology.

  10. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis, namely the Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated with communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  11. Progress involving new techniques for liposome preparation

    Directory of Open Access Journals (Sweden)

    Zhenjun Huang

    2014-08-01

    This article presents a review of new techniques being used for the preparation of liposomes; a total of 28 publications were examined. In addition to the theories, characteristics and problems associated with traditional methods, the advantages and drawbacks of the latest techniques are reviewed. In the light of developments in many relevant areas, a variety of new techniques are being used for liposome preparation, and each of these new techniques has particular advantages over conventional preparation methods. However, there are still some problems associated with these new techniques that could hinder their application, and further improvements are needed. Generally speaking, the introduction of these latest techniques has made liposome preparation an improved procedure, promoting not only advances in liposome research but also methods for production on an industrial scale.

  12. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of studying the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper is a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  14. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria are considered in the inspection: physical condition of the building system (PC), effect on asset (EA), effect on occupants (EO) and maintenance cost (MC). The building was divided into nine systems regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed on all building systems against each criterion. The defect severity score of each building system was identified and then multiplied by the weight of the criterion, and the final hierarchy was derived. The final ranking indicates that the electrical system was the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning; however, the results of this study indicate that it could also be used in prioritizing building systems for maintenance planning.
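    The weighted scoring step described above (defect severity multiplied by criterion weight and summed into a risk value) can be sketched as follows. The weights, severity scores and system names are illustrative placeholders, not the paper's data:

```python
# Weighted-sum prioritization of building systems, in the spirit of the
# multi-criteria approach described in the abstract. Weights sum to 1.
weights = {"PC": 0.4, "EA": 0.2, "EO": 0.3, "MC": 0.1}

# Severity of observed defects per system, per criterion (0..1 scale).
systems = {
    "electrical": {"PC": 0.8, "EA": 0.7, "EO": 0.9, "MC": 0.6},
    "ceiling":    {"PC": 0.3, "EA": 0.2, "EO": 0.2, "MC": 0.4},
    "roof":       {"PC": 0.6, "EA": 0.5, "EO": 0.4, "MC": 0.7},
}

def risk(scores: dict) -> float:
    """Weighted sum of criterion severity scores for one system."""
    return sum(weights[c] * s for c, s in scores.items())

# Rank systems from most to least critical.
ranking = sorted(systems, key=lambda s: risk(systems[s]), reverse=True)
print(ranking)  # ['electrical', 'roof', 'ceiling']
```

    In a full AHP-style analysis the weights themselves would come from pairwise comparisons of the criteria rather than being assigned directly.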

  15. Research Progress on Technique of Frozen Embryo Transfer in Sheep

    Institute of Scientific and Technical Information of China (English)

    SHE Qiu-sheng; HU Jian-ye; LOU Peng-yan; TAO Jing; XIE Zhao-hui

    2011-01-01

    The paper introduces research progress on the technique of frozen embryo transfer in sheep, covering the selection of donors and recipients, superovulation, estrus synchronization, embryo cryopreservation and embryo transplantation. Frozen embryo transfer in sheep is another breakthrough in high-quality sheep breeding; although the technique is still in its infancy in China, it will be comprehensively popularized in the future.

  16. The Progress on Laser Surface Modification Techniques of Titanium Alloy

    Institute of Scientific and Technical Information of China (English)

    LIANG Cheng; PAN Lin; Al Ding-fei; TAO Xi-qi; XIA Chun-huai; SONG Yan

    2004-01-01

    Titanium alloys are widely used in aviation, national defence, automobile, medicine and other fields because of their low density, corrosion resistance, fatigue resistance and other advantages. However, because titanium alloys have high friction coefficients, weak wear resistance, poor high-temperature oxidation resistance and low biocompatibility, their applications are restricted. Laser surface modification techniques can significantly improve the surface properties of titanium alloys. This paper reviews progress on laser surface modification techniques for titanium alloys.

  17. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, all of which are of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  18. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of it, can also be used to review procedures in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; use of a volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics: the analyst is relegated to placing standards and samples on a tray, a robotic arm delivers a sample to the analysis center, and a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of the more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to ensure quantitative transfer, are often an automated part of an instrumental process that must be understood by the
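    The complete acid-base titration included on the tape ends in a stoichiometric calculation like the one below; the concentrations and volumes here are illustrative examples, not values from the video:

```python
# Compute the molarity of an analyte from titration data for a simple
# 1:1 neutralization such as HCl + NaOH -> NaCl + H2O.
def analyte_molarity(titrant_M: float, titrant_mL: float,
                     analyte_mL: float, mole_ratio: float = 1.0) -> float:
    """M_analyte = (M_titrant * V_titrant / mole_ratio) / V_analyte."""
    moles_titrant = titrant_M * titrant_mL / 1000.0   # mol delivered
    moles_analyte = moles_titrant / mole_ratio        # mol neutralized
    return moles_analyte / (analyte_mL / 1000.0)      # mol / L

# Example: 25.00 mL of HCl neutralized by 31.25 mL of 0.1000 M NaOH.
M = analyte_molarity(0.1000, 31.25, 25.00)
print(round(M, 4))  # 0.125
```

    The `mole_ratio` parameter covers titrations where the balanced equation consumes more than one mole of titrant per mole of analyte (e.g. 2 for H2SO4 titrated with NaOH).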

  19. Progress of sample preparation techniques in gas chromatographic analysis

    Institute of Scientific and Technical Information of China (English)

    严矿林; 林丽琼; 郑夏汐; 肖小华; 曹玉娟

    2013-01-01

    Gas chromatography (GC) is one of the most widely used analytical techniques. Sample preparation is very important in GC analysis of complex matrices, because it is often tedious and time-consuming and can introduce errors, making it a key constraint on analytical efficiency and accuracy. This paper reviews the progress from 2009 to 2013 of the main sample preparation techniques in gas chromatographic analysis, including purge and trap, solid-phase extraction, solid-phase microextraction, liquid-phase microextraction, and field-assisted extraction techniques such as microwave-assisted extraction and ultrasonic-assisted extraction.

  20. Triangulation of Data Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Lauri, M

    2011-10-01

    In psychology, as in other disciplines, the concepts of validity and reliability are considered essential to give an accurate interpretation of results. While in quantitative research the idea is well established, in qualitative research validity and reliability take on a different dimension. Researchers like Miles and Huberman (1994) and Silverman (2000, 2001) have shown how these issues are addressed in qualitative research. In this paper I propose that the same corpus of data, in this case the transcripts of focus group discussions, can be analysed using more than one data analysis technique. I refer to this idea as 'triangulation of data analysis techniques' and argue that such triangulation increases the reliability of the results. If the results obtained through a particular data analysis technique, for example thematic analysis, are congruent with the results obtained by analysing the same transcripts using a different technique, for example correspondence analysis, it is reasonable to argue that the analysis and interpretation of the data are valid.

  1. Progress in Pre-treatment Techniques for the Analysis of Volatile Organic Compounds in Environmental Water

    Institute of Scientific and Technical Information of China (English)

    马康; 张金娜; 何雅娟; 弓爱君

    2011-01-01

    There are various types of volatile organic compounds (VOCs) in environmental water. These compounds have low threshold concentrations and usually exist in water at the ng/L–μg/L level. This paper reviews the progress since 2003 of pretreatment techniques for the analysis of VOCs. Eight sample preparation methods are introduced: headspace single-drop microextraction (HS-SDME), hollow-fiber liquid-phase microextraction (HF-LPME), dispersive liquid-liquid microextraction (DLLME), headspace solid-phase microextraction (HS-SPME), stir bar sorptive extraction (SBSE), static headspace (HS), purge and trap (P&T), and inside-needle capillary adsorption trap (INCAT). The advantages and disadvantages of each technique are presented, and the prospects for the development of VOC analysis methods are discussed.

  2. Foreign Language Analysis and Recognition (FLARE) Progress

    Science.gov (United States)

    2015-02-01

    AFRL-RH-WP-TR-2015-0007. Authors: Brian M. Ore, Stephen A. Thorn, David M... Report period: October 2012 – 30 November 2014. Contract number: FA8650... The initial phase allows for an image to be uploaded into the Haystack system, but the second stage is prompted by a command...

  3. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others who contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment in which analysis is used to guide design decisions. Computer-aided design (CAD) models built with PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model (DSM), which is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process, and the DSM is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort, and the turnaround time for analysis results needs to decrease for analysis to have an impact on overall product development. This effort can be reduced immensely through simple Pro/ENGINEER modeling techniques that come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of creating the ASM from the DSM.

  4. Application of Electromigration Techniques in Environmental Analysis

    Science.gov (United States)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentrations of pollutants in the environment, together with the complexity of sample matrices, place a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made, widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, and improving the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the domain of capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is placed on pre-capillary and on-capillary chromatography- and electrophoresis-based concentration of analytes and on detection improvement.

  5. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
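    A predictive model of the kind described above can be as simple as an ordinary least-squares fit of runtime against one configuration variable, extrapolated to a larger scale. The timing numbers below are synthetic stand-ins, not measurements from the report:

```python
import numpy as np

# Synthetic measured runtimes for four problem sizes.
sizes = np.array([1e5, 2e5, 4e5, 8e5])
times = np.array([0.11, 0.21, 0.40, 0.82])  # seconds

# Fit the linear model t = a * n + b by ordinary least squares.
A = np.column_stack([sizes, np.ones_like(sizes)])
(a, b), *_ = np.linalg.lstsq(A, times, rcond=None)

# Extrapolate to a size twice as large as any measured run.
predicted = a * 1.6e6 + b
print(f"predicted time at n=1.6e6: {predicted:.2f} s")
```

    Real HPC performance models are rarely linear in a single variable; the point here is only the workflow of fitting measurements and predicting at scale.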

  6. Key Techniques and Application Progress of Molecular Pharmacognosy

    Institute of Scientific and Technical Information of China (English)

    XIAO Xue-feng; HU Jing; XU Hai-yu; GAO Wen-yuan; ZHANG Tie-jun; LIU Chang-xiao

    2011-01-01

    At the boundary between pharmacognosy and molecular biology, molecular pharmacognosy has developed as a new borderline discipline. This paper reviews the methods, applications, and prospects of molecular pharmacognosy. DNA markers are one class of genetic markers, and several molecular marker methods have been successfully used for genetic diversity identification and the development of new medicinal resources. Recombinant DNA technology provides a powerful tool that enables scientists to engineer DNA sequences. Gene chip techniques can be used to determine gene expression profiles, analyse polymorphisms, construct genomic libraries, and perform mapping analysis and sequencing by hybridization. Using the methods and theory of molecular biology and pharmacognosy, molecular pharmacognosy represents an extremely promising branch of pharmacognosy and focuses on the study of the systemic growth of medicinal plants, the identification and evaluation of germplasm resources, plant metabolomics, and the production of active compounds. Furthermore, great breakthroughs in molecular pharmacognosy can be anticipated in DNA fingerprint analysis, cultivar improvement, DNA identification, and a global DNA barcoding system in the future.

  7. Hollow Rotor Progressing Cavity Pump Technique for Oil Production

    Institute of Scientific and Technical Information of China (English)

    Cao Gang

    2002-01-01

    Features of the Hollow Rotor Progressing Cavity Pump (HRPCP): (1) It keeps the path open for PCP well flushing, so that producing wells can be cleaned quickly without shutting them off; heat loss is low while efficiency is high.

  8. Prefractionation techniques in proteome analysis.

    Science.gov (United States)

    Righetti, Pier Giorgio; Castagna, Annalisa; Herbert, Ben; Reymond, Frederic; Rossier, Joël S

    2003-08-01

    The present review deals with a number of prefractionation protocols in preparation for two-dimensional map analysis, both in the field of chromatography and in the field of electrophoresis. In the first case, Fountoulakis' group has reported just about every chromatographic procedure useful as a prefractionation step, including affinity, ion-exchange, and reversed-phase resins. As a result of the various enrichment steps, several hundred new species, previously undetected in unfractionated samples, could be revealed for the first time. Electrophoretic prefractionation protocols include all those electrokinetic methodologies which are performed in free solution, essentially all relying on isoelectric focusing steps. The devices reviewed here include multichamber apparatus, such as the multicompartment electrolyzer with Immobiline membranes, Off-Gel electrophoresis in a multicup device, and the Rotofor, an instrument also based on a multichamber system but exploiting the conventional technique of carrier-ampholyte focusing. Other instruments of interest are the Octopus, a continuous-flow device for isoelectric focusing in an upward-flowing liquid curtain, and the Gradiflow, where different pI cuts are obtained by a multistep passage through two compartments buffered at different pH values. It is felt that this panoply of methods could offer a strong step forward in "mining below the tip of the iceberg" for detecting the "unseen proteome".

  9. Progress toward the analysis of complex propulsion installation flow phenomenon

    Science.gov (United States)

    Kern, P. R. A.; Hopcroft, R. G.

    1983-01-01

    A trend toward replacement of parametric model testing with parametric analysis for the design of aircraft is driven by the rapidly escalating cost of wind tunnel testing, the increasing availability of large fast computers, and powerful numerical flow algorithms. In connection with the complex flow phenomena characteristic of propulsion installations, it is now necessary to employ both parametric analysis and testing for design procedures. Powerful flow analysis techniques are available to predict local flow phenomena. However, the employment of these techniques is very expensive. It is, therefore, necessary to link these analyses with less powerful and less expensive procedures for an accurate analysis of propulsion installation flowfields. However, the interfacing and coupling processes needed are not available. The present investigation is concerned with progress made regarding the development of suitable linking methods. Attention is given to methods of analysis for predicting the flow around a nacelle coupled to a highly swept wing.

  10. The Analysis of Thematic Progression Patterns of English Advertisement

    Institute of Scientific and Technical Information of China (English)

    徐倩; 郭鸿雁

    2014-01-01

Thematic Progression patterns are the principal basis for the analysis of English advertisements, and they have attracted many experts in this field. Thematic Progression plays a very important role in the creation, development and establishment of English advertisements. This paper introduces four main types of Thematic Progression pattern and analyses English advertisements from a Thematic Progression perspective.

  11. Progress in the NNPDF global analysis

    CERN Document Server

    Deans, Christopher S

    2013-01-01

We report on recent progress in the NNPDF framework of global PDF analysis. The NNPDF2.3 set is the first and only available PDF set which includes LHC data. A recent benchmark comparison of NNPDF2.3 and all other modern NNLO PDF sets with LHC data was performed. We have also studied theoretical uncertainties due to heavy quark renormalization schemes, higher twists and deuterium corrections in PDFs. Finally, we report on the release of positive-definite PDF sets, based on the NNPDF2.3 analysis, specially suited for use in Monte Carlo event generators.
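The Monte Carlo replica approach mentioned above can be sketched in a few lines: in the NNPDF prescription the central prediction is the mean over replicas and the PDF uncertainty is the standard deviation over replicas. The ensemble below is an invented toy, not actual NNPDF2.3 values.

```python
import random
import statistics

def replica_stats(replicas):
    """Central value (mean) and 1-sigma uncertainty (standard deviation)
    over a Monte Carlo replica ensemble, as in the NNPDF prescription."""
    return statistics.fmean(replicas), statistics.stdev(replicas)

# Invented toy ensemble of 100 replica predictions for some observable.
rng = random.Random(0)
replicas = [1.0 + rng.gauss(0.0, 0.05) for _ in range(100)]
central, sigma = replica_stats(replicas)
```

In practice the replicas would be predictions computed with each member of the PDF set; the bookkeeping is the same.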

  12. Recent Progress in Electrical Insulation Techniques for HTS Power Apparatus

    Science.gov (United States)

    Hayakawa, Naoki; Kojima, Hiroki; Hanai, Masahiro; Okubo, Hitoshi

    This paper describes the electrical insulation techniques at cryogenic temperatures, i.e. Cryodielectrics, for HTS power apparatus, e.g. HTS power transmission cables, transformers, fault current limiters and SMES. Breakdown and partial discharge characteristics are discussed for different electrical insulation configurations of LN2, sub-cooled LN2, solid, vacuum and their composite insulation systems. Dynamic and static insulation performances with and without taking account of quench in HTS materials are also introduced.

  13. Progressive Damage Analysis of Bonded Composite Joints

    Science.gov (United States)

    Leone, Frank A., Jr.; Girolamo, Donato; Davila, Carlos G.

    2012-01-01

The present work is related to the development and application of progressive damage modeling techniques to bonded joint technology. The joint designs studied in this work include a conventional composite splice joint and a NASA-patented durable redundant joint. Both designs involve honeycomb sandwich structures with carbon/epoxy facesheets joined using adhesively bonded doublers. Progressive damage modeling allows for the prediction of the initiation and evolution of damage within a structure. For structures that include multiple material systems, such as the joint designs under consideration, the number of potential failure mechanisms that must be accounted for drastically increases the complexity of the analyses. Potential failure mechanisms include fiber fracture, intraply matrix cracking, delamination, core crushing, adhesive failure, and their interactions. The bonded joints were modeled using highly parametric, explicitly solved finite element models, with damage modeling implemented via custom user-written subroutines. Each ply was discretely meshed using three-dimensional solid elements. Layers of cohesive elements were included between each ply to account for the possibility of delaminations and were used to model the adhesive layers forming the joint. Good correlation with experimental results was achieved both in terms of load-displacement history and the predicted failure mechanism(s).
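Cohesive elements of the kind mentioned above are governed by a traction-separation law; a common choice (not necessarily the one used in these NASA analyses) is the bilinear law sketched below, with hypothetical stiffness and separation parameters.

```python
def cohesive_traction(delta, k0, delta0, delta_f):
    """Bilinear traction-separation law: linear elastic up to the
    damage-onset separation delta0, then linear softening to zero
    traction at the final separation delta_f.  Parameters are
    hypothetical, not taken from the joint models discussed above."""
    if delta <= delta0:
        return k0 * delta                 # undamaged elastic branch
    if delta >= delta_f:
        return 0.0                        # fully debonded
    # softening branch: traction decays linearly to zero at delta_f
    return k0 * delta0 * (delta_f - delta) / (delta_f - delta0)
```

The area under this curve is the fracture toughness of the interface, which is how such laws are usually calibrated.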

  14. Failure Analysis Seminar: Techniques and Teams. Seminar Notes. Volume I.

    Science.gov (United States)

    1981-01-01

OCR fragments from the seminar notes: "Failure Analysis Strategy" by Augustine E. Magistro. Introduction: a primary task of management and systems ... by Augustine Magistro, Picatinny Arsenal, and Lawrence R. Seggel, U.S. Army Missile Command. The report is available from the National Technical ... to emphasize techniques; identification and improvement of your leadership styles. Biographic sketches: A. E. "Gus" Magistro, Systems Evaluation.

15. Research Progress of the Analysis on Free Radicals in Cigarette Smoke Based on ESR Techniques

    Institute of Scientific and Technical Information of China (English)

    宋风忠; 谷令彪; 朱晓蕊

    2011-01-01

Electron spin resonance (ESR), also called electron paramagnetic resonance (EPR), is the most direct and effective method for detecting free radicals. Free radicals are among the major harmful substances in cigarette smoke: they attack cellular components directly or indirectly, oxidize human tissues and cells, damage membrane lipids and proteins, and can cause cancer and other diseases. This paper reviews progress in the combined application of ESR and spin-trapping techniques to the detection of harmful free radicals in cigarette smoke.

  16. Environmental Contaminants in Hospital Settings and Progress in Disinfecting Techniques

    Directory of Open Access Journals (Sweden)

    Gabriele Messina

    2013-01-01

Full Text Available Medical devices, such as stethoscopes, and other objects found in hospitals, such as computer keyboards and telephone handsets, may be reservoirs of bacteria for healthcare-associated infections. In this cross-over study involving an Italian teaching hospital we evaluated microbial contamination (total bacterial count (TBC) at 36°C/22°C, Staphylococcus spp., moulds, Enterococcus spp., Pseudomonas spp., E. coli, total coliform bacteria, Acinetobacter spp., and Clostridium difficile) of these devices before and after cleaning, differences in contamination between hospital units, and differences between stethoscopes and keyboards plus handsets. We analysed 37 telephone handsets, 27 computer keyboards, and 35 stethoscopes, comparing their contamination in four hospital units. Wilcoxon signed-rank and Mann-Whitney tests were used. Before cleaning, many samples were positive for Staphylococcus spp. and coliforms. After cleaning, CFUs decreased to zero in most comparisons. The first aid unit had the highest and intensive care the lowest contamination (P<0.01). Keyboards and handsets had higher TBC at 22°C (P=0.046) and mould contamination (P=0.002) than stethoscopes. Healthcare professionals should disinfect stethoscopes and other possible sources of bacterial healthcare-associated infections. The cleaning technique used was effective in reducing bacterial contamination. Units with high patient turnover, such as first aid, should practise stricter hygiene.
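The Mann-Whitney test used above for between-unit comparisons is rank-based; a minimal sketch of its U statistic (mid-rank handling of ties, no normal approximation or p-value) is:

```python
def mann_whitney_u(a, b):
    """U statistic for sample a versus b: the number of pairs (x, y)
    with x > y, counting ties as half.  Illustrates only the rank
    comparison behind the test; real analyses would also compute a
    p-value (e.g. via the normal approximation)."""
    greater = sum(1 for x in a for y in b if x > y)
    ties = sum(1 for x in a for y in b if x == y)
    return greater + 0.5 * ties
```

Note that U for (a, b) plus U for (b, a) always equals len(a) * len(b), a handy sanity check.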

  17. Recent Progress in Synthesis Techniques of Microstrip Bandpass Filter

    Directory of Open Access Journals (Sweden)

    Navita Singh

    2012-03-01

Full Text Available End-coupled resonator bandpass filters built in microstrip are investigated. The admittance-inverter parameters of the coupling gaps between resonant sections are deduced from experiment, and bandpass filter design rules are developed. This allows easy filter synthesis from "prototype" low-pass designs. Design techniques formerly employed in the realization of waveguide and coaxial filters have been applied to the synthesis of strip-line filters having "maximally-flat" and Tchebycheff response characteristics. In this paper, the Tchebycheff response characteristic is considered for realizing the required circuit parameters in strip line. We present the conception and design of three-pole end-coupled microstrip bandpass filters for the X-band and C-band, at 10.7 GHz and 6.2 GHz respectively, intended for radar and GSO satellite applications and realized with capacitive resonators and stepped-impedance resonators. By extension, RF/microwave applications can be referred to as communications and other uses that explore the frequency spectrum, parts of which are further divided into many frequency bands. The design and simulation are performed using the 3D full-wave electromagnetic simulator IE3D.
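Synthesis from a "prototype" low-pass design starts from the Tchebycheff element values g1..g(n+1); a minimal sketch of the standard recursion, as tabulated in microwave filter texts, is:

```python
import math

def chebyshev_prototype(n, ripple_db):
    """Element values g1..g(n+1) of an n-pole Tchebycheff low-pass
    prototype (standard textbook recursion).  These g-values are the
    starting point for synthesizing an end-coupled bandpass filter
    from a low-pass design."""
    beta = math.log(1.0 / math.tanh(ripple_db / 17.37))
    gamma = math.sinh(beta / (2 * n))
    a = [math.sin((2 * k - 1) * math.pi / (2 * n)) for k in range(1, n + 1)]
    b = [gamma ** 2 + math.sin(k * math.pi / n) ** 2 for k in range(1, n + 1)]
    g = [2 * a[0] / gamma]
    for k in range(2, n + 1):
        g.append(4 * a[k - 2] * a[k - 1] / (b[k - 2] * g[-1]))
    # load resistance: unity for odd n, coth^2(beta/4) for even n
    g.append(1.0 if n % 2 else 1.0 / math.tanh(beta / 4) ** 2)
    return g
```

For a three-pole, 0.5 dB ripple design this reproduces the familiar tabulated values g = 1.5963, 1.0967, 1.5963, 1.0, from which the admittance-inverter parameters of the coupling gaps are derived.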

  18. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  19. Project manager insights: An analysis of career progression

    Directory of Open Access Journals (Sweden)

    James W Marion

    2014-08-01

Full Text Available The project manager is key to the success of any project, but the path to becoming a successful project manager is ill-defined. In this study, the authors analyzed the interview responses of 87 project managers to questions about entry into the field, career progression, and advice for new project managers, seeking to better understand practicing project managers' career progression. Qualitative analysis techniques were used to identify recurring themes from the interview summaries. The themes and the resulting conceptual framework provide evidence that supports the development of a successful project manager career path. Further, the results suggest individual project management competencies in soft skills are a key enabler of project execution.

  20. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  1. Progress of the technique of coal microwave desulfurization

    Institute of Scientific and Technical Information of China (English)

    Xiuxiang Tao; Ning Xu; Maohua Xie; Longfei Tang

    2014-01-01

With its speed, effectiveness and mild, controllable conditions, desulfurization of coal by microwave has become a research focus in the field of clean coal technology. Coal is a heterogeneous mixture of components with different dielectric properties, so their abilities to absorb microwaves differ; the sulfur-containing components are the better microwave absorbers, which allows them to be selectively heated and reacted under microwave irradiation. Controversy remains over whether the principle of microwave desulfurization is a thermal effect or a non-thermal effect. The thermal-effect view rests mainly on the rapid, selective heating that microwaves provide, while the non-thermal view proposes direct interactions between the microwave electromagnetic field and the sulfur-containing components. Determining the dielectric properties of coal and of its sulfur-containing components is fundamental to revealing the interaction between microwaves and sulfur-containing compounds; however, the measurement of the dielectric properties of coal is affected by many factors, which makes accurate measurement difficult. To achieve a better desulfurization effect, researchers add chemical additives such as acids, alkalis, oxidants or reductants, change the reaction atmosphere, or combine microwave treatment with other methods such as magnetic separation, ultrasound and microorganisms. Researchers in this field have also put forward several processes and obtained a number of patents. The obscurity of the microwave desulfurization mechanism, uncertainties in the qualitative and quantitative analysis of sulfur-containing functional groups in coal, and the lack of dedicated microwave equipment have limited further development of microwave desulfurization technology.

  2. Severe accident analysis using dynamic accident progression event trees

    Science.gov (United States)

    Hakobyan, Aram P.

At present, the development and analysis of Accident Progression Event Trees (APETs) are performed in a manner that is computationally time-consuming, difficult to reproduce and can be phenomenologically inconsistent. One of the principal deficiencies lies in the static nature of conventional APETs. In conventional event tree techniques, the sequence of events is pre-determined in a fixed order based on expert judgment. The main objective of this PhD dissertation was to develop a software tool (ADAPT) for automated APET generation using the concept of dynamic event trees. As implied by the name, in dynamic event trees the order and timing of events are determined by the progression of the accident. The tool determines the branching times from a severe accident analysis code based on user-specified criteria for branching. It assigns user-specified probabilities to every branch, tracks the total branch probability, and truncates branches based on the given pruning/truncation rules to avoid an unmanageable number of scenarios. The functions of the dynamic APET developed include prediction of the conditions, timing, and location of containment failure or bypass leading to the release of radioactive material, and calculation of the probabilities of those failures. Thus, scenarios that can potentially lead to early containment failure or bypass, such as through accident-induced failure of steam generator tubes, are of particular interest. Also, the work is focused on the treatment of uncertainties in severe accident phenomena such as creep rupture of major RCS components, hydrogen burn, containment failure, timing of power recovery, etc. Although the ADAPT methodology (Analysis of Dynamic Accident Progression Trees) could be applied to any severe accident analysis code, in this dissertation the approach is demonstrated by applying it to the MELCOR code [1]. A case study is presented involving station blackout with the loss of auxiliary feedwater system for a
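The branch-probability tracking and pruning described above can be illustrated with a toy dynamic event tree; the threshold and branch probabilities below are invented, and this sketch is not the ADAPT implementation itself.

```python
def expand_tree(root_prob, branchings, truncate_below):
    """Toy dynamic event tree: each branching multiplies the path
    probability by a conditional branch probability; paths whose total
    probability falls below the truncation threshold are pruned, and
    the discarded probability mass is tracked."""
    paths, pruned_mass = [((), root_prob)], 0.0
    for branch_probs in branchings:          # one branching event per step
        next_paths = []
        for labels, p in paths:
            for i, bp in enumerate(branch_probs):
                q = p * bp
                if q < truncate_below:
                    pruned_mass += q         # bookkeeping of pruned mass
                else:
                    next_paths.append((labels + (i,), q))
        paths = next_paths
    return paths, pruned_mass
```

Tracking the pruned mass lets the analyst bound how much probability the truncation rule has discarded.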

  3. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

This review, covering reports published in the recent decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry and coulometry have made significant contributions to the analysis of minerals such as clays, sulfides, oxides, and oxysalts. The discussion is organized by both the type of EC technique used and the kind of mineral analyzed. Furthermore, minerals as electrode-modification materials for EC analysis are also summarized. Accordingly, research vacancies and future development trends in these areas are discussed.

  4. Clinical evaluation of techniques used in the surgical treatment of progressive hemifacial atrophy

    NARCIS (Netherlands)

    R. Roddi (Roberto); E. Riggio (Egidio); P.M. Gilbert (Philip); S.E.R. Hovius (Steven); J. Michiel Vaandrager (J.); J.C.H.M. van der Meulen (Jacques)

    1994-01-01

We critically review 13 patients with progressive hemifacial atrophy treated with three basic surgical procedures (free flap transplantation, alloplastic implants, micro-fat injections 'lipofilling') and further ancillary techniques. In spite of the satisfactory results achieved with the

  5. Progress in phototaxis mechanism research and micromanipulation techniques of algae cells

    Institute of Scientific and Technical Information of China (English)

    WEN Chenglu; LI Heng; WANG Pengbo; LI Wei; ZHAO Jingquan

    2007-01-01

Phototactic movement is a characteristic response of some microorganisms to the light environment. Most algae show dramatic phototactic responses, underlain by complicated biological, physical and photochemical mechanisms. With the development of micro/nano and sensor techniques, great progress has been made in research on algal phototaxis. This review article summarizes the progress made in research on functional phototactic structures, the mechanisms of the photo-response process and the photodynamics of phototaxis in algae, and describes the latest micro-tracking and micromanipulation techniques. Moreover, based on our own research results, the potential correlation between phototaxis and photosynthesis is discussed, and directions for future research on the phototactic mechanism are proposed.

  6. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

Full Text Available The quality and condition of a road surface are of great importance for convenience and safety of driving, so investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all the requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of the characteristics of road texture and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.

  7. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

The quality and condition of a road surface are of great importance for convenience and safety of driving, so investigations of the behaviour of road materials in laboratory conditions and monitoring of existing roads are widely performed to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions involved in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all the requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing analysis of the characteristics of road texture and monitoring of pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro-parameters.
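Structured-light and stereo systems of the kind described above both recover depth by triangulation; a minimal sketch of the generic pinhole/stereo relation Z = f·B/d (generic geometry, not the authors' specific system) is:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth of a surface point from the stereo triangulation relation
    Z = f * B / d, with focal length f in pixels, baseline B in metres
    and disparity d in pixels.  Generic pinhole geometry, illustrative
    only; calibrated systems apply further corrections."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

The inverse relationship between depth and disparity is why close-range pavement measurement can reach sub-millimetre resolution while UAV imagery resolves only macro-parameters.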

  8. Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers

    Science.gov (United States)

    1975-03-26

OCR fragments from the report: Technical Report RF-75-2, "Root Cause Analysis - A Diagnostic Failure Analysis Technique for Managers", by Augustine E. Magistro, Nuclear ... through 1975. Augustine E. Magistro has participated in root cause analysis task teams, including as a team member and Blue Ribbon panel reviewer.

  9. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and we also developed human error analysis and application techniques. SACOM was developed to assess operators' physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information, and for the 79 cases induced by human errors we time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  10. Important progress on the use of isotope techniques and methods in catchment hydrology

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

The use of isotope techniques and methods in catchment hydrology over the last 50 years has generated two major types of progress: (1) assessment of the temporal variations of the major stocks and flows of water in catchments, from which the estimation of water residence times is introduced in this paper; (2) assessment of catchment hydrologic processes, in which the interactions between different waters, hydrograph separation, and bio-geochemical processes are described using isotope tracers. Future progress on isotope techniques and methods in hydrology lies in understanding hydrological processes in large river basins. Much potential also awaits realization in how isotope information may be used to calibrate and test distributed rainfall-runoff models, and in aiding the quantification of sustainable water resources management.
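A common first-moment estimate of water residence time from a tracer concentration series is τ = Σ t·c(t) / Σ c(t); the sketch below assumes uniform sampling and is only illustrative of the residence-time estimation mentioned above, not a specific method from the review.

```python
def mean_residence_time(times, conc):
    """First temporal moment of a tracer concentration time series,
    tau = sum(t * c) / sum(c).  Assumes uniformly spaced samples, so
    the interval width cancels out of the ratio."""
    return sum(t * c for t, c in zip(times, conc)) / sum(conc)
```

For example, a symmetric breakthrough curve centred between the second and third samples yields a residence time halfway between them.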

  11. Progress Testing: Critical Analysis and Suggested Practices

    Science.gov (United States)

    Albanese, Mark; Case, Susan M.

    2016-01-01

    Educators have long lamented the tendency of students to engage in rote memorization in preparation for tests rather than engaging in deep learning where they attempt to gain meaning from their studies. Rote memorization driven by objective exams has been termed a steering effect. Progress testing (PT), in which a comprehensive examination…

  12. UPLC: a preeminent technique in pharmaceutical analysis.

    Science.gov (United States)

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

Pharmaceutical companies today are driven to create novel and more efficient tools to discover, develop, deliver and monitor drugs. In this context, the development of rapid chromatographic methods is crucial for analytical laboratories. In the past decade, substantial technological advances have been made in enhancing particle chemistry performance, improving detector design, and optimizing the system, data processors and various controls of chromatographic techniques. Blended together, these advances resulted in the outstanding performance of ultra-high performance liquid chromatography (UPLC), which builds on the principles of HPLC. UPLC shows a dramatic enhancement in the speed, resolution and sensitivity of analysis by using particles smaller than 2 μm, with the system operated at higher pressure while the mobile phase runs at greater linear velocities than in HPLC. This technique is considered a new focal point in the field of liquid chromatography. This review focuses on the basic principle and instrumentation of UPLC and its advantages over HPLC; furthermore, it emphasizes various pharmaceutical applications of this technique.
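The efficiency gain from sub-2 μm particles is usually discussed via the van Deemter equation, H(u) = A + B/u + C·u, whose A and C terms shrink with particle size; the coefficients below are invented for illustration only.

```python
import math

def plate_height(u, a, b, c):
    """van Deemter plate height H(u) = A + B/u + C*u, where u is the
    mobile-phase linear velocity.  Coefficients are illustrative, not
    fitted to any real column."""
    return a + b / u + c * u

def optimal_velocity(b, c):
    """Velocity minimizing H, found by setting dH/du = 0: u = sqrt(B/C)."""
    return math.sqrt(b / c)
```

Smaller particles flatten the C·u branch, which is why UPLC can run at higher linear velocities than HPLC with little loss of efficiency.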

  13. A Comparative Analysis of Biomarker Selection Techniques

    Directory of Open Access Journals (Sweden)

    Nicoletta Dessì

    2013-01-01

Full Text Available Feature selection has become the essential step in biomarker discovery from high-dimensional genomics data. It is recognized that different feature selection techniques may result in different sets of biomarkers, that is, different groups of genes highly correlated to a given pathological condition, but few direct comparisons exist which quantify these differences in a systematic way. In this paper, we propose a general methodology for comparing the outcomes of different selection techniques in the context of biomarker discovery. The comparison is carried out along two dimensions: (i) measuring the similarity/dissimilarity of selected gene sets; (ii) evaluating the implications of these differences in terms of both predictive performance and stability of selected gene sets. As a case study, we considered three benchmarks deriving from DNA microarray experiments and conducted a comparative analysis among eight selection methods, representatives of different classes of feature selection techniques. Our results show that the proposed approach can provide useful insight about the pattern of agreement of biomarker discovery techniques.
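Measuring the similarity of selected gene sets is commonly done with the Jaccard index, and selection stability can be summarized as the average pairwise similarity over perturbed runs; this is a generic sketch (the paper's exact measures may differ), with invented gene names.

```python
def jaccard(set_a, set_b):
    """Jaccard similarity of two selected gene sets: |A & B| / |A | B|."""
    a, b = set(set_a), set(set_b)
    return len(a & b) / len(a | b)

def stability(selections):
    """Average pairwise Jaccard similarity over gene sets selected on
    perturbed versions of the data: 1.0 means perfectly stable selection."""
    pairs = [(i, j) for i in range(len(selections))
             for j in range(i + 1, len(selections))]
    return sum(jaccard(selections[i], selections[j])
               for i, j in pairs) / len(pairs)
```

A selector that returns the same genes on every bootstrap sample scores 1.0; one that returns disjoint sets scores 0.0.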

  14. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
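The contrast evolution can be sketched as a per-frame computation; the normalization used below, C(t) = (T_defect − T_ref)/T_ref, is one common definition and may differ in detail from the exact formulation used in the paper.

```python
def normalized_contrast(t_defect, t_ref):
    """Per-frame normalized contrast C(t) = (T_def - T_ref) / T_ref,
    computed from a pixel temperature (or intensity) time series over a
    suspected anomaly and over a nearby sound reference region."""
    return [(d - r) / r for d, r in zip(t_defect, t_ref)]
```

Thermal measurement features such as peak contrast and peak time are then read off this evolution and compared against simulated flat-bottom-hole responses.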

  15. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

Full Text Available Based on the movement process of throwing, and in order to further improve throwing technique in our country, this paper first illustrates the main factors that affect the shot distance via a combination of the equations of motion and geometrical analysis. It then gives the equation for the force that throwing athletes must bear during the throwing movement, and derives the speed relationship between the joints during throwing and release based on a kinetic analysis of the throwing athletes' arms. The paper obtains the momentum relationship of the athletes' joints by means of rotational-inertia analysis, and then establishes a constrained particle dynamics equation from the Lagrange equation. The result shows that the momentum of the throw depends on the momentum of the athletes' wrist joints at release.
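The shot-distance factors mentioned above follow from the vacuum projectile range, R = (v cosθ / g)(v sinθ + sqrt(v² sin²θ + 2gh)), which makes the dependence on release speed, angle and height explicit; air resistance is neglected in this sketch.

```python
import math

def shot_range(v, angle_deg, release_height, g=9.81):
    """Horizontal distance of a projectile released at speed v (m/s),
    angle (degrees) and height h (m) above the landing plane:
    R = (v cos t / g) * (v sin t + sqrt((v sin t)^2 + 2 g h)).
    Air resistance neglected."""
    t = math.radians(angle_deg)
    vx, vy = v * math.cos(t), v * math.sin(t)
    return vx * (vy + math.sqrt(vy * vy + 2.0 * g * release_height)) / g
```

With h = 0 this reduces to the familiar R = v² sin(2θ)/g, and a positive release height shifts the optimal angle slightly below 45°.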

  16. Progressive collapse analysis using updated models for alternate path analysis after a blast

    Science.gov (United States)

    Eskew, Edward; Jang, Shinae; Bertolaccini, Kelly

    2016-04-01

Progressive collapse is of rising importance within the structural engineering community due to several recent cases. The alternate path method is a design technique to determine the ability of a structure to sustain the loss of a critical element, or elements, and still resist progressive collapse. However, the alternate path method only considers the removal of the critical elements. In the event of a blast, significant damage may occur to nearby members not included in the alternate path design scenarios. To achieve an accurate assessment of the current condition of the structure after a blast or other extreme event, it may be necessary to reduce the strength of, or remove, additional elements beyond the critical members designated in the alternate path design method. In this paper, a rapid model updating technique utilizing vibration measurements is used to update the structural model to represent the real-time condition of the structure after a blast occurs. Based upon the updated model, damaged elements either have their strength reduced or are removed from the simulation. The alternate path analysis is then performed, but utilizing the updated structural model instead of numerous scenarios. After the analysis, the simulated response is compared to failure conditions to determine the building's post-event condition. This method has the ability to incorporate damage to noncritical members into the analysis. This paper utilizes numerical simulations based upon a Unified Facilities Criteria (UFC) example structure subjected to an equivalent blast to validate the methodology.
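The reduce-or-remove step described above can be sketched as a filter over element stiffnesses: elements whose updated stiffness fraction falls below a removal threshold are dropped, the rest keep their reduced stiffness. The element names, damage factors and threshold below are invented for illustration.

```python
def apply_damage(stiffness, damage, removal_threshold=0.05):
    """Scale each element stiffness k by (1 - damage); drop elements
    whose remaining stiffness fraction is below the removal threshold,
    mimicking the reduce-or-remove choice in the updated-model
    alternate path analysis (toy sketch, invented parameters)."""
    updated = {}
    for elem, k in stiffness.items():
        remaining = 1.0 - damage.get(elem, 0.0)
        if remaining >= removal_threshold:
            updated[elem] = k * remaining   # keep, with reduced stiffness
        # else: element is removed from the simulation entirely
    return updated
```

The surviving stiffness dictionary would then feed the alternate path analysis in place of the pristine model.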

  17. Comparative Analysis of Hand Gesture Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Arpana K. Patel

    2015-03-01

    Full Text Available During the past few years, human hand gestures for interaction with computing devices have continued to be an active area of research. This paper provides a survey of hand gesture recognition. Hand gesture recognition comprises three stages: pre-processing, feature extraction or matching, and classification or recognition. Each stage can be implemented with different methods and techniques. This paper gives a brief description of the different methods used for hand gesture recognition in existing systems, together with a comparative analysis of each method's benefits and drawbacks.
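    The three stages named above can be sketched as a minimal pipeline. The fixed threshold, the toy shape features and the nearest-centroid classifier are placeholders for the many alternative methods the survey compares at each stage.

```python
import numpy as np

def preprocess(img):
    """Stage 1 (pre-processing): normalise a grayscale image and segment
    the hand region with a fixed threshold (placeholder method)."""
    img = img / max(float(img.max()), 1e-9)
    return (img > 0.5).astype(float)

def extract_features(mask):
    """Stage 2 (feature extraction): toy shape features -- area fraction
    and normalised centroid of the segmented region."""
    ys, xs = np.nonzero(mask)
    area = mask.mean()
    cy = ys.mean() / mask.shape[0] if len(ys) else 0.0
    cx = xs.mean() / mask.shape[1] if len(xs) else 0.0
    return np.array([area, cy, cx])

def classify(feat, centroids):
    """Stage 3 (recognition): nearest-centroid match over known gestures."""
    return min(centroids, key=lambda g: np.linalg.norm(feat - centroids[g]))

# Toy gestures: an "open palm" filling the frame vs a small "fist" blob
palm = np.ones((8, 8))
fist = np.zeros((8, 8))
fist[3:5, 3:5] = 1.0
centroids = {g: extract_features(preprocess(im))
             for g, im in [("palm", palm), ("fist", fist)]}
label = classify(extract_features(preprocess(fist)), centroids)
```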

  18. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

    Full Text Available We describe how to use multivariate analysis of complex TOF-SIMS spectra, introducing the method of random projections. The technique allows us to perform full clustering and classification of the measured mass spectra. In this paper we use the tool for classification purposes. The presentation describes calibration experiments on 19 minerals on Ag and Au substrates using positive-mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen's κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
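    The random-projection step can be sketched in a few lines: a Gaussian matrix maps high-dimensional spectra to a much lower dimension while approximately preserving pairwise distances (the Johnson-Lindenstrauss property), so clustering and classification remain feasible. The toy "spectra" below are synthetic stand-ins, not COSIMA data.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection(X, k):
    """Project d-dimensional rows of X down to k dimensions with a
    Gaussian random matrix scaled to roughly preserve distances."""
    d = X.shape[1]
    R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(d, k))
    return X @ R

# Two synthetic "mineral" classes separated in a 500-dimensional space
X_a = rng.normal(0.0, 0.1, (20, 500)) + 1.0
X_b = rng.normal(0.0, 0.1, (20, 500)) - 1.0
Z = random_projection(np.vstack([X_a, X_b]), k=10)

# Distances are approximately preserved, so the classes stay separated
d_within = np.linalg.norm(Z[0] - Z[1])    # same class
d_between = np.linalg.norm(Z[0] - Z[-1])  # different classes
```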

  19. Analysis of breast cancer progression using principal component analysis and clustering

    Indian Academy of Sciences (India)

    G Alexe; G S Dalgin; S Ganesan; C DeLisi; G Bhanot

    2007-08-01

    We develop a new technique to analyse microarray data which uses a combination of principal components analysis and consensus ensemble k-clustering to find robust clusters and gene markers in the data. We apply our method to a public microarray breast cancer dataset which has expression levels of genes in normal samples as well as in three pathological stages of disease; namely, atypical ductal hyperplasia or ADH, ductal carcinoma in situ or DCIS and invasive ductal carcinoma or IDC. Our method averages over clustering techniques and data perturbation to find stable, robust clusters and gene markers. We identify the clusters and their pathways with distinct subtypes of breast cancer (Luminal, Basal and Her2+). We confirm that the cancer phenotype develops early (in early hyperplasia or ADH stage) and find from our analysis that each subtype progresses from ADH to DCIS to IDC along its own specific pathway, as if each was a distinct disease.
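    The combination of PCA and consensus clustering over perturbed runs can be sketched as follows. The expression matrix is synthetic, and a plain 2-means clusterer stands in for the paper's ensemble of clustering techniques; the co-association matrix records how often two samples cluster together.

```python
import numpy as np

rng = np.random.default_rng(1)

def pca(X, n):
    """Project samples onto the top-n principal components (via SVD)."""
    Xc = X - X.mean(0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n].T

def kmeans(X, k, iters=20):
    """Plain Lloyd's k-means, returning a label per sample."""
    C = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        C = np.array([X[lab == j].mean(0) if (lab == j).any() else C[j]
                      for j in range(k)])
    return lab

# Synthetic expression matrix: two sample groups, 50 "genes"
X = np.vstack([rng.normal(0, 1, (15, 50)) + 2, rng.normal(0, 1, (15, 50))])
Z = pca(X, 3)

# Consensus: average co-clustering over data-perturbed runs
co = np.zeros((30, 30))
runs = 25
for _ in range(runs):
    lab = kmeans(Z + rng.normal(0, 0.05, Z.shape), k=2)
    co += (lab[:, None] == lab[None, :])
co /= runs  # co[i, j]: fraction of runs clustering samples i and j together
```

    Robust clusters are then read off as blocks of the co-association matrix with values near 1.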

  20. Economic Analysis on the Technique Progress of Chinese Herbal Medicine Biotechnology Industry in Taiwan Province

    Institute of Scientific and Technical Information of China (English)

    施正屏; 洪永裕

    2011-01-01

    The Chinese herbal medicine biotechnology industry in Taiwan province comprises four sectors, namely biotechnology and new pharmaceuticals, traditional pharmaceuticals, health food, and traditional food. Despite nearly thirty years of energetic support and active promotion in Taiwan province, the Chinese herbal medicine market and industry have made little progress and lack a clear development strategy. In order to establish the current status and constraints of the Chinese herbal medicine biotechnology industry in Taiwan province, to help Taiwan's agriculture advance toward high-technology agriculture, and thereby to alleviate the problems of agriculture, the countryside and farmers while enhancing public health, this article builds an econometric model to estimate, for the industry's main products and their main production factors, the output price elasticities, the supply and demand price elasticities, and the technology trends. The study also aims to determine the direction of industry and technology development and to give corresponding proposals concerning supply, demand and technology.

  1. The Progress of Neutron Induced Prompt Gamma Analysis Technique in 1988~2002

    Institute of Scientific and Technical Information of China (English)

    刘雨人; 景士伟

    2003-01-01

    This paper reviews the development of the Neutron Induced Prompt Gamma-ray Analysis (NIPGA) technique between 1988 and 2002. The technique has now entered the stage of Pulsed Fast-Thermal Neutron Analysis (PFTNA), which combines the inelastic-scattering reactions of pulsed fast neutrons with the capture reactions of thermal neutrons. Neutron generator lifetimes have exceeded 10 000 h; multichannel analyzers have evolved into fully digital, integrated field spectrometers; the relative detection efficiency of HPGe detectors has reached 100%; and room-temperature, high-resolution CdZnTe detectors have come into use. On the software side, the Monte Carlo-library least-squares method has solved the nonlinear inversion problem in Prompt Gamma Neutron Activation Analysis (PGNAA).

  2. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transients and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler-shifted time intervals are computed for long-duration periodic sources; and optimally weighted cross-correlations are used for the stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data will be discussed. Finally, some results on cancellation of systematic noises in the laser interferometric space antenna (LISA) will be presented and future directions indicated.
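    Optimal (matched) filtering for transient sources can be sketched as a sliding correlation of the data against a known template. The chirp waveform, noise level and injection offset below are arbitrary toy choices, not a physical inspiral model.

```python
import numpy as np

rng = np.random.default_rng(2)

def matched_filter(data, template):
    """Correlate a known waveform template against the data at every
    offset -- a simplified matched filter for white noise."""
    t = template - template.mean()
    t = t / np.linalg.norm(t)
    n, m = len(data), len(t)
    return np.array([np.dot(data[i:i + m], t) for i in range(n - m + 1)])

# Toy "inspiral": a short chirp buried in white noise at offset 300
ts = np.linspace(0.0, 1.0, 128)
chirp = np.sin(2 * np.pi * (5 + 15 * ts) * ts)  # frequency rises with time
data = rng.normal(0.0, 0.5, 1024)
data[300:300 + len(chirp)] += chirp

snr = matched_filter(data, chirp)
peak = int(np.argmax(snr))  # recovered injection time
```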

  3. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analyses methodologies, but none as comprehensive as the current work.
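    Two of the simplest techniques compared in such studies — local one-at-a-time derivatives and Monte Carlo correlation coefficients — can be put side by side on a toy model. The three-parameter linear "dose model" below is invented so that the true importance ranking (x0 > x1 > x2) is known in advance.

```python
import numpy as np

rng = np.random.default_rng(3)

def model(x):
    """Toy dose model: output dominated by x0, then x1, then x2."""
    return 5.0 * x[..., 0] + 2.0 * x[..., 1] + 0.5 * x[..., 2]

# Technique 1: one-at-a-time local sensitivity (finite differences)
base = np.array([1.0, 1.0, 1.0])
local = np.array([(model(base + 0.01 * np.eye(3)[i]) - model(base)) / 0.01
                  for i in range(3)])

# Technique 2: Monte Carlo sampling + correlation coefficients
X = rng.normal(1.0, 0.2, (2000, 3))
y = model(X)
corr = np.array([np.corrcoef(X[:, i], y)[0, 1] for i in range(3)])

# Compare the parameter rankings produced by the two techniques
rank_local = np.argsort(-np.abs(local))
rank_corr = np.argsort(-np.abs(corr))
```

    For this linear model both techniques agree; the comparisons in the paper concern how such rankings diverge for nonlinear, interacting parameters.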

  4. Systems Analysis Department annual progress report 1998

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif [eds.]

    1999-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine Interaction and Technology Scenarios. The report includes lists of publications, lectures, committees and staff members. (au) 111 refs.

  5. Systems Analysis Department. Annual Progress Report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif [eds.]

    2000-03-01

    This report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1999. The department is undertaking research within Energy Systems Analysis, Energy, Environment and Development Planning-UNEP Centre, Safety, Reliability and Human Factors, and Technology Scenarios. The report includes summary statistics and lists of publications, committees and staff members. (au)

  6. Systems Analysis department. Annual progress report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Petersen, Kurt E.

    1998-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1997. The department is undertaking research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 110 refs.

  7. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed, stimulated by the power of computers and microprocessors, which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied to trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. In order to increase the responses and improve selectivity, solid electrodes are the subject of intense research dedicated to surface modification. Perm-selectivity, chelation, catalysis, etc. may be considered appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single-cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the

  8. Burnout prediction using advanced image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout to microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, 200 milliseconds residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.
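    Burnout of a char against its parent coal is commonly computed with the ash-tracer method: ash is assumed conserved, so the loss of combustible matter follows from the ash enrichment of the char. A minimal sketch, with hypothetical ash fractions:

```python
def burnout_ash_tracer(ash_coal, ash_char):
    """Burnout (dry, ash-free basis) from the ash-tracer method.
    ash_coal, ash_char: ash mass fractions of the parent coal and char;
    ash is assumed conserved through combustion."""
    return 1.0 - (ash_coal * (1.0 - ash_char)) / (ash_char * (1.0 - ash_coal))

# Hypothetical example: a 10% ash coal producing a 30% ash char
b = burnout_ash_tracer(0.10, 0.30)
```

    Increasing refiring residence time should drive this value toward 1 as the combustible fraction is consumed.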

  9. Progress in CTEQ-TEA PDF analysis

    CERN Document Server

    Nadolsky, Pavel; Guzzi, Marco; Huston, Joey; Lai, Hung-Liang; Li, Zhao; Pumplin, Jon; Stump, Dan; Yuan, C -P

    2012-01-01

    Recent developments in the CTEQ-TEA global QCD analysis are presented. The parton distribution functions CT10-NNLO are described, constructed by comparing data from many experiments to NNLO approximations of QCD.

  10. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  11. Randomization techniques for the intensity modulation-based quantum stream cipher and progress of experiment

    Science.gov (United States)

    Kato, Kentaro; Hirota, Osamu

    2011-08-01

    The quantum noise based direct encryption protocol Y-OO is expected to provide physical-complexity-based security, which is thought to be comparable to information-theoretic security in mathematical cryptography, for the physical layer of fiber-optic communication systems. So far, several randomization techniques for the quantum stream cipher by the Y-OO protocol have been proposed, but most of them were developed under the assumption that phase shift keying is used as the modulation format. On the other hand, recent progress in the experimental study of the intensity modulation based quantum stream cipher by the Y-OO protocol raises expectations for its realization. The purpose of this paper is to present design and implementation methods for a composite model of the intensity modulation based quantum stream cipher with some randomization techniques. As a result this paper gives a viewpoint of how the Y-OO cryptosystem is miniaturized.
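    The intensity-modulation idea can be illustrated with a toy model: a shared running key selects one of M "bases", each of which maps the data bit to one of 2M closely spaced intensity levels. Everything below — the SHA-256 counter-mode keystream, M = 64, and the level mapping — is an invented classroom sketch, not the actual Y-OO specification, and it omits the quantum-noise masking that gives the real scheme its security.

```python
import hashlib

M = 64  # number of intensity "bases" (toy value, not from the protocol)

def keystream(seed, n):
    """Toy running key: SHA-256 in counter mode, reduced mod M."""
    out = []
    counter = 0
    while len(out) < n:
        out.extend(hashlib.sha256(f"{seed}:{counter}".encode()).digest())
        counter += 1
    return [b % M for b in out[:n]]

def encrypt(bits, seed):
    """Each data bit is sent as one of 2M intensity levels: the key value k
    selects the basis, and the bit selects one of that basis's two levels."""
    return [2 * k + ((bit + k) % 2)
            for bit, k in zip(bits, keystream(seed, len(bits)))]

def decrypt(levels, seed):
    """With the shared key, the receiver inverts the basis mapping."""
    return [((lvl - 2 * k) + k) % 2
            for lvl, k in zip(levels, keystream(seed, len(levels)))]
```

    Without the key, an eavesdropper must resolve which of the many closely spaced levels was sent; in the real scheme quantum noise masks adjacent levels.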

  12. Development and application of the electrochemical etching technique. Annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    1980-08-01

    This annual progress report documents further advances in the development and application of electrochemical etching of polycarbonate foils (ECEPF) for fast, intermediate, and thermal neutron dosimetry as well as alpha particle dosimetry. The fast (> 1.1 MeV) and thermal neutron dosimetry techniques were applied to a thorough investigation of the neutron contamination inherent in and about the primary x-ray beam of several medical therapy electron accelerators. Because of the small size of ECEPF dosimeters in comparison to other neutron meters, they have an unusually low perturbation of the radiation field under measurement. Due to this small size and the increased sensitivity of the ECEPF dosimeter over current techniques of measuring neutrons in a high photon field, the fast neutron contamination in the primary x-ray beam of all the investigated accelerators was measured with precision and found to be greater than that suggested by the other, more common, neutron dosimetry methods.

  13. Analytical techniques in pharmaceutical analysis: A review

    Directory of Open Access Journals (Sweden)

    Masoom Raza Siddiqui

    2017-02-01

    Full Text Available The development of pharmaceuticals brought a revolution in human health. These pharmaceuticals serve their intent only if they are free from impurities and are administered in an appropriate amount. To make drugs serve their purpose, various chemical and instrumental methods for their estimation have been developed at regular intervals. Pharmaceuticals may develop impurities at various stages of their development, transportation and storage, which makes them risky to administer, so impurities must be detected and quantified. For this, analytical instrumentation and methods play an important role. This review highlights the role of analytical instrumentation and analytical methods in assessing the quality of drugs. It covers a variety of analytical techniques such as titrimetric, chromatographic, spectroscopic, electrophoretic and electrochemical methods, and their corresponding applications in the analysis of pharmaceuticals.

  14. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  15. Novel Technique of Transepithelial Corneal Cross-Linking Using Iontophoresis in Progressive Keratoconus

    Science.gov (United States)

    Raffa, Paolo; Rosati, Marianna

    2016-01-01

    In this work, the authors presented the techniques and the preliminary results at 6 months of a randomized controlled trial (NCT02117999) comparing a novel transepithelial corneal cross-linking protocol using iontophoresis with the Dresden protocol for the treatment of progressive keratoconus. At 6 months, there was a significant average improvement, with an average flattening of the maximum simulated keratometry reading of 0.72 ± 1.20 D (P = 0.01); in addition, corrected distance visual acuity improved significantly (P = 0.08) and spherical equivalent refraction was significantly less myopic (P = 0.02) 6 months after transepithelial corneal cross-linking with iontophoresis. The novel protocol using iontophoresis showed comparable results with standard corneal cross-linking in halting progression of keratoconus during 6-month follow-up. Investigations of the long-term RCT outcomes are ongoing to verify the efficacy of this transepithelial corneal cross-linking protocol and to determine if it may be comparable with standard corneal cross-linking in the management of progressive keratoconus. PMID:27597895

  16. Novel Technique of Transepithelial Corneal Cross-Linking Using Iontophoresis in Progressive Keratoconus

    Directory of Open Access Journals (Sweden)

    Marco Lombardo

    2016-01-01

    Full Text Available In this work, the authors presented the techniques and the preliminary results at 6 months of a randomized controlled trial (NCT02117999) comparing a novel transepithelial corneal cross-linking protocol using iontophoresis with the Dresden protocol for the treatment of progressive keratoconus. At 6 months, there was a significant average improvement, with an average flattening of the maximum simulated keratometry reading of 0.72±1.20 D (P=0.01); in addition, corrected distance visual acuity improved significantly (P=0.08) and spherical equivalent refraction was significantly less myopic (P=0.02) 6 months after transepithelial corneal cross-linking with iontophoresis. The novel protocol using iontophoresis showed comparable results with standard corneal cross-linking in halting progression of keratoconus during 6-month follow-up. Investigations of the long-term RCT outcomes are ongoing to verify the efficacy of this transepithelial corneal cross-linking protocol and to determine if it may be comparable with standard corneal cross-linking in the management of progressive keratoconus.

  17. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena.

  18. Managing the Classroom with Technology. On Progress Reports and Online Communications, and How To Manage the Two Different Communication Techniques.

    Science.gov (United States)

    Kasprowicz, Tim

    2002-01-01

    Describes how one teacher bridged the communications gap among teachers, parents, and students through the use of technology in managing his classroom. Discusses progress reports and online communications and how to manage the two different communication techniques. (JOW)

  19. Systems Analysis Department. Annual progress report 1996

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, H.; Olsson, C.; Petersen, K.E. [eds.]

    1997-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1996. The department is undertaking research within Simulation and Optimisation of Energy Systems, Energy and Environment in Developing Countries - UNEP Centre, Integrated Environmental and Risk Management and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 131 refs.

  20. Dynamics and vibrations progress in nonlinear analysis

    CERN Document Server

    Kachapi, Seyed Habibollah Hashemi

    2014-01-01

    Dynamical and vibratory systems are basically an application of mathematics and applied sciences to the solution of real world problems. Before being able to solve real world problems, it is necessary to carefully study dynamical and vibratory systems and solve all available problems in case of linear and nonlinear equations using analytical and numerical methods. It is of great importance to study nonlinearity in dynamics and vibration; because almost all applied processes act nonlinearly, and on the other hand, nonlinear analysis of complex systems is one of the most important and complicated tasks, especially in engineering and applied sciences problems. There are probably a handful of books on nonlinear dynamics and vibrations analysis. Some of these books are written at a fundamental level that may not meet ambitious engineering program requirements. Others are specialized in certain fields of oscillatory systems, including modeling and simulations. In this book, we attempt to strike a balance between th...

  1. Progress on the CWU READI Analysis Center

    Science.gov (United States)

    Melbourne, T. I.; Szeliga, W. M.; Santillan, V. M.; Scrivner, C.

    2015-12-01

    Real-time GPS position streams are desirable for a variety of seismic monitoring and hazard mitigation applications. We report on progress in our development of a comprehensive real-time GPS-based seismic monitoring system for the Cascadia subduction zone. This system is based on 1 Hz point position estimates computed in the ITRF08 reference frame. Convergence from phase and range observables to point position estimates is accelerated using a Kalman-filter-based, on-line stream editor that produces independent estimates of carrier phase integer biases and other parameters. Positions are then estimated using a short-arc approach and algorithms from JPL's GIPSY-OASIS software, with satellite clock and orbit products from the International GNSS Service (IGS). The resulting positions show typical RMS scatter of 2.5 cm in the horizontal and 5 cm in the vertical, with latencies below 2 seconds. To facilitate the use of these point position streams for applications such as seismic monitoring, we broadcast real-time positions and covariances using custom-built aggregation-distribution software based on the RabbitMQ messaging platform. This software is capable of buffering 24-hour streams for hundreds of stations and providing them through a RESTful web interface. To demonstrate the power of this approach, we have developed a Java-based front-end that provides a real-time visual display of time-series, displacement vector fields, and map-view, contoured, peak ground displacement. This Java-based front-end is available for download through the PANGA website. We are currently analyzing 80 PBO and PANGA stations along the Cascadia margin and gearing up to process all 400+ real-time stations operating in the Pacific Northwest, many of which are currently telemetered in real-time to CWU. These will serve as milestones toward our over-arching goal of extending our processing to include all of the available real-time streams from the Pacific rim. In addition, we have
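    The buffering layer described — 24-hour streams for hundreds of stations behind a queryable interface — can be sketched with a per-station ring buffer. The class below, the station name "P403" and the tiny 5-epoch window are illustrative stand-ins, not the CWU implementation.

```python
from collections import defaultdict, deque

class StreamBuffer:
    """Keep the most recent fixed window of 1 Hz position epochs per
    station, mimicking the aggregation layer's 24-hour buffers."""
    def __init__(self, seconds=86400):
        self.buffers = defaultdict(lambda: deque(maxlen=seconds))

    def push(self, station, epoch, enu, cov):
        """Append one epoch (time, ENU position, covariance) for a station;
        the deque silently drops the oldest epoch once full."""
        self.buffers[station].append((epoch, enu, cov))

    def latest(self, station, n=1):
        """Return the n most recent epochs (a REST-style query)."""
        return list(self.buffers[station])[-n:]

sb = StreamBuffer(seconds=5)  # tiny window for illustration
for t in range(8):
    sb.push("P403", t, (0.01 * t, 0.0, 0.0), None)
```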

  2. Attitude Exploration Using Factor Analysis Technique

    Directory of Open Access Journals (Sweden)

    Monika Raghuvanshi

    2016-12-01

    Full Text Available Attitude is a psychological variable that contains positive or negative evaluation of people or an environment. The growing generation possesses learning skills, so if a positive attitude is inculcated at the right age, it may become habitual. Students in the age group 14-20 years from the city of Bikaner, India, are the target population for this study. An inventory of 30 Likert-type scale statements was prepared in order to measure attitude towards the environment and matters related to conservation. The primary data were collected through a structured questionnaire, using a cluster sampling technique, and analyzed using the IBM SPSS 23 statistical tool. Factor analysis is used to reduce the 30 variables to a smaller number of more identifiable groups of variables. Results show that students “need more regulation and voluntary participation to protect the environment”, “need conservation of water and electricity”, “are concerned about undue wastage of water”, “need visible actions to protect the environment”, “need strengthening of the public transport system”, “are a little ignorant about the consequences of global warming”, “want prevention of water pollution by industries”, “need to change personal habits to protect the environment”, and “don’t have firsthand experience of global warming”. The analysis revealed that the nine factors obtained could explain about 58.5% of the variance in the attitude of secondary school students towards the environment in the city of Bikaner, India. The remaining 39.6% of variance is attributed to other elements not explained by this analysis. A global campaign for improvement in attitudes about environmental issues and their utility in daily life may boost positive youth attitudes, potentially with worldwide impact. A cross-disciplinary approach may be developed by teaching environmental topics alongside related disciplines such as science, economics and social studies.
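    The variable-reduction step can be sketched with a principal-component-style factor extraction from the item correlation matrix; this is a simplified stand-in for the SPSS procedure, and the six "survey items" driven by two latent attitudes below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

def factor_loadings(X, n_factors):
    """Loadings of each observed variable on the top factors, taken from
    the eigendecomposition of the correlation matrix (principal-component
    extraction, a simplified stand-in for full factor analysis)."""
    R = np.corrcoef(X, rowvar=False)
    vals, vecs = np.linalg.eigh(R)
    order = np.argsort(vals)[::-1]          # eigh returns ascending order
    vals, vecs = vals[order], vecs[:, order]
    return vecs[:, :n_factors] * np.sqrt(vals[:n_factors])

# Synthetic survey: 6 Likert items driven by two latent attitudes
latent = rng.normal(0, 1, (300, 2))
X = np.column_stack(
    [latent[:, 0] + rng.normal(0, 0.3, 300) for _ in range(3)] +
    [latent[:, 1] + rng.normal(0, 0.3, 300) for _ in range(3)])
L = factor_loadings(X, 2)
explained = (L ** 2).sum() / X.shape[1]  # fraction of variance explained
```

    High loadings of an item on a factor identify which attitude group it belongs to, mirroring the nine named factors in the study.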

  3. Numerical Analysis of Structural Progressive Collapse to Blast Loads

    Institute of Scientific and Technical Information of China (English)

    HAO Hong; WU Chengqing; LI Zhongxian; ABDULLAH A K

    2006-01-01

    After the progressive collapse of the Ronan Point apartment building in the UK in 1968, intensive research effort was spent on developing guidelines for designing new structures, or strengthening existing ones, to prevent progressive collapse. However, only very few building design codes provide some rather general guidance, and no detailed design requirements are given. The progressive collapses of the Alfred P. Murrah Federal Building in Oklahoma City and the World Trade Centre (WTC) again sparked tremendous research interest in progressive collapse of structures. Recently, the US Department of Defence (DoD) and the US General Services Administration (GSA) issued guidelines for structural progressive collapse analysis. These two guidelines are the most commonly used, but their accuracy is not known. This paper presents numerical analysis of progressive collapse of an example frame structure under blast loads. The DoD and GSA procedures are also used to analyse the same example structure. Numerical results are compared and discussed. The accuracy and the applicability of the two design guidelines are evaluated.

  4. Risk factors for progressive ischemic stroke A retrospective analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    BACKGROUND: Progressive ischemic stroke has a higher fatality rate and disability rate than common cerebral infarction, so it is of great significance to investigate the early predictive factors related to the occurrence of progressive ischemic stroke, the potential pathological mechanism, and the risk factors amenable to early intervention, in order to prevent progressive ischemic stroke and ameliorate its outcome. OBJECTIVE: To analyze the possible related risk factors in patients with progressive ischemic stroke, so as to provide a reference for the prevention and treatment of progressive ischemic stroke. DESIGN: A retrospective analysis. SETTING: Department of Neurology, General Hospital of Beijing Coal Mining Group. PARTICIPANTS: A total of 280 patients with progressive ischemic stroke were selected from the Department of Neurology, General Hospital of Beijing Coal Mining Group from March 2002 to June 2006, including 192 males and 88 females, with a mean age of (62±7) years. All met the diagnostic standards for cerebral infarction set by the Fourth National Academic Meeting for Cerebrovascular Disease in 1995 and were confirmed by CT or MRI; all were admitted within 24 hours of attack, with neurological deficit that progressed gradually or worsened stepwise within 72 hours after attack, aggravation being defined as a decrease of more than 2 points in the neurological deficit score. Meanwhile, 200 inpatients with non-progressive ischemic stroke (135 males and 65 females) were selected as the control group. METHODS: After admission, a univariate analysis of variance was conducted using the factors of blood pressure, history of diabetes mellitus, fever, leukocytosis, levels of blood lipids, fibrinogen, blood glucose and plasma homocysteine, cerebral arterial stenosis, and CT signs of early infarction, and the significant factors were entered into a multivariate non-conditional logistic regression analysis. MAIN OUTCOME MEASURES
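    The second step of such a study — entering the significant factors into a multivariate logistic regression — can be sketched with a hand-rolled gradient-descent fit. The risk factors, effect sizes and cohort below are entirely synthetic; exp(coefficient) approximates the odds ratio reported in such analyses.

```python
import numpy as np

rng = np.random.default_rng(5)

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Multivariate logistic regression by gradient descent (a stand-in
    for the study's non-conditional logistic regression)."""
    Xb = np.column_stack([np.ones(len(X)), X])  # prepend intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))       # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)       # average log-loss gradient
    return w  # w[0] intercept; exp(w[i]) approximates an odds ratio

# Synthetic cohort: standardised blood glucose and a fever indicator
n = 1000
glucose = rng.normal(0.0, 1.0, n)
fever = rng.integers(0, 2, n).astype(float)
true_logit = -1.0 + 1.2 * glucose + 0.8 * fever  # invented effect sizes
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

w = fit_logistic(np.column_stack([glucose, fever]), y)
```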

  5. Organic analysis progress report FY 1997

    Energy Technology Data Exchange (ETDEWEB)

    Clauss, S.A.; Grant, K.E.; Hoopes, V.; Mong, G.M.; Steele, R.; Bellofatto, D.; Sharma, A.

    1998-04-01

    The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at PNNL and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies are discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivatization gas chromatography/mass spectrometry (GC/MS) and GC/flame ionization detection (GC/FID) for quantitation, ion-pair chromatography (IPC), ion chromatography (IC), and the cation exchange procedure for reducing the radioactivity of samples. The documentation of these analytical procedures is included and discussed in Section 6.0; Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples.

  6. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the impact of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses, wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  7. Progress in neutron activation analysis for uranium

    Institute of Scientific and Technical Information of China (English)

    杜鸿善; 李贵群; 董桂芝; 李俊兰; K.H.Chiu; C.M.Wai

    1996-01-01

    A new type of extractant, sym-dibenzo-16-crown-5-oxyhydroxamic acid (HL), is introduced. The extractions of UO22+, Na+, K+, Sr2+, Ba2+ and Br- were studied with HL in chloroform. The results obtained show that UO22+ can be quantitatively extracted at pH values above 5, whereas the extractions of K+, Na+, Sr2+, Ba2+ and Br- are negligible in the pH range of 2-7. The dependence of the distribution ratio of U(VI) on both the concentration of HL and the pH is linear, with the same slope of 2 in both cases. This suggests that U(VI) forms a 1:2 complex with the ligand. Uranium(VI) can be selectively separated and concentrated from interfering elements such as Na, K, Sr and Br by solvent extraction with HL under specific conditions. The recovery of uranium is nearly 100% and the radionuclear purity of uranium is greater than 99.99%. Therefore, neutron activation analysis has greatly improved sensitivity and accuracy for the detection of trace uranium in seawater.
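
    The slope-analysis step reported above can be illustrated numerically. This is a hedged sketch with idealized synthetic data (not the paper's measurements): if log D rises linearly with pH with slope 2 at fixed [HL], a least-squares fit recovers the 1:2 metal:ligand stoichiometry.

```python
import numpy as np

# Synthetic solvent-extraction data (illustrative only): for a 1:2
# U(VI):ligand complex, log D = const + 2*pH at fixed ligand concentration.
pH = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
log_D = -6.0 + 2.0 * pH          # idealized, noise-free values

# Least-squares linear fit; a slope of 2 implies two ligands per U(VI)
slope, intercept = np.polyfit(pH, log_D, 1)
print(round(slope, 2))
```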

  8. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to identify the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some widely used models based on hydrological and river-modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models, and possible improvements through 3D modeling, are also discussed. It is found that HEC-RAS and the FLO-2D model are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found; these can be improved through a 3D model. Therefore, 3D models were found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D floodplain model be developed by considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance understanding of the causes and effects of flooding.

  9. Function Analysis and Decomposition using Function Analysis Systems Technique

    Energy Technology Data Exchange (ETDEWEB)

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high-demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Miles' function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high-level, or objective, function into secondary and lower-level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  11. Progress of Space Charge Research on Oil-Paper Insulation Using Pulsed Electroacoustic Techniques

    Directory of Open Access Journals (Sweden)

    Chao Tang

    2016-01-01

    Full Text Available This paper focuses on the space charge behavior in oil-paper insulation systems used in power transformers. It begins with the importance of understanding the space charge behavior in oil-paper insulation systems, followed by an introduction to the pulsed electroacoustic (PEA) technique. After that, the research progress on the space charge behavior of oil-paper insulation over the past twenty years is critically reviewed. Some important aspects such as the environmental conditions and the acoustic wave recovery need to be addressed to acquire more accurate space charge measurement results. Some breakthroughs on the space charge behavior of oil-paper insulation materials by the research team at the University of Southampton are presented. Finally, future work on space charge measurement of oil-paper insulation materials is proposed.

  12. Recent progress in the melt-process technique of high-temperature superconductors

    CERN Document Server

    Ikuta, H; Mizutani, U

    1999-01-01

    Recently, the performance of high-temperature superconductors prepared by the melt-process technique has been greatly improved. This progress was accomplished by the addition of Ag into the starting materials of the Sm-Ba-Cu-O system, which prevents the formation of severe macro-sized cracks in the finished samples. The magnetic flux density trapped by this material has now reached 9 T at 25 K, which is comparable to the magnetic flux density produced by ordinary superconducting magnets. The amount of magnetic flux density that can be trapped by the sample is limited by the mechanical strength rather than the superconducting properties of the material. Increasing the mechanical strength of the material is important both for further improvement of the material properties and for ensuring reliability of the material in practical applications. (20 refs).

  13. Progress in the biosensing techniques for trace-level heavy metals.

    Science.gov (United States)

    Mehta, Jyotsana; Bhardwaj, Sanjeev K; Bhardwaj, Neha; Paul, A K; Kumar, Pawan; Kim, Ki-Hyun; Deep, Akash

    2016-01-01

    Diverse classes of sensors have been developed over the past few decades for on-site detection of heavy metals. Most of these sensor systems have exploited optical, piezoelectric, ion-selective electrode, and electrochemical measurement techniques. As such, numerous efforts have been made to explore the role of biosensors in the detection of heavy metals, based on well-known interactions between heavy metals and biomolecules (e.g. proteins, peptides, enzymes, antibodies, whole cells, and nucleic acids). In this review, we cover the recent progress made on different types of biosensors for the detection of heavy metals. Our major focus is the use of biomolecules for constructing these biosensors. The discussion is extended further to cover the biosensors' performance along with challenges and opportunities for practical utilization.

  14. Silicon ribbon growth by a capillary action shaping technique. Quarterly technical progress report No. 2

    Energy Technology Data Exchange (ETDEWEB)

    Schwuttke, G.H.; Ciszek, T.F.; Kran, A.

    1975-01-01

    Progress during the second quarter of the contractual effort is described. The work performed related mainly to ribbon growth by a capillary action shaping technique and to ribbon characterization. Progress in the crystal growth area includes the evaluation of 10 potential die materials other than carbon and process development for 25-mm-wide ribbon. From the die study it is concluded that boron carbide, silicon carbide, and silicon nitride may warrant further investigation as die materials. Process development for 25-mm ribbon growth resulted in ribbons of superior surface quality. Potential ribbon-growth problems encountered and discussed include a boron doping anomaly and frozen-in stresses in ribbons. The characterization effort concentrated on the development of a solar-cell process to be used for ribbon characterization. Material requirements and detailed process procedures are given. Solar cells fabricated by this process compare favorably with commercially available solar cells. A transmission electron microscopy study of planar boundaries frequently observed in ribbon crystals is reported. (auth)

  15. The critical barrier to progress in dentine bonding with the etch-and-rinse technique

    Science.gov (United States)

    Brackett, M.G.; Li, N.; Brackett, W.W.; Sword, R.J.; Qi, Y.P.; Niu, L.N.; Pucci, C.R.; Dib, A.; Pashley, D.H.; Tay, F.R.

    2011-01-01

    Objectives The lack of durability in resin–dentine bonds led to the use of chlorhexidine as an MMP inhibitor to prevent the degradation of hybrid layers. Biomimetic remineralisation is a concept-proven approach to preventing the degradation of resin–dentine bonds. The purpose of this study is to examine the integrity of aged resin–dentine interfaces created with a nanofiller-containing etch-and-rinse adhesive after the application of these two approaches. Methods The more established MMP-inhibition approach was examined using a parallel in vivo and in vitro ageing design to facilitate comparison with the biomimetic remineralisation approach using an in vitro ageing design. Specimens bonded without chlorhexidine exhibited extensive degradation of the hybrid layer after 12 months of in vivo ageing. Results Dissolution of nanofillers could be seen within a water-rich zone within the adhesive layer. Although specimens bonded with chlorhexidine exhibited intact hybrid layers, water-rich regions remained in those hybrid layers and degradation of nanofillers occurred within the adhesive layer. Specimens subjected to in vitro biomimetic remineralisation followed by in vitro ageing demonstrated intrafibrillar collagen remineralisation within hybrid layers and deposition of mineral nanocrystals in nanovoids within the adhesive. Conclusions These findings identify the lack of an inherent mechanism to remove water from resin–dentine interfaces as the critical barrier to progress in bonding with the etch-and-rinse technique. The experimental biomimetic remineralisation strategy offers a creative solution for incorporating a progressive hydration mechanism to achieve this goal, which warrants its translation into a clinically applicable technique. PMID:21215788

  16. Progression of Stellar Intensity Interferometry techniques using 3 meter telescopes at StarBase-Utah

    Science.gov (United States)

    Matthews, Nolan; Kieda, Dave; Lebohec, Stephan

    2015-04-01

    The emergence of large air Cherenkov telescope arrays has opened up the potential for high-resolution imaging of stellar surfaces using intensity interferometry techniques. Stellar Intensity Interferometry (SII) allows coverage of the optical and ultraviolet frequency bands, which are traditionally inaccessible to classical Michelson interferometry. The relative insensitivity to atmospheric turbulence allows unprecedented angular resolution scales, as the baselines between telescopes can be made very large (>100 m) without the need for the precise spatial resolution required by Michelson interferometry. In this talk I will illustrate the science capabilities of the SII technique and describe the progress achieved in developing a modern Stellar Intensity Interferometry system with a pair of 3 meter diameter optical telescopes located at StarBase-Utah. In particular, I will discuss the current status of the StarBase-Utah observatory and present results from two-telescope low-frequency optical correlation observations of the optical Crab pulsar. These measurements provide a first step towards actual intensity interferometry observations and establish the working condition of the StarBase-Utah telescopes.

  17. [Research progress on urban carbon fluxes based on eddy covariance technique].

    Science.gov (United States)

    Liu, Min; Fu, Yu-Ling; Yang, Fang

    2014-02-01

    Land use change and fossil fuel consumption due to urbanization have had significant effects on the global carbon cycle and climate change. Accurate estimation and understanding of the carbon budget and its characteristics are prerequisites for studying the carbon cycle and its driving mechanisms in urban systems. Based on the theory of the eddy covariance (EC) technique and the characteristics of the atmospheric boundary layer and carbon cycle in urban areas, this study systematically reviewed the principles of CO2 flux monitoring in urban systems with the EC technique, and then summarized the problems faced in urban CO2 flux monitoring and the methods for data processing and further assessment. The main research progress on urban carbon fluxes with the EC technique was also illustrated. The results showed that the urban surface mostly acts as a net carbon source. The CO2 exchange between the urban surface and the atmosphere shows obvious diurnal, weekly and seasonal variation driven by vehicle exhaust, domestic heating and vegetation respiration. However, there still exist great uncertainties in urban flux measurement and its interpretation due to the high spatial heterogeneity and complex distribution of carbon sources/sinks in urban environments. In the end, we suggest that further research on the EC technique and data assessment in complex urban areas should be strengthened. It is also requisite to develop models of the urban carbon cycle on a systems basis, to investigate the influencing mechanisms and variability of the urban carbon cycle at regional scale with spatial analysis techniques.
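
    The core EC calculation that this review builds on can be sketched in a few lines. This is a hedged, minimal illustration with synthetic numbers (it omits the despiking, coordinate rotation, and density corrections real urban processing requires): the turbulent flux is the covariance of fluctuations in vertical wind speed and CO2 concentration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 36000                          # e.g. one hour of 10 Hz sonic/IRGA data
w = rng.normal(0.0, 0.3, n)        # vertical wind speed (m/s), synthetic
c = 15.0 + 0.5 * w + rng.normal(0.0, 0.2, n)   # CO2 density, synthetic

# Reynolds decomposition: flux F_c = mean(w' * c')
w_prime = w - w.mean()
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)

# A positive covariance means net upward CO2 transport: a surface source.
print(flux > 0)
```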

  18. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications: now expanded and revised. This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.

  19. Adolescent baseball pitching technique: lower extremity biomechanical analysis.

    Science.gov (United States)

    Milewski, Matthew D; Õunpuu, Sylvia; Solomito, Matthew; Westwell, Melany; Nissen, Carl W

    2012-11-01

    Documentation of the lower extremity motion patterns of adolescent pitchers is an important part of understanding the pitching motion and the implications of lower extremity technique for upper extremity loads, injury and performance. The purpose of this study was to take the initial step in this process by documenting the biomechanics of the lower extremities during the pitching cycle in adolescent pitchers and to compare these findings with the published data for older pitchers. Three-dimensional motion analysis using a comprehensive lower extremity model was used to evaluate the fastball pitch technique in adolescent pitchers. Thirty-two pitchers with a mean age of 12.4 years (range 10.5-14.7 years) and at least 2 years of experience were included in this study. The pitchers showed a mean of 49 ± 12° of knee flexion of the lead leg at foot contact. They tended to maintain this position through ball release, and then extended their knee during the follow-through phase (ball release to maximal internal glenohumeral rotation). The lead leg hip rapidly progressed into adduction and flexion during the arm cocking phase, with a range of motion of 40 ± 10° adduction and 30 ± 13° flexion. The lead hip mean peak adduction velocity was 434 ± 83°/s and flexion velocity was 456 ± 156°/s. Simultaneously, the trailing leg hip rapidly extended, approaching a mean peak extension of -8 ± 5° at 39% of the pitch cycle, which is close to passive range of motion constraints. Peak hip abduction of the trailing leg at foot contact was -31 ± 12°, which also approached passive range of motion constraints. Differences and similarities were also noted between adolescent and adult lower extremity kinematics; however, a more comprehensive analysis using similar methods is needed for a complete comparison.

  20. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov random field (MRF) modeling, watershed segmentation and region merging was presented for performing image segmentation and edge detection tasks. It first applies edge detection to obtain a Difference In Strength (DIS) map, computed for each pixel to characterize all edges (weak or strong) in the image. An initial segmentation is then obtained using K-means clustering and the minimum-distance rule. The region process is modeled by an MRF, in which the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels; this yields an image of distinct intensity regions. The DIS map serves as prior knowledge about likely region boundaries for this MRF step, which produces an image containing both edge and region information. Gradient values are then calculated and the watershed technique is applied to improve the segmentation. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated, and the final edge map is obtained by a merging process based on average region intensity values. Common edge detectors applied to the MRF-segmented image are used for comparison. The combined segmentation and edge detection result yields one closed boundary per actual region in the image.
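
    The per-pixel edge-strength step can be illustrated with plain NumPy. This is a hedged sketch using a generic Sobel gradient magnitude as a stand-in for the DIS map (the abstract does not specify the authors' exact DIS definition):

```python
import numpy as np

def edge_strength(img: np.ndarray) -> np.ndarray:
    """Per-pixel gradient magnitude, a stand-in for a DIS-like edge map."""
    img = img.astype(float)
    # Sobel kernels for horizontal and vertical gradients
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            win = pad[i:i + 3, j:j + 3]   # 3x3 neighborhood of pixel (i, j)
            gx[i, j] = np.sum(win * kx)
            gy[i, j] = np.sum(win * ky)
    return np.hypot(gx, gy)

# A vertical step image: the strongest response sits on the step boundary.
img = np.zeros((8, 8))
img[:, 4:] = 1.0
dis = edge_strength(img)
print(int(np.argmax(dis[4])))   # column index of maximal edge strength
```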

  1. Classification Techniques for Multivariate Data Analysis.

    Science.gov (United States)

    1980-03-28

    …analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern… the determinantal equation: |B - λW| = 0 (42). The solutions λi are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non… the Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor…
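
    The determinantal equation |B - λW| = 0 in the snippet above is the generalized eigenvalue problem of discriminant analysis. A hedged sketch with small synthetic matrices (B and W below are made up, not the report's data): the roots λi are exactly the eigenvalues of W⁻¹B.

```python
import numpy as np

# Between-groups (B) and within-groups (W) scatter matrices, synthetic:
B = np.array([[4.0, 2.0], [2.0, 3.0]])
W = np.array([[2.0, 0.0], [0.0, 1.0]])

# |B - lambda*W| = 0  is equivalent to the eigenproblem of W^{-1} B
lams = np.linalg.eigvals(np.linalg.inv(W) @ B)

# Check: each root makes the determinant vanish
for lam in lams:
    assert abs(np.linalg.det(B - lam * W)) < 1e-9
print(np.sort(lams.real).round(4))
```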

  2. Trends and Techniques in Visual Gaze Analysis

    CERN Document Server

    Stellmach, Sophie; Dachselt, Raimund; Lindley, Craig A

    2010-01-01

    Visualizing gaze data is an effective way to quickly interpret eye tracking results. This paper presents a study investigating the benefits and limitations of visual gaze analysis among eye tracking professionals and researchers. The results were used to create a tool for visual gaze analysis within a Master's project.

  3. Analysis of Gopher Tortoise Population Estimation Techniques

    Science.gov (United States)

    2005-10-01

    …terrestrial reptile that was once found throughout the southeastern United States from North Carolina into Texas. However, due to numerous factors… (et al. 2000, Waddle 2000). Solar energy is used for thermoregulation and egg incubation. Also, tortoises are grazers (Garner and Landers 1981)… "Evaluation and review of field techniques used to study and manage gopher tortoises." Pages 205-215 in Management of amphibians, reptiles, and small mammals…

  4. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Mirizzi, F. [Associazione EURATOM-ENEA sulla Fusione, Consorzio CREATE, Università degli Studi di Napoli Federico II, Via Claudio 21, 80125, Napoli (Italy)

    2014-02-12

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to assure satisfactory performance and lifetime. In the design of experimental facilities such as tokamaks, aimed mainly at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed to deliver electrical energy steadily to commercial grids, so the RAMI aspects will be of absolute relevance from the initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in this paper.
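
    The availability side of RAMI can be made concrete with the standard steady-state formula. A hedged sketch (the unit names and MTBF/MTTR numbers below are illustrative assumptions, not values from the DEMO LHCD analysis): the availability of a repairable unit is MTBF/(MTBF + MTTR), and for units in series the availabilities multiply.

```python
# Steady-state availability of repairable units: A = MTBF / (MTBF + MTTR).
# All names and numbers are illustrative, not from the DEMO LHCD analysis.
units = {
    "klystron":  (8000.0, 100.0),    # (MTBF hours, MTTR hours), assumed
    "waveguide": (20000.0, 50.0),
    "launcher":  (15000.0, 200.0),
}

def availability(mtbf: float, mttr: float) -> float:
    return mtbf / (mtbf + mttr)

# Series system: every unit must be up, so unit availabilities multiply.
a_system = 1.0
for name, (mtbf, mttr) in units.items():
    a_system *= availability(mtbf, mttr)
print(round(a_system, 4))
```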

  6. Multiuser detection and independent component analysis-Progress and perspective

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The latest progress in multiuser detection and independent component analysis (ICA) is reviewed systematically. Two novel classes of multiuser detection methods based on ICA algorithms and feedforward neural networks are then proposed. Theoretical analysis and computer simulation show that ICA algorithms are effective for detecting multiuser signals in code-division multiple-access (CDMA) systems. The performance of these methods is not entirely identical across different channels, but all of them are robust, efficient, fast and suitable for real-time implementation.
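
    The blind-separation idea behind ICA-based detection can be sketched with a minimal pure-NumPy FastICA. This is a hedged illustration, not the paper's algorithm: two synthetic "user" signals are linearly mixed (standing in for a 2-user channel) and recovered up to sign, order and scale.

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Minimal symmetric FastICA with a tanh nonlinearity (sketch only)."""
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))         # whiten: unit-variance, uncorrelated
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[0], X.shape[0]))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        W = (G @ Z.T) / Z.shape[1] - np.diag((1.0 - G**2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)          # symmetric decorrelation step
        W = U @ Vt
    return W @ Z                             # estimated source signals

# Two synthetic "user" signals mixed by an assumed 2x2 channel matrix
t = np.linspace(0.0, 8.0, 2000)
S = np.vstack([np.sin(2 * np.pi * t),
               np.sign(np.sin(2 * np.pi * 0.37 * t + 1.0))])
X = np.array([[1.0, 0.6], [0.5, 1.0]]) @ S
Y = fastica(X)

# Each recovered component should correlate strongly with one source
C = np.abs(np.corrcoef(np.vstack([S, Y]))[:2, 2:])
print(C.max(axis=1) > 0.9)
```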

  7. [Research progresses of anabolic steroids analysis in doping control].

    Science.gov (United States)

    Long, Yuanyuan; Wang, Dingzhong; Li, Ke'an; Liu, Feng

    2008-07-01

    Anabolic steroids, a class of physiologically active substances, are widely abused to improve athletic performance in human sports. They have been forbidden in sports by the International Olympic Committee since 1983. Since then, many researchers have focused their attention on the establishment of reliable detection methods. In this paper, we review the progress of different analytical methods for anabolic steroids since 2002, such as gas chromatography-mass spectrometry, liquid chromatography-mass spectrometry, immunoassay, electrochemical analysis and mass spectrometry. The development prospects of anabolic steroid analysis are also discussed.

  8. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of the knowledge available in an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often necessary for making informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, and society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  9. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the binary-format raw message content in order to obtain the information it contains. This requires protocol specifications based on the TCP/IP protocol stack, so that the captured data can be restored according to the protocol format and content at each protocol layer: the actual data transferred as well as the application tier.
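
    Layer-by-layer restoration of a packet can be sketched with stdlib struct parsing. This is a hedged illustration using a hand-built IPv4 header (not Snort's actual decoder, and the field values are invented):

```python
import struct

# A minimal 20-byte IPv4 header, built by hand for illustration only.
header = struct.pack(
    "!BBHHHBBH4s4s",
    (4 << 4) | 5,              # version=4, IHL=5 (20-byte header)
    0,                         # type of service
    40,                        # total length
    0x1C46,                    # identification
    0x4000,                    # flags + fragment offset (don't fragment)
    64,                        # TTL
    6,                         # protocol: 6 = TCP
    0,                         # checksum (left zero in this sketch)
    bytes([192, 168, 0, 1]),   # source address
    bytes([10, 0, 0, 99]),     # destination address
)

# Decode it back, the way a sniffer restores each protocol layer.
ver_ihl, tos, length, ident, frag, ttl, proto, csum, src, dst = \
    struct.unpack("!BBHHHBBH4s4s", header)
print(ver_ihl >> 4, proto, ".".join(map(str, src)))  # 4 6 192.168.0.1
```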

  10. Uncertainty Analysis Technique for OMEGA Dante Measurements

    Energy Technology Data Exchange (ETDEWEB)

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums) at X-ray energies between 50 eV and 10 keV. It is one of the main diagnostics installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, together with an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in the unfold algorithm and the error from the absolute calibration of each channel into a one-sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
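
    The Monte Carlo parameter-variation idea can be sketched generically. A hedged illustration (a toy linear "unfold" and made-up channel values, not Dante's actual algorithm or calibration data): perturb each channel within its one-sigma calibration error many times, push each trial set through the unfold, and take the spread of the resulting fluxes as the error bar.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy measurement: 5 channel voltages with one-sigma fractional calibration
# errors. All numbers are illustrative assumptions.
volts = np.array([1.2, 0.8, 2.5, 1.9, 0.6])
sigma = np.array([0.05, 0.04, 0.06, 0.05, 0.07])

def unfold(v):
    # Stand-in for the real unfold algorithm: a weighted sum -> total flux.
    weights = np.array([1.0, 1.5, 0.8, 1.2, 2.0])
    return float(np.dot(weights, v))

# 1000 test voltage sets drawn from per-channel Gaussian error functions.
trials = volts * (1.0 + sigma * rng.standard_normal((1000, volts.size)))
fluxes = np.array([unfold(v) for v in trials])

flux, err = fluxes.mean(), fluxes.std(ddof=1)   # central value and error bar
print(err > 0 and abs(flux - unfold(volts)) < 3 * err)
```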

  11. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; and plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  12. Cognitive task analysis: Techniques applied to airborne weapons training

    Energy Technology Data Exchange (ETDEWEB)

    Terranova, M.; Seamster, T.L.; Snyder, C.E.; Treitler, I.E. (Oak Ridge National Lab., TN (USA); Carlow Associates, Inc., Fairfax, VA (USA); Martin Marietta Energy Systems, Inc., Oak Ridge, TN (USA); Tennessee Univ., Knoxville, TN (USA))

    1989-01-01

    This is an introduction to cognitive task analysis as it may be used in Naval Air Systems Command (NAVAIR) training development. The focus of a cognitive task analysis is human knowledge, and its methods of analysis are those developed by cognitive psychologists. This paper explains the role that cognitive task analysis can play in training development and presents the findings from a preliminary cognitive task analysis of airborne weapons operators. Cognitive task analysis is a collection of powerful techniques that are quantitative, computational, and rigorous. The techniques are currently not in wide use in the training community, so examples of this methodology are presented along with the results. 6 refs., 2 figs., 4 tabs.

  13. Comparison of Hydrogen Sulfide Analysis Techniques

    Science.gov (United States)

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  14. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a time-domain global scheme and a frequency-domain scheme for operational modal identification from output-only data, coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.
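    The property the scheme relies on, that the correlation function of a randomly excited structure decays like its impulse response, can be demonstrated on a simulated single-mode system. The AR(2) "structure", its 5 Hz modal frequency, and the crude FFT-peak frequency estimate below are illustrative assumptions, not the authors' estimator.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulate a lightly damped single-mode "structure" as an AR(2) process
    # driven by white noise, a stand-in for an operating response measurement.
    fs = 100.0                    # sampling rate, Hz (assumed)
    f_true = 5.0                  # modal frequency, Hz (assumed)
    r = 0.99                      # pole radius, i.e. light damping
    w = 2 * np.pi * f_true / fs
    a1, a2 = 2 * r * np.cos(w), -r**2

    n = 20000
    x = np.zeros(n)
    e = rng.normal(size=n)
    for i in range(2, n):
        x[i] = a1 * x[i-1] + a2 * x[i-2] + e[i]

    # The correlation function of the response has the same damped-sinusoid
    # form as the impulse response, so standard modal fitting applies to it.
    max_lag = 1024
    corr = np.array([np.dot(x[:n-k], x[k:]) / (n - k) for k in range(max_lag)])

    # Crude modal-frequency estimate: FFT peak of the correlation function.
    spec = np.abs(np.fft.rfft(corr * np.hanning(max_lag)))
    freqs = np.fft.rfftfreq(max_lag, d=1/fs)
    f_est = freqs[np.argmax(spec[1:]) + 1]
    print(f"estimated modal frequency: {f_est:.2f} Hz")
    ```

    A real implementation would fit complex exponentials to `corr` (e.g. by a time-domain polyreference method) instead of reading off an FFT peak.
    
    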

  15. [Progress on detection and analysis method of endocrine disrupting compounds].

    Science.gov (United States)

    Du, Hui-Fang; Yan, Hui-Fang

    2005-07-01

    EDCs are a new generation of environmental pollutants of global concern. They may cause adverse effects mainly in the endocrine system and nervous system, etc. To assess the hazard EDCs pose to health accurately, we need to know their distribution and levels in the environment. In this paper, pretreatment techniques for different matrices and methods for the detection and analysis of EDCs are reviewed, and future prospects for detection and analysis methods are also discussed.

  16. A comparison of wavelet analysis techniques in digital holograms

    Science.gov (United States)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis, and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques: mean filtering, median filtering, and Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation and resolution preservation.
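    A minimal sketch of discrete-wavelet denoising of the kind compared in the study, using a hand-rolled one-level Haar transform with soft thresholding on a synthetic 1-D signal. The signal, noise level, and threshold rule are assumptions; real hologram reconstructions are 2-D and complex-valued, and the paper uses deeper multi-level transforms.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def haar_dwt(x):
        # One level of the orthonormal Haar transform: approximation + detail.
        a = (x[0::2] + x[1::2]) / np.sqrt(2)
        d = (x[0::2] - x[1::2]) / np.sqrt(2)
        return a, d

    def haar_idwt(a, d):
        x = np.empty(2 * a.size)
        x[0::2] = (a + d) / np.sqrt(2)
        x[1::2] = (a - d) / np.sqrt(2)
        return x

    def soft_threshold(c, t):
        return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

    # Noisy test signal (a stand-in for one line of a hologram reconstruction).
    n = 1024
    clean = np.sin(np.linspace(0, 8 * np.pi, n))
    noisy = clean + rng.normal(scale=0.3, size=n)

    # Denoise: threshold only the detail coefficients, keep the approximation.
    a, d = haar_dwt(noisy)
    t = 0.3 * np.sqrt(2 * np.log(n))   # universal threshold for sigma = 0.3
    denoised = haar_idwt(a, soft_threshold(d, t))

    err_noisy = np.mean((noisy - clean) ** 2)
    err_denoised = np.mean((denoised - clean) ** 2)
    print(f"MSE before: {err_noisy:.4f}, after: {err_denoised:.4f}")
    ```

    Because the smooth signal lives almost entirely in the approximation band, thresholding the detail band removes noise with little content degradation, which is exactly the trade-off the study measures.
    
    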

  17. Coding technique with progressive reconstruction based on VQ and entropy coding applied to medical images

    Science.gov (United States)

    Martin-Fernandez, Marcos; Alberola-Lopez, Carlos; Guerrero-Rodriguez, David; Ruiz-Alzola, Juan

    2000-12-01

    In this paper we propose a novel lossless coding scheme for medical images that allows the final user to switch between a lossy and a lossless mode. This is done by means of a progressive reconstruction philosophy (which can be interrupted at will), so we believe that our scheme gives a way to trade off between the accuracy needed for medical diagnosis and the information reduction needed for storage and transmission. We combine vector quantization, run-length bit-plane coding and entropy coding. Specifically, the first step is a vector quantization procedure; the centroid codes are Huffman-coded making use of a set of probabilities that are calculated in the learning phase. The image is reconstructed at the coder in order to obtain the error image; this second image is divided into bit planes, which are then run-length and Huffman coded. A second statistical analysis is performed during the learning phase to obtain the parameters needed in this final stage. Our coder is currently trained for hand radiographs and fetal echographies. We compare our results for these two types of images to classical results on bit-plane coding and the JPEG standard. Our coder turns out to outperform both of them.
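    The bit-plane plus run-length stage of such a scheme can be sketched as follows. The tiny "error image" is invented, and the final Huffman stage is omitted for brevity; only the lossless, progressive (most-significant-plane-first) structure is shown.

    ```python
    import numpy as np

    def rle_encode(bits):
        # Run-length encode a binary sequence as (first_bit, run lengths).
        runs, count, prev = [], 1, bits[0]
        for b in bits[1:]:
            if b == prev:
                count += 1
            else:
                runs.append(count)
                count, prev = 1, b
        runs.append(count)
        return bits[0], runs

    def rle_decode(first_bit, runs):
        out, bit = [], first_bit
        for r in runs:
            out.extend([bit] * r)
            bit ^= 1
        return np.array(out, dtype=np.uint8)

    # Toy 8-bit "error image" (the residual after vector quantization in the
    # paper's scheme); values here are just a small synthetic block.
    img = np.array([[3, 3, 2, 0], [1, 0, 0, 0], [0, 0, 1, 2], [7, 6, 0, 0]],
                   dtype=np.uint8)
    flat = img.flatten()

    # Split into bit planes (MSB first) and run-length code each plane.
    planes = [((flat >> k) & 1).astype(np.uint8) for k in range(7, -1, -1)]
    coded = [rle_encode(list(p)) for p in planes]

    # Progressive reconstruction: decode plane by plane, most significant first,
    # so the image sharpens as more planes arrive and ends bit-exact.
    recon = np.zeros_like(flat)
    for k, (first, runs) in zip(range(7, -1, -1), coded):
        recon |= rle_decode(first, runs) << k

    print("lossless round trip:", np.array_equal(recon, flat))
    ```

    Interrupting the loop after a few planes yields the lossy preview mode; running it to completion recovers the error image exactly, which is what makes the overall scheme lossless.
    
    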

  18. An analysis technique for microstrip antennas

    Science.gov (United States)

    Agrawal, P. K.; Bailey, M. C.

    1977-01-01

    The paper presents a combined numerical and empirical approach to the analysis of microstrip antennas over a wide range of frequencies. The method involves representing the antenna by a fine wire grid immersed in a dielectric medium and then using Richmond's reaction formulation (1974) to evaluate the piecewise sinusoidal currents on the grid segments. The calculated results are then modified to account for the finite dielectric discontinuity. The method is applied to round and square microstrip antennas.

  19. Multivariate techniques of analysis for ToF-E recoil spectrometry data

    Energy Technology Data Exchange (ETDEWEB)

    Whitlow, H.J.; Bouanani, M.E.; Persson, L.; Hult, M.; Jonsson, P.; Johnston, P.N. [Lund Institute of Technology, Solvegatan, (Sweden), Department of Nuclear Physics; Andersson, M. [Uppsala Univ. (Sweden). Dept. of Organic Chemistry; Ostling, M.; Zaring, C. [Royal institute of Technology, Electrum, Kista, (Sweden), Department of Electronics; Johnston, P.N.; Bubb, I.F.; Walker, B.R.; Stannard, W.B. [Royal Melbourne Inst. of Tech., VIC (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Multivariate statistical methods are being developed by the Australian-Swedish Recoil Spectrometry Collaboration for quantitative analysis of the wealth of information in Time of Flight (ToF) and energy dispersive recoil spectrometry. An overview is presented of progress made in the use of multivariate techniques for energy calibration, separation of mass-overlapped signals and simulation of ToF-E data. 6 refs., 5 figs.

  20. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  1. Progressive Failure Analysis on the Single Lap Bonded Joints

    Directory of Open Access Journals (Sweden)

    Kadir TURAN

    2010-03-01

    Full Text Available In this study, failure analysis of the single lap bonded joint, which is used to join two composite plates with adhesive, is investigated experimentally and numerically. In the joint, epoxy resin is used as the adhesive and four-layered carbon fiber reinforced epoxy composite plates are used as the adherends. The numerical study is performed in the ANSYS software, which uses the finite element method. Numerical failure loads are obtained with a progressive failure analysis using material property degradation rules. In the failure analysis the Hashin failure criterion is used for the composite plates and the maximum principal stress criterion is used for the adhesive. The effects of adhesive thickness, overlap length and plate width on joint strength are investigated numerically. The results show that the failure load is affected by the bond face area. The results are presented with graphs and tables.

  2. Progress Toward the Analysis of the Kinetic Stabilizer Concept

    Energy Technology Data Exchange (ETDEWEB)

    Post, R F; Byers, J A; Cohen, R H; Fowler, T K; Ryutov, D D; Tung, L S

    2005-02-08

    The Kinetic Stabilizer (K-S) concept [1] represents a means for stabilizing axisymmetric mirror and tandem-mirror (T-M) magnetic fusion systems against MHD interchange instability modes. Magnetic fusion research has given us examples of axisymmetric mirror confinement devices in which radial transport rates approach the classical ''Spitzer'' level, i.e. situations in which turbulence, if present at all, is at too low a level to adversely affect the radial transport [2,3,4]. If such a low-turbulence condition could be achieved in a T-M system it could lead to a fusion power system that would be simpler, smaller, and easier to develop than one based on closed-field confinement, e.g., the tokamak, where the transport is known to be dominated by turbulence. However, since conventional axisymmetric mirror systems suffer from the MHD interchange instability, the key to exploiting this new opportunity is to find a practical way to stabilize this mode. The K-S represents one avenue to achieving this goal. The starting point for the K-S concept is a theoretical analysis by Ryutov [5]. He showed that an MHD-unstable plasma contained in an axisymmetric mirror cell can be MHD-stabilized by the presence of a low-density plasma on the expanding field lines outside the mirrors. If this plasma communicates well electrically with the plasma in the mirror cell, then this exterior plasma can stabilize the interior, confined plasma. This stabilization technique was conclusively demonstrated in the Gas Dynamic Trap (GDT) experiment [6] at Novosibirsk, Russia, at mirror-cell plasma beta values of 40 percent. The GDT operates in a high-collisionality regime. Thus the effluent plasma leaking through the mirrors, though much lower in density than the confined plasma, is still dense enough to satisfy the stabilization criterion. This would not, however, be the case in a fusion T-M with axisymmetric plug and central-cell fields. In such a case the effluent plasma would be far

  3. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  4. Silicon ribbon growth by a capillary action shaping technique. Annual report (Quarterly technical progress report No. 9)

    Energy Technology Data Exchange (ETDEWEB)

    Schwuttke, G.H.; Ciszek, T.F.; Kran, A.

    1977-10-01

    Progress on the technological and economical assessment of ribbon growth of silicon by a capillary action shaping technique is reported. Progress in scale-up of the process from 50 mm to 100 mm ribbon widths is presented, the use of vitreous carbon as a crucible material is analyzed, and preliminary tests of CVD Si/sub 3/N/sub 4/ as a potential die material are reported. Diffusion length measurements by SEM, equipment and procedure for defect display under MOS structure in silicon ribbon for lifetime interpretation, and an assessment of ribbon technology are discussed. (WHK)

  5. New techniques for emulsion analysis in a hybrid experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodama, K. (Aichi University of Education, Kariya 448 (Japan)); Ushida, N. (Aichi University of Education, Kariya 448 (Japan)); Mokhtarani, A. (University of California (Davis), Davis, CA 95616 (United States)); Paolone, V.S. (University of California (Davis), Davis, CA 95616 (United States)); Volk, J.T. (University of California (Davis), Davis, CA 95616 (United States)); Wilcox, J.O. (University of California (Davis), Davis, CA 95616 (United States)); Yager, P.M. (University of California (Davis), Davis, CA 95616 (United States)); Edelstein, R.M. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Freyberger, A.P. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Gibaut, D.B. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Lipton, R.J. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Nichols, W.R. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Potter, D.M. (Carnegie-Mellon Univers

    1994-08-01

    A new method, called graphic scanning, was developed by the Nagoya University Group for emulsion analysis in a hybrid experiment. This method enhances both speed and reliability of emulsion analysis. Details of the application of this technique to the analysis of Fermilab experiment E653 are described. ((orig.))

  6. Statistical Analysis of the Progressive Failure Behavior for Fiber-Reinforced Polymer Composites under Tensile Loading

    Directory of Open Access Journals (Sweden)

    Fang Wang

    2014-01-01

    Full Text Available An analytical approach with the help of numerical simulations based on the equivalent constraint model (ECM was proposed to investigate the progressive failure behavior of symmetric fiber-reinforced composite laminates damaged by transverse ply cracking. A fracture criterion was developed to describe the initiation and propagation of the transverse ply cracking. This work was also concerned with the statistical distribution of the critical fracture toughness values, with due consideration given to the size effect. The Monte Carlo simulation technique coupled with statistical analysis was applied to study the progressive cracking behavior of composite structures, considering the effects of lamina properties and lay-up configurations. The results deduced from the numerical procedure were in good agreement with the experimental results obtained for laminated composites formed of unidirectional fiber-reinforced laminae with different orientations.
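    A minimal sketch of the Monte Carlo ingredient, assuming a two-parameter Weibull distribution for the critical toughness with weakest-link size scaling. All parameter values are invented and the paper's ECM-based fracture criterion is not reproduced; the sketch only shows how sampled toughness values turn into cracking probabilities that depend on ply size.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Assumed two-parameter Weibull model for the transverse cracking toughness,
    # with a weakest-link size effect: a longer ply samples more flaws.
    m = 8.0          # Weibull modulus (assumed)
    g0 = 200.0       # scale toughness for the reference length (assumed units)
    ref_len = 1.0

    def sample_toughness(length, size):
        # Weakest-link scaling: scale parameter drops as (ref/length)**(1/m).
        scale = g0 * (ref_len / length) ** (1.0 / m)
        return scale * rng.weibull(m, size)

    # Monte Carlo: fraction of plies that crack when the applied energy release
    # rate reaches G_applied, for two ply lengths (longer plies crack earlier).
    n = 100_000
    G_applied = 170.0
    p_short = np.mean(sample_toughness(1.0, n) < G_applied)
    p_long = np.mean(sample_toughness(8.0, n) < G_applied)
    print(f"cracking probability: short ply {p_short:.3f}, long ply {p_long:.3f}")
    ```

    The size effect appears directly in the result: the longer ply has a lower effective toughness scale and therefore a higher cracking probability at the same load.
    
    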

  7. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; Moseley, S. H.; Porst, J.-P.; Porter, F. S.; Sadleir, J. E.; Smith, S. J.

    2016-07-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used an 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is better by a factor of two than achievable with digital optimal filters.
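    The core PCA step can be sketched with plain SVD on synthetic pulse records. The pulse shapes, noise level, and height variation below are invented stand-ins for real TES traces; the point is that the leading variance component tracks pulse height, as the paper describes.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Synthetic "X-ray pulse" records: a template pulse with varying height,
    # a small second shape component, and noise (stand-ins for real TES traces).
    n_samples, n_pulses = 256, 500
    t = np.arange(n_samples)
    template = np.exp(-t / 40.0) - np.exp(-t / 5.0)   # fast-rise, slow-fall pulse
    shape2 = np.exp(-t / 80.0) - np.exp(-t / 5.0)     # second shape component

    heights = 1.0 + 0.2 * rng.normal(size=n_pulses)
    mix = 0.05 * rng.normal(size=n_pulses)
    pulses = (np.outer(heights, template)
              + np.outer(mix, shape2)
              + 0.01 * rng.normal(size=(n_pulses, n_samples)))

    # PCA: subtract the mean pulse, then SVD. Rows of Vt are the orthogonal
    # components ordered by variance; U*S are the per-pulse scores.
    mean_pulse = pulses.mean(axis=0)
    U, S, Vt = np.linalg.svd(pulses - mean_pulse, full_matrices=False)
    scores = U * S

    # The first component's score tracks pulse height almost perfectly.
    c = np.corrcoef(scores[:, 0], heights)[0, 1]
    print(f"|corr(first PC score, pulse height)| = {abs(c):.3f}")
    ```

    In the real analysis the scores of several components are combined into an energy estimator; here the first score alone already correlates strongly with the simulated heights.
    
    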

  8. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...
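    As a concrete illustration of the quantitative-survey idea, here is a fixed-effect (inverse-variance weighted) pooling of estimates of one parameter, the standard first step before funnel-plot diagnostics. All study values are invented for illustration.

    ```python
    import numpy as np

    # Toy meta-analysis: k studies reporting the same parameter with different
    # standard errors (all values invented for illustration).
    estimates = np.array([0.42, 0.55, 0.31, 0.48, 0.60, 0.38, 0.51])
    std_errs = np.array([0.05, 0.15, 0.08, 0.20, 0.25, 0.06, 0.12])

    # Fixed-effect pooled estimate: weight each study by its inverse variance.
    w = 1.0 / std_errs**2
    pooled = np.sum(w * estimates) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    print(f"pooled estimate: {pooled:.3f} +/- {pooled_se:.3f}")

    # A funnel plot charts estimate against precision (1/SE); asymmetry around
    # the pooled value is the classic visual sign of publication bias.
    precision = 1.0 / std_errs
    ```

    The pooled standard error is smaller than any single study's, which is the basic payoff of meta-analysis; the funnel built from `estimates` versus `precision` is what the article's title refers to.
    
    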

  9. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by matching its strengths to opportunities, minimizing risks and eliminating weaknesses.

  10. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletics practice has shown that the more complex the element, the more difficult its technique. The fouetté at 720° is one of the most difficult types of fouetté, and its execution demands a high level of technique throughout the performer's rotation. Performing this element requires not only good physical condition but also mastery of correct technique by the dancer. On the basis of the corresponding kinematic theory, this study provides a qualitative analysis and quantitative assessment of fouettés at 720° performed by the best Chinese dancers. The analysis used the method of stereoscopic imaging together with theoretical analysis.

  11. Recent Progress in Application of Internal Oxidation Technique in Nb3Sn Strands

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Xingchen [Fermilab; Peng, Xuan [Hyper Tech Research Inc.; Sumption, Michael [Ohio State U.; Collings, E. W. [Ohio State U.

    2016-10-13

    The internal oxidation technique can generate ZrO2 nano particles in Nb3Sn strands, which markedly refine the Nb3Sn grain size and boost the high-field critical current density (Jc). This article summarizes recent efforts on implementing this technique in practical Nb3Sn wires and adding Ti as a dopant. It is demonstrated that this technique can be readily incorporated into the present Nb3Sn conductor manufacturing technology. Powder-in-tube (PIT) strands with fine subelements (~25 µm) based on this technique were successfully fabricated, and proper heat treatments for oxygen transfer were explored. Future work for producing strands ready for applications is proposed.

  12. Adhesive Characterization and Progressive Damage Analysis of Bonded Composite Joints

    Science.gov (United States)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2014-01-01

    The results of an experimental/numerical campaign aimed to develop progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.
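    The J-integral route to the cohesive law can be illustrated numerically: the traction-separation law is the derivative of J with respect to the crack-tip opening displacement. The exponential law and its parameters below are assumptions, not the paper's measured adhesive response; the sketch only shows the differentiate-J step.

    ```python
    import numpy as np

    # Assumed exponential traction-separation law sigma(delta) used to
    # manufacture synthetic J(delta) "measurements" (stand-ins for DIC data).
    delta = np.linspace(0, 0.1, 200)     # opening displacement, mm (assumed)
    sigma_max, delta_c = 30.0, 0.02      # peak traction (MPa) and critical opening
    true_traction = sigma_max * (delta / delta_c) * np.exp(1 - delta / delta_c)

    # J is the work of the tractions over the opening: cumulative trapezoid
    # integral, i.e. what the fracture experiment would report.
    J = np.concatenate([[0.0],
                        np.cumsum((true_traction[1:] + true_traction[:-1]) / 2
                                  * np.diff(delta))])

    # Differentiate J numerically to recover the traction-separation law.
    recovered = np.gradient(J, delta)
    err = np.max(np.abs(recovered - true_traction))
    print(f"max recovery error: {err:.3f} MPa (peak traction {sigma_max} MPa)")
    ```

    In practice the J values come from DIC displacement fields and are noisy, so the differentiation step usually needs smoothing; with clean synthetic data the law is recovered to within the discretization error (largest at the endpoints of the grid).
    
    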

  13. Design, data analysis and sampling techniques for clinical research

    OpenAIRE

    Karthik Suresh; Sanjeev V Thomas; Geetha Suresh

    2011-01-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inferences from their data. Improper application of study design and data analysis may render insufficient and improper results and conclusions. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design, and then back-translating the statistical results into relevant medical knowledge, is a real challenge. This article explains...

  14. History, progress and prospect for controlled ecological life support technique in China

    Science.gov (United States)

    Guo, Shuangsheng

    2016-07-01

    Constructing a controlled ecological life support system is an important supporting condition for carrying out manned deep-space exploration and extraterrestrial habitation and development in the future. In China, controlled ecological life support techniques have gone through a developmental process of more than twenty years, progressing from conceptual research to key unit-level techniques and key system-level integrated techniques, and from ground-based simulated tests to spaceflight demonstration tests, and have achieved many important staged results. In this paper, the present status, remaining problems and next plans in the domain of CELSS techniques in China are introduced briefly, so as to serve as a reference for promoting development of these techniques internationally.

  15. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  16. Memory Forensics: Review of Acquisition and Analysis Techniques

    Science.gov (United States)

    2013-11-01

    Memory Forensics: Review of Acquisition and Analysis Techniques. Grant Osborne, Cyber and Electronic Warfare Division, Defence Science and Technology Organisation, DSTO–GD–0770. ABSTRACT: This document presents an overview of the most common memory forensics techniques used in the ... types of digital evidence investigated include images, text, video and audio files [1]. To date, digital forensic investigations have focused on the...

  17. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques with special emphasis and a brief on other techniques developed world over for mitigating earthquake forces on the structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  18. Analysis On Classification Techniques In Mammographic Mass Data Set

    OpenAIRE

    K.K.Kavitha; Dr.A.Kangaiammal

    2015-01-01

    Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques deal with determining to which group each data instance is associated. They can deal with a wide variety of data, so that large amounts of data can be involved in processing. This paper deals with analysis of various data mining classification techniques such a...

  19. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered as alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Also, methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are studied. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  20. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  1. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
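    The age equation underlying the first sentence, with first-order error propagation for the quotient, can be written out directly. The dose and dose-rate values below are invented for illustration; the dose rate is what the nuclear analyses of minor and trace elements feed into.

    ```python
    import numpy as np

    # Age equation of luminescence dating: age = accumulated (palaeo)dose
    # divided by the annual dose rate. All values invented for illustration.
    palaeodose = 45.0      # Gy, from the luminescence measurement
    palaeodose_err = 2.0   # Gy
    dose_rate = 3.0        # Gy/ka, from nuclear analyses of K, U, Th, etc.
    dose_rate_err = 0.15   # Gy/ka

    age = palaeodose / dose_rate   # ka
    # First-order propagation for a quotient: add relative errors in quadrature.
    age_err = age * np.hypot(palaeodose_err / palaeodose, dose_rate_err / dose_rate)
    print(f"age = {age:.1f} +/- {age_err:.1f} ka")
    ```

    Radioactive disequilibrium, as mentioned in the abstract, violates the assumption of a constant dose rate and makes this simple quotient an approximation.
    
    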

  2. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting and the mold together, considering thermal contraction of the casting and thermal expansion of the mold. An analysis considering contact between the casting and the mold enables precise prediction of the stress distribution and of defects such as hot tearing. But it is difficult to generate an FEM mesh for the interface of the casting and the mold, and the mesh for the mold domain requires a great deal of computational time and memory. Consequently, we propose the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to represent the contact between the casting and the mold. In general, the volume of the mold is much larger than that of the casting, so the proposed technique greatly decreases the number of mesh elements and saves computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen and gave satisfactory results.

  3. Parallelization of events generation for data analysis techniques

    CERN Document Server

    Lazzaro, A

    2010-01-01

    With the startup of the LHC experiments at CERN, the involved community is now focusing on the analysis of the collected data. The complexity of the data analyses will be a key factor in finding possible new phenomena. For this reason many data analysis tools have been developed in recent years, implementing several data analysis techniques. The goal of these techniques is to discriminate events of interest and to measure parameters on a given input sample of events, which are themselves defined by several variables. Also particularly important is the possibility of repeating the determination of the parameters by applying the procedure to several simulated samples, which are generated using Monte Carlo techniques and the knowledge of the probability density functions of the input variables. This procedure achieves a better estimation of the results. Depending on the number of variables, complexity of their probability density functions, number of events, and number of sample to g...

  4. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes

    Science.gov (United States)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  5. Progressive abnormalities in the brain scan in adrenal leukodystrophy. [/sup 99m/Tc tracer technique

    Energy Technology Data Exchange (ETDEWEB)

    Chatterton, B.E.

    1977-11-01

    A case report is presented of a 10-yr-old boy with restless movements and deteriorated mental ability. A brain scan was performed using /sup 99m/Tc pertechnetate. A faint area of uptake appeared; 6 months later the brain scan showed more intense uptake and neurologic symptoms increased; the patient died soon after and autopsy showed leukodystrophy of the brain and adrenal atrophy. A study of previous cases indicates that adrenal leukodystrophy is a sex-linked hereditary disease in which progressive demyelination leads to dementia, cortical blindness, and spasticity. In all reported cases abnormal areas on the brain scan corresponded with pathologic changes. (HLW)

  6. The actuality and progress of whole sky infrared cloud remote sensing techniques

    Institute of Scientific and Technical Information of China (English)

    ZHANG; Ting; LIU; Lei; GAO; Taichang; HU; Shuai

    2015-01-01

    Clouds are crucial regulators of both weather and climate. Properties such as their amount, type, height, distribution and movement have an impact on the earth's radiation budget and the hydrological cycle, so cloud observation is very important. The limitations of zenith-pointing measuring instruments and whole-sky visible imagers restrict their application. A summary of the current status and applications of ground-based whole-sky infrared cloud-measuring instruments is given, and the techniques of radiometric calibration, removal of atmospheric emission, and calculation of cloud cover, amount and type are analysed, in order to promote automatic observation of the whole sky. Considering whole-sky infrared cloud-sounding theories, techniques and applications as a whole, much work remains on improving instrument performance, enhancing cloud-base-height measurement techniques and establishing an instrumental cloud-classification criterion before actual operations.

  7. Data analysis techniques for nuclear and particle physicists

    CERN Document Server

    Pruneau, Claude

    2017-01-01

    This is an advanced data analysis textbook for scientists specializing in the areas of particle physics, nuclear physics, and related subfields. As a practical guide for robust, comprehensive data analysis, it focuses on realistic techniques that account for instrumental effects. The topics are also relevant for engineers and scientists working in geophysics, chemistry, and the other physical sciences. The book serves as a reference for more senior scientists while being eminently accessible to advanced undergraduate and graduate students.

  8. Optimization Techniques for Analysis of Biological and Social Networks

    Science.gov (United States)

    2012-03-28

    systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational... analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: Implement the proposed algorithms, test and fine... exact solutions are presented. In [3], we introduce the variable objective search framework for combinatorial optimization. The method utilizes

  9. Treatment planning of adhesive additive rehabilitations: the progressive wax-up of the three-step technique.

    Science.gov (United States)

    Vailati, Francesca; Carciofo, Sylvain

    2016-01-01

    A full-mouth rehabilitation should be correctly planned from the start by using a diagnostic wax-up to reduce the potential for remakes, increased chair time, and laboratory costs. However, determining the clinical validity of an extensive wax-up can be complicated for clinicians who lack the experience of full-mouth rehabilitations. The three-step technique is a simplified approach that has been developed to facilitate the clinician's task. By following this technique, the diagnostic wax-up is progressively developed to the final outcome through the interaction between patient, clinician, and laboratory technician. This article provides guidelines aimed at helping clinicians and laboratory technicians to become more proactive in the treatment planning of full-mouth rehabilitations, by starting from the three major parameters of incisal edge position, occlusal plane position, and the vertical dimension of occlusion.

  10. What Child Analysis Can Teach Us about Psychoanalytic Technique.

    Science.gov (United States)

    Ablon, Steven Luria

    2014-01-01

    Child analysis has much to teach us about analytic technique. Children have an innate, developmentally driven sense of analytic process. Children in analysis underscore the importance of an understanding and belief in the therapeutic action of play, the provisional aspects of play, and that not all play will be understood. Each analysis requires learning a new play signature that is constantly reorganized. Child analysis emphasizes the emergence and integration of dissociated states, the negotiation of self-other relationships, the importance of co-creation, and the child's awareness of the analyst's sensibility. Child analysis highlights the robust nature of transference and how working through and repairing is related to the initiation of coordinated patterns of high predictability in the context of deep attachments. I will illustrate these and other ideas in the description of the analysis of a nine-year-old boy.

  11. Driving forces of change in environmental indicators an analysis based on divisia index decomposition techniques

    CERN Document Server

    González, Paula Fernández; Presno, Mª José

    2014-01-01

    This book addresses several index decomposition analysis methods to assess progress made by EU countries in the last decade in relation to energy and climate change concerns. Several applications of these techniques are carried out in order to decompose changes in both energy and environmental aggregates. In addition to this, a new methodology based on classical spline approximations is introduced, which provides useful mathematical and statistical properties. Once a suitable set of determinant factors has been identified, these decomposition methods allow the researcher to quantify the respec
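
A minimal example of the kind of index decomposition the book covers is the additive LMDI-I (log-mean Divisia) decomposition of an aggregate V = activity × intensity; the numbers below are invented for illustration:

```python
import math

# Additive LMDI-I sketch: decompose the change in an aggregate V = Q * I
# (activity Q times intensity I) into an activity effect and an intensity
# effect. The decomposition is exact: the effects sum to V1 - V0.

def logmean(a, b):
    """Logarithmic mean, L(a, b) = (a - b) / (ln a - ln b)."""
    return (a - b) / (math.log(a) - math.log(b)) if a != b else a

# Hypothetical two-year energy data:
Q0, Q1 = 100.0, 120.0        # activity in year 0 and year 1
I0, I1 = 0.50, 0.45          # energy intensity
V0, V1 = Q0 * I0, Q1 * I1    # aggregate energy use: 50.0 -> 54.0

L = logmean(V1, V0)
effect_activity = L * math.log(Q1 / Q0)    # change driven by more activity
effect_intensity = L * math.log(I1 / I0)   # change driven by efficiency gains
```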

  12. New Progress in High-Precision and High-Resolution Seismic Exploration Technique in Coal Industry of China

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    In the past twenty years, the proportion of coal in primary-energy consumption in China has generally been between 71.3% and 76.5%. The output of coal was 1.374 billion tons in 1996 and 1.21 billion tons in 1998, ranking first in the world. Coal in China is now mined mainly by mechanized methods, which were planned to reach 80% of production in major State-owned coal mines in 2000 according to government planning (Li et al., 1998; Tang Dejin, 1998). Compared with the USA and Australia, China has more complex coal geological structures. Based on the high-resolution seismic technique used in coal exploration, a new seismic technique with high precision and high resolution (2-D and 3-D) has been developed for detecting small geological structures in coal mine construction and production, to meet the needs of large-scale popularization of mechanized coal mining in China. The technique is low in cost and requires a relatively short period of exploration, with high precision and a wide range of applications. In the middle of the 1980s it began to be used in pre-mining coal exploration on a trial basis, and it entered a peak of exploration activity in the 1990s, providing high-precision geological results for the construction and production of the coal industry in China; it is still in the ascendant. This paper discusses recent progress and the exploration capability and application range of the technique.

  13. Reduced Incidence of Slowly Progressive Heymann Nephritis in Rats Immunized With a Modified Vaccination Technique

    Directory of Open Access Journals (Sweden)

    Arpad Z. Barabas

    2006-01-01

    Full Text Available A slowly progressive Heymann nephritis (SPHN) was induced in three groups of rats by weekly injections of a chemically modified renal tubular antigen in an aqueous medium. A control group of rats received the chemically unmodified version of the antigen in an aqueous solution. One group of SPHN rats was pre- and post-treated with weekly injections of modified immune complexes (MIC): immune complexes (ICs) made up of sonicated ultracentrifuged (u/c) rat kidney fraction 3 (rKF3) antigen and rarKF3 IgM antibodies specific against the antigen, at slight antigen excess. One group of SPHN rats was post-treated with MIC 3 weeks after the induction of the disease, and one group of SPHN animals received no treatment. The control group of rats received pre- and post-treatment with sonicated u/c rKF3.

  14. Research progress on the brewing techniques of new-type rice wine.

    Science.gov (United States)

    Jiao, Aiquan; Xu, Xueming; Jin, Zhengyu

    2017-01-15

    As a traditional alcoholic beverage, Chinese rice wine (CRW), with its high nutritional value and unique flavor, has been popular in China for thousands of years. Although traditional production methods had been used without change for centuries, numerous technological innovations in recent decades have greatly impacted the CRW industry. However, reviews of the research progress in this field are relatively few. This article provides a brief summary of recent developments in new brewing technologies for making CRW. Based on a comparison between the conventional methods and the innovative technologies of CRW brewing, three principal aspects are summarized: innovations in raw material pretreatment, optimization of fermentation, and reform of sterilization technology. Furthermore, by comparing the advantages and disadvantages of these methods, various issues related to the prospects of the CRW industry are addressed.

  15. Learning Progressions and Teaching Sequences: A Review and Analysis

    Science.gov (United States)

    Duschl, Richard; Maeng, Seungho; Sezen, Asli

    2011-01-01

    Our paper is an analytical review of the design, development and reporting of learning progressions and teaching sequences. Research questions are: (1) what criteria are being used to propose a "hypothetical learning progression/trajectory" and (2) what measurements/evidence are being used to empirically define and refine a "hypothetical learning…

  16. Developing techniques for cause-responsibility analysis of occupational accidents.

    Science.gov (United States)

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine the social responsibility and role of the groups involved in work-related accidents. The study develops an occupational accident causes tree, an occupational accident responsibility tree, and an occupational accident component-responsibility analysis worksheet; based on these, it develops cause-responsibility analysis (CRA) techniques and tests them by analyzing 100 fatal/disabling occupational accidents in the construction setting, randomly selected from all work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accident tree analysis (OATA) and occupational accident components analysis (OACA), used in parallel to determine the responsible groups and their responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation and analysis, especially for determining a detailed list of tasks, responsibilities, and their rates, and therefore for preventing work-related accidents by focusing on the responsible groups' duties.

  17. Review: Progress in research on mixing techniques for transverse injection flow fields in supersonic crossflows

    Institute of Scientific and Technical Information of China (English)

    Wei HUANG; Li YAN

    2013-01-01

    The transverse injection flow field has an important impact on the flowpath design of scramjet engines. At present, a combination of the transverse injection scheme with some other flame holder is widely employed in hypersonic propulsion systems to promote mixing between the fuel and the supersonic freestream; combustion efficiency has thereby been improved, as well as engine thrust. Research on mixing techniques for the transverse injection flow field is summarized from four aspects, namely the jet-to-crossflow pressure ratio, the geometric configuration of the injection port, the number of injection ports, and the injection angle. In conclusion, further investigation of mixing techniques for the transverse injection flow field is urgently needed, especially data mining of the quantitative analytical results for the transverse injection flow field, based on multi-objective design optimization theory.

  18. Design, data analysis and sampling techniques for clinical research.

    Science.gov (United States)

    Suresh, Karthik; Thomas, Sanjeev V; Suresh, Geetha

    2011-10-01

    Statistical analysis is an essential technique that enables medical researchers to draw meaningful inferences from their data. Improper study design or data analysis may yield insufficient or invalid results and conclusions. Converting a medical problem into a statistical hypothesis with an appropriate methodological and logical design, and then back-translating the statistical results into relevant medical knowledge, is a real challenge. This article explains various sampling methods that can be appropriately used in medical research under different scenarios and challenges.
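
One of the sampling methods such articles typically cover, proportionate stratified random sampling, can be sketched as follows (the strata and their sizes are hypothetical):

```python
import random

# Proportionate stratified random sampling: each stratum contributes to the
# sample in proportion to its share of the population. Patient IDs and
# strata below are hypothetical.

def stratified_sample(strata, total_n, seed=0):
    """strata: dict name -> list of units. Returns dict name -> sampled units."""
    rng = random.Random(seed)
    pop = sum(len(units) for units in strata.values())
    return {name: rng.sample(units, round(total_n * len(units) / pop))
            for name, units in strata.items()}

clinic = {"male": [f"M{i}" for i in range(60)],
          "female": [f"F{i}" for i in range(40)]}
sample = stratified_sample(clinic, total_n=20)   # 60:40 population -> 12:8 sample
```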

  19. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    of a given process requires analysis of the underlying mechanisms, at best, at the molecular level. To reveal these mechanisms a number of different techniques may be applied: (1) detailed physiological studies, (2) metabolic flux analysis (MFA), (3) metabolic control analysis (MCA), (4) thermodynamic......Metabolic engineering has been defined as the purposeful modification of intermediary metabolism using recombinant DNA techniques. With this definition metabolic engineering includes: (1) inserting new pathways in microorganisms with the aim of producing novel metabolites, e.g., production...... of polyketides by Streptomyces; (2) production of heterologous peptides, e.g., production of human insulin, erythropoitin, and tPA; and (3) improvement of both new and existing processes, e.g., production of antibiotics and industrial enzymes. Metabolic engineering is a multidisciplinary approach, which involves...
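
Of the techniques listed, metabolic flux analysis (MFA) lends itself to a compact sketch: at steady state the stoichiometric matrix S and the flux vector v satisfy S·v = 0, so unknown fluxes can be recovered from measured ones. The three-reaction network below is invented for illustration and is not from the reference:

```python
import numpy as np

# Minimal MFA sketch: balance the internal metabolite B of the hypothetical
# network v1: A->B, v2: B->C, v3: B->D. With v1 and v3 measured, v2 follows
# from least-squares balancing of S @ v = 0.

S = np.array([[1.0, -1.0, -1.0]])    # steady-state balance of metabolite B
measured = {0: 10.0, 2: 3.0}         # measured fluxes: v1 = 10, v3 = 3

# Split S into measured and unknown columns and solve S_u v_u = -S_m v_m.
unknown = [j for j in range(S.shape[1]) if j not in measured]
S_m = S[:, list(measured)]
S_u = S[:, unknown]
v_m = np.array(list(measured.values()))
v_u, *_ = np.linalg.lstsq(S_u, -S_m @ v_m, rcond=None)

fluxes = np.zeros(S.shape[1])
fluxes[list(measured)] = v_m
fluxes[unknown] = v_u                # v2 = 7, so that B is balanced
```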

  20. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm......The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, diaries, fruits...

  1. Types of Maize Virus Diseases and Progress in Virus Identification Techniques in China

    Institute of Scientific and Technical Information of China (English)

    Cui Yu; Zhang Ai-hong; Ren Ai-jun; Miao Hong-qin

    2014-01-01

    There are a total of more than 40 reported maize virus diseases worldwide. Six of them have reportedly occurred in China: maize rough dwarf disease, maize dwarf mosaic disease, maize streak dwarf disease, maize crimson leaf disease, maize wallaby ear disease, and corn lethal necrosis disease. This paper reviews their occurrence and distribution as well as virus identification techniques, in order to provide a basis for virus identification and diagnosis in corn production.

  2. A Comparison of Imaging Techniques to Monitor Tumor Growth and Cancer Progression in Living Animals

    Directory of Open Access Journals (Sweden)

    Anne-Laure Puaux

    2011-01-01

    Full Text Available Introduction and Purpose. Monitoring solid tumor growth and metastasis in small animals is important for cancer research. Noninvasive techniques make longitudinal studies possible, require fewer animals, and have greater statistical power. Such techniques include FDG positron emission tomography (FDG-PET), magnetic resonance imaging (MRI), and optical imaging, comprising bioluminescence imaging (BLI) and fluorescence imaging (FLI). This study compared the performance and usability of these methods in the context of mouse tumor studies. Methods. B16 tumor-bearing mice (n=4 for each study) were used to compare practicality, performance for small tumor detection, and tumor burden measurement. Using RETAAD mice, which develop spontaneous melanomas, we examined the performance of MRI (n=6 mice) and FDG-PET (n=10 mice) for tumor identification. Results. Overall, BLI and FLI were the most practical techniques tested. Both BLI and FDG-PET identified small nonpalpable tumors, whereas MRI and FLI only detected macroscopic, clinically evident tumors. FDG-PET and MRI performed well in the identification of tumors in terms of specificity, sensitivity, and positive predictive value. Conclusion. Each of the four methods has different strengths that must be understood before selecting them for use.

  3. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques for conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA, and correlation. The marked shift in the orientation of researchers towards using sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  4. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structure analysis of biomacromolecules. The problem of reconstructing a picture from identical samples polluted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.
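
The correlation-based alignment analyzed in this paper can be illustrated in one dimension: the shift between a reference signal and a noisy, translated copy is estimated from the peak of their circular cross-correlation. This is a generic sketch, not the paper's formulation:

```python
import numpy as np

# Toy alignment step: recover the translation between a reference "particle"
# and a noisy shifted copy by locating the peak of their circular
# cross-correlation, computed via FFT.

rng = np.random.default_rng(1)
n, true_shift = 128, 17
x = np.exp(-0.5 * ((np.arange(n) - 40) / 5.0) ** 2)   # reference signal
y = np.roll(x, true_shift) + 0.05 * rng.standard_normal(n)

# Circular cross-correlation of y against x; its peak sits at the shift.
xcorr = np.fft.ifft(np.fft.fft(y) * np.conj(np.fft.fft(x))).real
est_shift = int(np.argmax(xcorr))
```

With stronger or colored noise the correlation peak wanders, which is exactly the alignment error whose systematic component the paper derives.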

  5. Evaluation of Damping Using Frequency Domain Operational Modal Analysis Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Georgakis, Christos T.; Brincker, Rune

    2015-01-01

    Operational Modal Analysis (OMA) techniques provide in most cases reasonably accurate estimates of structural frequencies and mode shapes. In contrast though, they are known to often produce uncertain structural damping estimates, which is mainly due to inherent random and/or bias errors...... domain techniques, the Frequency Domain Decomposition (FDD) and the Frequency Domain Polyreference (FDPR). The response of a two degree-of-freedom (2DOF) system is numerically established with specified modal parameters subjected to white noise loading. The system identification is evaluated with well...
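
The FDD step itself is compact enough to sketch: estimate an output cross-spectral density matrix by averaging FFT segments, then take an SVD at each frequency line and pick peaks of the first singular value. The two-channel signals below are synthetic stand-ins (modes placed near 5 Hz and 12 Hz), not measured responses:

```python
import numpy as np

# Minimal Frequency Domain Decomposition (FDD) sketch: average the output
# cross-spectral density (CSD) matrix over windowed FFT segments, SVD it per
# frequency line, and read modal frequencies off peaks of the first singular
# value. The two channels below are synthetic.

rng = np.random.default_rng(0)
fs, n_seg, n_segs = 256, 512, 64
t = np.arange(n_seg) / fs

segs = []
for _ in range(n_segs):
    # random-phase modal responses plus noise, mimicking ambient excitation
    p1, p2 = rng.uniform(0, 2 * np.pi, 2)
    ch1 = np.sin(2*np.pi*5*t + p1) + 0.5 * np.sin(2*np.pi*12*t + p2)
    ch2 = 0.8 * np.sin(2*np.pi*5*t + p1) - 0.6 * np.sin(2*np.pi*12*t + p2)
    y = np.vstack([ch1, ch2]) + 0.1 * rng.standard_normal((2, n_seg))
    segs.append(np.fft.rfft(y * np.hanning(n_seg)))

Y = np.array(segs)                                  # (n_segs, 2, n_freq)
freqs = np.fft.rfftfreq(n_seg, 1 / fs)
G = np.einsum('sif,sjf->fij', Y, Y.conj()) / n_segs  # CSD matrix per frequency
s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0]
               for k in range(len(freqs))])
peak_hz = freqs[np.argmax(s1)]                      # strongest mode (~5 Hz here)
```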

  6. Analysis On Classification Techniques In Mammographic Mass Data Set

    Directory of Open Access Journals (Sweden)

    Mrs. K. K. Kavitha

    2015-07-01

    Full Text Available Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques determine the group with which each data instance is associated. They can deal with a wide variety of data, so that large amounts of data can be involved in processing. This paper analyses various data mining classification techniques, such as Decision Tree Induction, Naïve Bayes, and k-Nearest Neighbour (KNN) classifiers, on the mammographic mass dataset.
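
One of the classifiers surveyed, k-nearest neighbour, fits in a few lines. The toy two-feature dataset below is invented for illustration and is not the mammographic mass data:

```python
# Minimal k-nearest-neighbour classifier: each training instance is a
# (feature_vector, class_label) pair; a query point takes the majority label
# of its k closest training points (squared Euclidean distance).

def knn_predict(train, x, k=3):
    dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    nearest = sorted(train, key=lambda fv_y: dist(fv_y[0], x))[:k]
    labels = [y for _, y in nearest]
    return max(set(labels), key=labels.count)   # majority vote

train = [((1.0, 1.0), "benign"), ((1.2, 0.8), "benign"), ((0.9, 1.1), "benign"),
         ((3.0, 3.2), "malignant"), ((3.1, 2.9), "malignant"), ((2.8, 3.0), "malignant")]
print(knn_predict(train, (1.1, 1.0)))   # query near the first cluster
```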

  7. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires; such failures generally do not represent a great risk to personnel. However, the repairs needed to maintain the reliability of these vessels might require extensive interruption of operation, which in turn considerably impacts the profitability of the unit. Therefore the condition, progression, and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D laser scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspecting the equipment in order to generate maintenance or inspection recommendations, and comparison with previous results and baseline data. Until recently, coke drum structural analysis was traditionally performed by analyzing Stress Concentration Factors (SCF) through Finite Element Analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, a new strain analysis technique, PSI (Plastic Strain Index), was developed. This method, which is based on the API 579/ASME FFS standard failure limit, represents the state of the art in coke drum bulging severity assessment and has an excellent correlation with failure history. (author)

  8. Visual and statistical analysis of {sup 18}F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA, as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. Ten raters analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA were high, averaging 87.8% and 89.9% for visual analysis and 96.9% and 90.9% for statistical analysis using global mean normalization, respectively. With cerebellar normalization, sensitivity was 88.9% and specificity 100%. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)
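
Fleiss' kappa, the agreement statistic reported above, is easy to compute from a table of category counts per rated item. The ratings below are made up for illustration, not the study's data:

```python
import numpy as np

# Fleiss' kappa for n items each rated by r raters: counts[i, j] is the
# number of raters assigning item i to category j. Kappa compares observed
# agreement P_bar with chance agreement P_e.

def fleiss_kappa(counts):
    counts = np.asarray(counts, dtype=float)
    n, r = counts.shape[0], counts[0].sum()
    p_cat = counts.sum(axis=0) / (n * r)                  # category proportions
    P_i = (np.sum(counts ** 2, axis=1) - r) / (r * (r - 1))  # per-item agreement
    P_bar, P_e = P_i.mean(), np.sum(p_cat ** 2)
    return (P_bar - P_e) / (1 - P_e)

# 4 hypothetical scans, 5 raters, 2 diagnostic categories:
ratings = [[5, 0], [4, 1], [1, 4], [0, 5]]
k = fleiss_kappa(ratings)
```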

  9. Progress in Micro-particle Adhesion Force Measurement Techniques%微颗粒黏附力测试技术研究进展

    Institute of Scientific and Technical Information of China (English)

    钟剑; 吴超

    2012-01-01

    The adhesion and removal of micro-particles are very important for the quality control of many industrial processes. They are also directly related to environmental pollution and occupational health. The study of micro-particle adhesion force measurement techniques is helpful for controlling micro-particle adhesion pollution and removing micro-particles from surfaces. To survey progress in these techniques, the databases Ei Compendex, Ei Inspec, and Elsevier Science were searched with the subject words Particle, Adhesion, and Measurement, retrieving about 150 papers in the area published from 2001 to 2010. A statistical analysis of publication dates, authors' countries, and the measurement techniques studied is presented. The various micro-particle adhesion force measurement techniques (the AFM detachment technique, the micromechanical detachment technique, the centrifugal detachment technique, the electrostatic detachment technique, the vibration detachment technique, and the laser detachment technique) are discussed, their advantages and disadvantages are compared, and the development directions of the field are pointed out.
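
The physics behind the centrifugal detachment technique reviewed here reduces to a one-line force balance: a particle detaches once the centrifugal force m·ω²·R exceeds its adhesion force, so the spin speed at detachment measures adhesion. The particle and rotor values below are illustrative only:

```python
import math

# Centrifugal force on a spherical particle of diameter d_m and density
# `density` spun at `rpm` at radius `radius_m`. Detachment occurs when this
# force exceeds the particle's adhesion force.

def centrifugal_force(d_m, density, rpm, radius_m):
    m = density * math.pi * d_m ** 3 / 6    # sphere mass, kg
    omega = 2 * math.pi * rpm / 60.0        # angular speed, rad/s
    return m * omega ** 2 * radius_m        # newtons

# Hypothetical 10 um silica-like particle, 5 cm from the axis at 50,000 rpm:
f = centrifugal_force(d_m=10e-6, density=2000.0, rpm=50000, radius_m=0.05)
```

Sweeping the spin speed and recording the fraction of particles removed yields the adhesion force distribution, which is how such experiments are typically analysed.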

  10. Golden glazes analysis by PIGE and PIXE techniques

    Science.gov (United States)

    Fonseca, M.; Luís, H.; Franco, N.; Reis, M. A.; Chaves, P. C.; Taborda, A.; Cruz, J.; Galaviz, D.; Fernandes, N.; Vieira, P.; Ribeiro, J. P.; Jesus, A. P.

    2011-12-01

    We present an analysis of the chemical composition of two golden glazes available on the market, using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis of thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to that of the old glaze. In addition, to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  11. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).
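The core idea of the probabilistic approach described above, accounting for input uncertainties so that the power capability comes out as a distribution rather than a single value, can be illustrated with a simple Monte Carlo sketch. The model below is a deliberately simplified stand-in, not the SPACE code; the array area, efficiency, and loss figures are invented for illustration.

```python
import numpy as np

# Hypothetical sketch: propagate input uncertainties through a toy
# power-capability model by Monte Carlo sampling.
rng = np.random.default_rng(42)
N = 100_000

# Assumed uncertain inputs: array efficiency and distribution losses
efficiency = rng.normal(loc=0.14, scale=0.005, size=N)   # fraction
loss = rng.normal(loc=0.03, scale=0.002, size=N)         # fraction
area = 2500.0        # m^2, fixed (illustrative)
insolation = 1367.0  # W/m^2, fixed

# Each sample yields one possible power capability
power_kw = area * insolation * efficiency * (1.0 - loss) / 1000.0

mean = power_kw.mean()
p5, p95 = np.percentile(power_kw, [5, 95])
print(f"mean {mean:.0f} kW, 90% interval [{p5:.0f}, {p95:.0f}] kW")
```

The deterministic analysis corresponds to evaluating the model once at the mean inputs; the sampled distribution additionally quantifies how far the capability can plausibly deviate from that single value.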

  12. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  13. Pulsed Photonuclear Assessment (PPA) Technique: CY 04 Year-end Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    J.L. Jones; W.Y. Yoon; K.J. Haskell; D.R. Norman; J.M. Zabriskie; J.W. Sterbentz; S.M. Watson; J.T. Johnson; B.D. Bennett; R.W. Watson; K. L. Folkman

    2005-05-01

    Idaho National Laboratory (INL), along with Los Alamos National Laboratory (LANL) and Idaho State University's Idaho Accelerator Center (IAC), are developing an electron accelerator-based, photonuclear inspection technology for the detection of smuggled nuclear material within air-, rail-, and especially, maritime-cargo transportation containers. This CY04 report describes the latest developments and progress with the development of the Pulsed Photonuclear Assessment (PPA) nuclear material inspection system, such as: (1) the identification of an optimal range of electron beam energies for interrogation applications, (2) the development of a new “cabinet safe” electron accelerator (i.e., Varitron II) to assess “cabinet safe-type” operations, (3) the numerical and experimental validation responses of nuclear materials placed within selected cargo configurations, (4) the fabrication and utilization of Calibration Pallets for inspection technology performance verification, (5) the initial technology integration of basic radiographic “imaging/mapping” with induced neutron and gamma-ray detection, (6) the characterization of electron beam-generated photon sources for optimal performance, (7) the development of experimentally determined Receiver Operating Characteristic curves, and (8) several other system component assessments. This project is supported by the Department of Homeland Security and is a technology component of the Science & Technology Active Interrogation Portfolio entitled “Photofission-based Nuclear Material Detection and Characterization.”

  14. Progress of new label-free techniques for biosensors: a review.

    Science.gov (United States)

    Sang, Shengbo; Wang, Yajun; Feng, Qiliang; Wei, Ye; Ji, Jianlong; Zhang, Wendong

    2016-01-01

    The detection techniques used in biosensors can be broadly classified into label-based and label-free. Label-based detection relies on the specific properties of labels for detecting a particular target. In contrast, label-free detection is suitable for target molecules that are not labeled or for the screening of analytes which are not easy to tag. Also, more types of label-free biosensors have emerged with developments in biotechnology. The latest developed techniques in label-free biosensors, such as field-effect transistor-based biosensors including carbon nanotube field-effect transistor biosensors, graphene field-effect transistor biosensors and silicon nanowire field-effect transistor biosensors, magnetoelastic biosensors, optical-based biosensors, surface stress-based biosensors and other types of biosensors based on nanotechnology are discussed. The sensing principles, configurations, sensing performance, applications, advantages and restrictions of different label-free based biosensors are considered and discussed in this review. Most concepts included in this survey could certainly be applied to the development of this kind of biosensor in the future.

  15. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
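Static (Guyan) condensation, the first of the dynamic condensation techniques compared in the book, can be sketched in a few lines: the stiffness matrix is partitioned into retained (master) and condensed (slave) degrees of freedom, and the slaves are eliminated. The 4-DOF stiffness matrix below is invented for illustration.

```python
import numpy as np

# Illustrative sketch of static (Guyan) condensation on a made-up
# 4-DOF chain stiffness matrix.
K = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]])
masters = [0, 3]   # retained DOFs
slaves  = [1, 2]   # condensed-out DOFs

Kmm = K[np.ix_(masters, masters)]
Kms = K[np.ix_(masters, slaves)]
Ksm = K[np.ix_(slaves, masters)]
Kss = K[np.ix_(slaves, slaves)]

# Reduced stiffness: K_red = Kmm - Kms Kss^{-1} Ksm
K_red = Kmm - Kms @ np.linalg.solve(Kss, Ksm)
print(K_red)
```

For static loads applied only at the master DOFs, the reduced model reproduces the master displacements of the full model exactly, which is why Guyan reduction serves as the baseline the exact, dynamic, SEREP and iterative schemes improve upon for dynamic problems.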

  16. Large areas elemental mapping by ion beam analysis techniques

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) XYZ computer-controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large-range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sorts of samples. Due to its particular characteristics, this is a unique device in the sense of multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion beam analysis with the advantage of automatic rastering.

  17. Genetic analysis of frontotemporal dementia and progressive supra nuclear palsy

    OpenAIRE

    Ferrari, R.

    2014-01-01

    Genome-wide association study (GWAS) is an effective method for mapping genetic variants underlying common and complex diseases. This thesis describes the investigation of the disorders, frontotemporal dementia (FTD) and progressive supranuclear palsy (PSP). FTD affects the frontal/temporal lobes and presents behavioural changes (bvFTD), cognitive decline or language dysfunction (primary progressive aphasia [PPA]), whilst PSP affects predominantly the brain stem resulting in loss of balance, ...

  18. A Framework for Performing Multiscale Stochastic Progressive Failure Analysis of Composite Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2007-01-01

    A framework is presented that enables coupled multiscale analysis of composite structures. The recently developed, free, Finite Element Analysis-Micromechanics Analysis Code (FEAMAC) software couples the Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with ABAQUS to perform micromechanics based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. As a result, the stochastic nature of fiber breakage in composites can be simulated through incorporation of an appropriate damage and failure model that operates within MAC/GMC on the level of the fiber. Results are presented for the progressive failure analysis of a titanium matrix composite tensile specimen that illustrate the power and utility of the framework and address the techniques needed to model the statistical nature of the problem properly. In particular, it is shown that incorporating fiber strength randomness on multiple scales improves the quality of the simulation by enabling failure at locations other than those associated with structural level stress risers.
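The statistical ingredient described above, fiber strength randomness that lets failure initiate away from structural stress risers, is conventionally modeled with a Weibull strength distribution. The sketch below samples fiber strengths this way; the Weibull modulus, scale strength, and applied stress are invented values, not FEAMAC inputs.

```python
import numpy as np

# Illustrative sketch: sample per-fiber strengths from a Weibull
# distribution and count failures under a uniform applied stress.
rng = np.random.default_rng(0)
m = 10.0            # Weibull modulus (controls scatter), assumed
sigma_0 = 3500.0    # scale strength in MPa, assumed

n_fibers = 10_000
strengths = sigma_0 * rng.weibull(m, size=n_fibers)

applied = 2500.0    # MPa, uniform fiber stress for this toy case
broken = np.mean(strengths < applied)
print(f"fraction of fibers failed at {applied} MPa: {broken:.3f}")
```

In a multiscale code the analogous draw happens per fiber at each integration point, so two nominally identical regions of the mesh can fail at different load levels, reproducing the stochastic failure locations noted in the abstract.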

  19. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction ability depending on their underlying assumptions about the correlation structures in the data. The techniques... generally focus on two things: obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group... variables. The two groups of methods are compared and the pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to choice of method, and the choice of method should thus be driven by the nature

  20. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  1. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    Science.gov (United States)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from -125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to -170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
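The property combination mentioned above is a simple product: thermal conductivity λ equals diffusivity times density times specific heat, λ = α·ρ·c_p. The sketch below uses typical room-temperature PTFE figures, not the paper's measured data.

```python
# Thermal conductivity from laser-flash diffusivity, density and
# specific heat: lambda = alpha * rho * c_p.
# The values below are typical literature figures for PTFE near 25 °C.
alpha = 0.12e-6   # thermal diffusivity, m^2/s (assumed)
rho = 2170.0      # density, kg/m^3 (assumed)
c_p = 1000.0      # specific heat, J/(kg K) (assumed)

conductivity = alpha * rho * c_p   # W/(m K)
print(f"{conductivity:.3f} W/(m K)")   # ~0.26 W/(m K), in the range reported for PTFE
```

Because all three inputs were measured over the same temperature range, the same product can be evaluated point by point to yield λ(T) across the full -125 °C to 150 °C interval.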

  2. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Full Text Available Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  3. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovery, monitoring, and improving processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event logs generation in the Hospital Information System or HIS, developed at University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution intends to achieve the generation of event logs in the system with high quality. The performed analyses allowed for redefining functions in the system and proposed proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze the processes execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.

  4. Progress report on reversal and substitute element technique for thread calibration on CMMs

    DEFF Research Database (Denmark)

    Carmignato, Simone; Larsen, Erik; Sobiecki, Rene

    This report is made as a part of the project EASYTRAC, an EU project under the programme Competitive and Sustainable Growth: Contract No. G6RD-CT-2000-00188, coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low uncertainty calibrations on coordinate measuring machines. For thi...) - Germany and Tampere University of Technology (TUT) - Finland. The present report describes feasibility and preliminary results of a reversal and substitute element technique application for thread calibration.

  5. Technique of Hadamard transform microscope fluorescence image analysis

    Institute of Scientific and Technical Information of China (English)

    梅二文; 顾文芳; 曾晓斌; 陈观铨; 曾云鹗

    1995-01-01

    Hadamard transform spatial multiplexed imaging technique is combined with fluorescence microscope and an instrument of Hadamard transform microscope fluorescence image analysis is developed. Images acquired by this instrument can provide a lot of useful information simultaneously, including three-dimensional Hadamard transform microscope cell fluorescence image, the fluorescence intensity and fluorescence distribution of a cell, the background signal intensity and the signal/noise ratio, etc.
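The multiplexing idea behind Hadamard transform imaging can be shown in a few lines: instead of measuring pixels one at a time, one measures weighted sums given by the rows of a Hadamard matrix and recovers the image by the inverse transform. The sketch below is a generic illustration of that principle, not the cited instrument's processing chain.

```python
import numpy as np

# Minimal sketch of Hadamard multiplexed measurement and recovery.
def hadamard(n):
    """Sylvester construction; n must be a power of two."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

n = 8
H = hadamard(n)
signal = np.arange(n, dtype=float)      # stand-in for a row of pixel intensities

measurements = H @ signal               # multiplexed measurements (sums of pixels)
recovered = (H.T @ measurements) / n    # H^T H = n I, so this inverts exactly
print(recovered)
```

Each measurement collects light from roughly half the pixels at once, which is the source of the signal-to-noise advantage (the Fellgett advantage) that makes the multiplexed scheme attractive for weak fluorescence signals.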

  6. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis

    OpenAIRE

    Heidmann, Iris; Schade-Kampmann, Grit; Lambalk, Joep; Ottiger, Marcel; Di Berardino, Marco

    2016-01-01

    Introduction An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a comm...

  7. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.
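A linear program of the kind explored above can be stated in a few lines. The sketch below allocates limited relay-antenna contact time between two hypothetical users to maximize data return; the rates and bounds are invented for illustration, not TDRSS parameters.

```python
from scipy.optimize import linprog

# Toy scheduling LP: maximise 300*x1 + 150*x2 (data return in Mbit)
# subject to a shared antenna-time budget and per-user contact limits.
# linprog minimises, so the objective is negated.
c = [-300.0, -150.0]                 # Mbit per minute of contact, per user
A_ub = [[1.0, 1.0]]                  # x1 + x2 <= total antenna minutes
b_ub = [60.0]
bounds = [(0, 40), (0, 40)]          # per-user contact limits, minutes

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(res.x, -res.fun)               # optimal minutes per user, total Mbit
```

The solver saturates the higher-rate user first (40 minutes) and gives the remainder (20 minutes) to the other, which is the behaviour one expects from the theoretical model and can then be compared against actual scheduling data, as the study describes.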

  8. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

    Full Text Available Calcium hardness of water samples has been determined using a method based upon the Energy Dispersive X-ray Fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.

  9. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  10. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  11. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
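The frequency-domain step at the heart of the transfer function technique can be sketched briefly: the rear-side measurement is the front-side flux convolved with the system response, so dividing the measured spectrum by the transfer function recovers the flux. The impulse response below is an invented exponential, not the calorimeter's actual response.

```python
import numpy as np

# Toy sketch of transfer-function deconvolution via the FFT.
# rear(t) = (flux * h)(t)  ->  FLUX(f) = REAR(f) / H(f)
n = 256
t = np.arange(n)
h = np.exp(-t / 10.0)                  # assumed impulse response (made up)
flux = np.zeros(n)
flux[20:40] = 1.0                      # "true" front-side flux pulse

# Simulated rear-side measurement: circular convolution of flux with h
rear = np.real(np.fft.ifft(np.fft.fft(flux) * np.fft.fft(h)))

H = np.fft.fft(h)
eps = 1e-12                            # guard against division by near-zero bins
flux_rec = np.real(np.fft.ifft(np.fft.fft(rear) / (H + eps)))
print(np.max(np.abs(flux_rec - flux)))
```

With noisy real data the division amplifies high-frequency noise wherever |H(f)| is small, which is exactly the signal-noise and digital-processing issue the paper discusses; in practice the quotient is regularized or low-pass filtered rather than divided directly.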

  12. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    Science.gov (United States)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  13. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Lubna Moin

    2009-04-01

    Full Text Available This research paper explores and compares different modeling and analysis techniques, and then examines the model order reduction approach and its significance. Traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. They are also used for analyzing linear and nonlinear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a unique technique for generating a genetic design from the tree-structured transfer function obtained from a Bond Graph. This research work combines bond graphs for model representation with genetic programming for exploring different ideas in the design space; the tree-structured transfer function results from replacing each typical bond graph element with its impedance equivalent, specifying impedance laws for Bond Graph multiports. The tree-structured form thus obtained from the Bond Graph is applied for generating the genetic tree. Application studies will identify key issues and their importance for advancing this approach towards becoming an effective and efficient design tool for synthesizing designs for electrical systems. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function with the conventional and Bond Graph methods are analyzed, and then an approach towards model order reduction is examined. The suggested algorithm and other known modern model order reduction techniques are applied to an 11th order high pass filter [1], with different approaches. The model order reduction technique developed in this paper has the least reduction errors and, secondly, the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and Bond Graph methods are compared and

  14. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    Directory of Open Access Journals (Sweden)

    Alexander Hexemer

    2015-01-01

    Full Text Available The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS, new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  15. Advanced grazing-incidence techniques for modern soft-matter materials analysis.

    Science.gov (United States)

    Hexemer, Alexander; Müller-Buschbaum, Peter

    2015-01-01

    The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  16. Progress in bionic information processing techniques for an electronic nose based on olfactory models

    Institute of Scientific and Technical Information of China (English)

    LI Guang; FU Jun; ZHANG Jia; ZHENG JunBao

    2009-01-01

    As a novel bionic analytical technique, an electronic nose, inspired by the mechanism of the biological olfactory system and integrated with modern sensing technology, electronic technology and pattern recognition technology, has been widely used in many areas. Moreover, recent basic research findings in biological olfaction combined with computational neuroscience promote its development both in methodology and application. In this review, the basic information processing principles of biological olfaction and artificial olfaction are summarized and compared, and four olfactory models and their applications to electronic noses are presented. Finally, a chaotic olfactory neural network is detailed, and the utilization of several biologically oriented learning rules and its spatiotemporal dynamic properties for electronic noses is discussed. The integration of various phenomena and their mechanisms for biological olfaction into an electronic nose context for information processing will not only make electronic noses more bionic, but also let them perform better than conventional methods. However, many problems still remain, which should be solved by further cooperation between theorists and engineers.

  17. [Progress and prospects of research on information processing techniques for intelligent diagnosis of traditional Chinese medicine].

    Science.gov (United States)

    Zhou, Chang-Le; Zhang, Zhi-Feng

    2006-11-01

    Information processing for intelligent diagnosis of traditional Chinese medicine (TCM), an important part of the modernization of Chinese medicine, attracts worldwide attention from the scientific community. This article presents a systematic introduction to the development of information technology, especially the processing of pulse and tongue images and systems of computer-aided Chinese medical diagnosis. Furthermore, it points out four essential areas of future research, including the epistemic logic system of syndrome differentiation, system construction technology, data mining technology, and information acquisition and analysis in TCM diagnosis.

  18. Analysis of Social Work Theory Progression Published in 2004

    Directory of Open Access Journals (Sweden)

    Valerie D. Decker

    2007-05-01

    Full Text Available The authors reviewed 67 articles that discussed and/or tested human behavior theories from social work journals published in 2004 in order to assess the level and quality of theory progression. The articles were further sorted into the Council on Social Work Education (CSWE) Educational Policy and Accreditation Standards (EPAS) Foundation Curriculum content areas of HBSE, practice, policy, field education, values & ethics, diversity, populations-at-risk/social and economic justice, and research for purposes of categorization. Results indicated that HBSE and practice were by far the largest groups of articles reviewed. It was also found that social work has a limited amount of theory discussion in the content areas of field education, values and ethics, diversity, and populations-at-risk/social and economic justice. Thirty-three articles were found to demonstrate theory progression, eight articles presented new/emerging theories, and 26 articles discussed or critiqued theories without presenting evidence of theory progression.

  19. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature of cyberspace and the abuse of anonymity within it, it is difficult to trace criminal identities in cybercrime investigations. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG seeded GA based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed feature and technique are evaluated on a real world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, IGAE has been shown to be more scalable in terms of the number of authors than author group level based methods.
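    The record above includes no code; as a minimal, hypothetical sketch of the core idea (character n-gram "writeprints" compared across candidate authors, here with plain cosine similarity rather than the paper's IGAE ensemble), one might write:

```python
from collections import Counter
import math

def char_ngrams(text, n_min=2, n_max=3):
    """Count character n-grams of lengths n_min..n_max (a crude writeprint)."""
    counts = Counter()
    for n in range(n_min, n_max + 1):
        for i in range(len(text) - n + 1):
            counts[text[i:i + n]] += 1
    return counts

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented toy corpora; real writeprints need far more text per author.
known = {"author_A": "the quick brown fox jumps over the lazy dog",
         "author_B": "colorless green ideas sleep furiously tonight"}
unknown = "the quick red fox jumps over a sleepy dog"

profiles = {a: char_ngrams(t) for a, t in known.items()}
query = char_ngrams(unknown)
best = max(profiles, key=lambda a: cosine(profiles[a], query))
print(best)  # the profile closest to the unknown text in n-gram space
```

    The actual study selects n-gram features by information gain and a genetic algorithm before classification; the sketch only shows why overlapping character n-grams capture style.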

  20. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e. the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.
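    A minimal sketch of the uncertain-evidence idea above (the posterior of a hypothesis node when an evidence node's state is only known as a probability distribution) can be written with Jeffrey's rule on a toy two-node network; the network structure, states and numbers below are invented for illustration and are not from the paper:

```python
# Hypothetical two-node network: hypothesis H -> evidence E (binary states).
P_H = {"true": 0.3, "false": 0.7}                 # prior on hypothesis node
P_E_given_H = {"true": {"pos": 0.9, "neg": 0.1},  # CPT of evidence node
               "false": {"pos": 0.2, "neg": 0.8}}

def posterior_H_given_e(e):
    """Exact posterior P(H | E=e) by Bayes' rule."""
    joint = {h: P_H[h] * P_E_given_H[h][e] for h in P_H}
    z = sum(joint.values())
    return {h: v / z for h, v in joint.items()}

def jeffrey_update(q_E):
    """Jeffrey's rule: mix the hard-evidence posteriors by the user-specified
    distribution q_E over the evidence node's states (uncertain evidence)."""
    out = {h: 0.0 for h in P_H}
    for e, qe in q_E.items():
        post = posterior_H_given_e(e)
        for h in P_H:
            out[h] += qe * post[h]
    return out

# Evidence node believed to be 'pos' with probability 0.8, 'neg' with 0.2
print(jeffrey_update({"pos": 0.8, "neg": 0.2}))
```

    The paper's two techniques operate inside a commercial package via node augmentation; the sketch only shows the underlying probability calculation for a single uncertain evidence node.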

  1. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail, and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As part of this study, LEA results were verified as follows: (A) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the data computed by the LEA processor; (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
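    The RSS combination described above can be sketched in a few lines; the deviation values below are invented placeholders, not data from the SVDS study:

```python
import numpy as np

# Hypothetical single-error-source 3-sigma deviations at MECO for three
# trajectory states (e.g. altitude [m], velocity [m/s], flight-path angle
# [deg]); each row is the deviation produced by one error source.
deviations = np.array([
    [120.0, 4.0, 0.020],   # source 1: e.g. thrust dispersion
    [ 80.0, 2.5, 0.015],   # source 2: e.g. Isp dispersion
    [ 50.0, 1.0, 0.005],   # source 3: e.g. winds
])

# Root-sum-square total dispersion per state (sources assumed independent)
rss = np.sqrt((deviations ** 2).sum(axis=0))

# Covariance matrix of the combined deviations: sum of per-source outer products
cov = sum(np.outer(d, d) for d in deviations)

print(rss)                      # combined 3-sigma dispersions
print(np.sqrt(np.diag(cov)))    # diagonal of cov reproduces the RSS values
```

    The square roots of the covariance diagonal equal the RSS values by construction, which is exactly the hand-check (A) described in the abstract.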

  2. Arc-length technique for nonlinear finite element analysis

    Institute of Scientific and Technical Information of China (English)

    MEMON Bashir-Ahmed; SU Xiao-zu(苏小卒)

    2004-01-01

    Nonlinear solution of reinforced concrete structures, particularly the complete load-deflection response, requires tracing of the equilibrium path and proper treatment of the limit and bifurcation points. In this regard, ordinary solution techniques become unstable near the limit points and also have problems in cases of snap-through and snap-back, and thus fail to predict the complete load-displacement response. The arc-length method serves the purpose well in principle, has received wide acceptance in finite element analysis, and has been used extensively. However, modifications to the basic idea are vital to meet the particular needs of the analysis. This paper reviews some of the developments of the method over the last two decades, with particular emphasis on nonlinear finite element analysis of reinforced concrete structures.
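    A minimal single-degree-of-freedom sketch of the arc-length idea (the equilibrium residual λp − f_int(u) = 0 traced under the constraint ‖x − x0‖ = Δl, with a tangent predictor and a Newton corrector) is shown below; the cubic force law is an invented example with two limit points, not a reinforced concrete model:

```python
import numpy as np

def fint(u):   # invented internal force with snap-through (two limit points)
    return u**3 - 3.0*u**2 + 2.5*u

def kt(u):     # tangent stiffness d f_int / d u
    return 3.0*u**2 - 6.0*u + 2.5

p, dl = 1.0, 0.2                       # reference load, arc-length increment
pts = [np.array([0.0, 0.0])]           # equilibrium points (u, lambda)
tangent = np.array([1.0, kt(0.0)])     # initial march direction
tangent /= np.linalg.norm(tangent)

for _ in range(40):
    x0 = pts[-1]
    x = x0 + dl * tangent              # predictor along previous direction
    for _ in range(30):                # Newton corrector on (residual, arc)
        R = np.array([x[1] * p - fint(x[0]),
                      (x - x0) @ (x - x0) - dl**2])
        if np.linalg.norm(R) < 1e-10:
            break
        J = np.array([[-kt(x[0]), p],
                      [2.0 * (x[0] - x0[0]), 2.0 * (x[1] - x0[1])]])
        x = x + np.linalg.solve(J, -R)
    pts.append(x)
    tangent = (x - x0) / np.linalg.norm(x - x0)

lams = [pt[1] for pt in pts]
# lambda rises, falls past the first limit point, then rises again:
# the full snap-through branch is traced, which pure load control cannot do.
```

    Because load factor and displacement are solved for together, the tangent stiffness passing through zero at a limit point does not break the iteration, which is the essential advantage the review discusses.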

  3. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    One of the difficulties that goal-oriented requirements analysis encounters is that the efficiency of goal refinement depends on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systematic analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integrating goal and problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  4. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  5. Application of thermal analysis techniques in activated carbon production

    Science.gov (United States)

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  6. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on the web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and taking an action based on the ‘category’ of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper presents experimental work on web log data. We have taken the web log data from the “NASA” web server, which is analyzed with “Web Log Explorer”. Web Log Explorer is a web usage mining tool which plays a vital role in carrying out this work.
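    The three-part pipeline above (preprocessing, pattern discovery, pattern analysis) can be illustrated on Common Log Format lines such as those in the NASA web-server logs; the sample lines below are illustrative, and this sketch is of course not the Web Log Explorer tool itself:

```python
import re
from collections import Counter

# Common Log Format: host ident user [timestamp] "method url proto" status size
LOG_RE = re.compile(r'(\S+) \S+ \S+ \[([^\]]+)\] "(\S+) (\S+) [^"]*" (\d{3}) (\S+)')

lines = [
    '199.72.81.55 - - [01/Jul/1995:00:00:01 -0400] "GET /history/apollo/ HTTP/1.0" 200 6245',
    '199.72.81.55 - - [01/Jul/1995:00:00:09 -0400] "GET /shuttle/missions/ HTTP/1.0" 200 4085',
    'unicomp6.unicomp.net - - [01/Jul/1995:00:00:14 -0400] "GET /history/apollo/ HTTP/1.0" 200 6245',
]

hits = Counter()
sessions = Counter()
for line in lines:
    m = LOG_RE.match(line)
    if not m:                   # preprocessing: drop malformed entries
        continue
    host, ts, method, url, status, size = m.groups()
    if status == "200":
        hits[url] += 1          # pattern discovery: page popularity
        sessions[host] += 1     # crude per-host activity count

print(hits.most_common(1))      # pattern analysis: rank the discovered patterns
```

    Real preprocessing also filters images/robots and reconstructs user sessions from timestamps; the sketch keeps only the minimal parse-count-rank skeleton.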

  7. Progress in CTEQ/TEA global QCD analysis

    CERN Document Server

    Nadolsky, P M; Lai, H -L; Pumplin, J; Yuan, C -P

    2009-01-01

    We overview progress in the development of general-purpose CTEQ PDFs. The preprint is based on four talks presented by H.-L. Lai and P. Nadolsky at the 17th International Workshop on Deep Inelastic Scattering and Related Subjects (DIS 2009).

  8. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  9. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    Science.gov (United States)

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, the rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches like Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more generalized rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA) for clustering categorical data using rough set indiscernibility relations is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effects of the number of clusters on rough accuracy, purity and entropy are described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344
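    The indiscernibility relation underlying MIA and the other rough set approaches can be sketched directly; the toy information system below is invented for illustration:

```python
from collections import defaultdict

# Toy categorical information system: objects x1..x6 with two attributes.
data = {
    "x1": {"colour": "red",   "shape": "round"},
    "x2": {"colour": "red",   "shape": "round"},
    "x3": {"colour": "blue",  "shape": "round"},
    "x4": {"colour": "blue",  "shape": "square"},
    "x5": {"colour": "green", "shape": "square"},
    "x6": {"colour": "green", "shape": "square"},
}

def indiscernibility(objects, attrs):
    """Partition objects into equivalence classes of the indiscernibility
    relation IND(attrs): x ~ y iff they agree on every attribute in attrs."""
    classes = defaultdict(set)
    for obj, row in objects.items():
        key = tuple(row[a] for a in attrs)
        classes[key].add(obj)
    return sorted(classes.values(), key=lambda s: sorted(s))

print(indiscernibility(data, ["colour"]))
# three classes: {x1,x2}, {x3,x4}, {x5,x6}
print(indiscernibility(data, ["colour", "shape"]))
# finer partition: {x1,x2}, {x3}, {x4}, {x5,x6}
```

    Rough set clustering methods differ mainly in how they score candidate attributes over such partitions (dependency, significance, or, for MIA, indiscernibility combined with the number of clusters).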

  10. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    Science.gov (United States)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scale information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'direct transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, on how high in frequency an accurate estimate can be made for a given mean sampling rate. These high-frequency estimates are important in obtaining the microscale information of the turbulence structure. Previous studies found that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e., up to the Nyquist frequency); otherwise, aliasing problems would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high-frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of the low-frequency estimates, since the pre-filtering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson-sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable
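    A minimal sketch of the slotting technique mentioned above (sample-pair products binned by their time lag to estimate the autocorrelation of irregularly sampled data) might look like this; the uniform random sampling and the 1 Hz test signal are invented stand-ins for LV burst data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Irregularly sampled sine wave, mimicking randomly arriving LV realizations
t = np.sort(rng.uniform(0.0, 20.0, 2000))   # irregular sample times [s]
u = np.sin(2 * np.pi * 1.0 * t)             # 1 Hz velocity fluctuation

def slotted_autocorr(t, u, dt, n_slots):
    """'Slotting' estimate of the autocorrelation: every sample pair (i, j)
    is dropped into the lag bin nearest to t[j] - t[i], then each bin of
    lagged products is averaged."""
    num = np.zeros(n_slots)
    cnt = np.zeros(n_slots)
    for i in range(len(t)):
        lags = t[i:] - t[i]
        k = np.rint(lags / dt).astype(int)
        sel = k < n_slots
        np.add.at(num, k[sel], u[i] * u[i:][sel])
        np.add.at(cnt, k[sel], 1)
    return num / np.maximum(cnt, 1)

R = slotted_autocorr(t, u, dt=0.05, n_slots=40)
R = R / R[0]     # normalize so that R(0) = 1
# R oscillates at the 1 Hz signal period (minimum near the 0.5 s lag bin);
# a Fourier transform of R would then give the spectral estimate.
```

    The slot width dt trades bias (lag smearing) against variance (pairs per bin), which is exactly the practical limitation on high-frequency estimates discussed in the abstract.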

  11. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.

  12. The Analysis of the Thematic Progression Patterns in "The Great Learning"

    Institute of Scientific and Technical Information of China (English)

    王利娜; 金俊淑

    2007-01-01

    This paper briefly introduces the thematic progression patterns in Systemic-Functional Grammar, then analyzes their application in "The Great Learning", one of the classics of Confucius and his disciples. The analysis of the thematic progression patterns of "The Great Learning" is meaningful for both understanding and appreciating the text.

  13. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the result of experimental investigation using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also want to introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive that can be applied in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian Textile (Genghis Khan and Kublai Khan Period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creating a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between 12th and 13th century. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  14. Dynamic Range Analysis of the Phase Generated Carrier Demodulation Technique

    Directory of Open Access Journals (Sweden)

    M. J. Plotnikov

    2014-01-01

    Full Text Available The dependence of the dynamic range of the phase generated carrier (PGC) technique on the passbands of its low-pass filters is investigated using a simulation model. A nonlinear character of this dependence, which could lead to dynamic range limitations or measurement uncertainty, is presented for the first time. A detailed theoretical analysis is provided to verify the simulation results, and the results are consistent with the performed calculations. A method is proposed for calculating the low-pass filter passbands needed to reach a required upper limit of the dynamic range.

  15. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light of a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide until the point where the amplitude of surface waves is large enough to scatter out the light from the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and also statistical analysis of the jet disruption is possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  16. Quality assurance and quantitative error analysis by tracer techniques

    Energy Technology Data Exchange (ETDEWEB)

    Schuetze, N.; Hermann, U.

    1983-12-01

    The locations, types and sources of casting defects have been tested by tracer techniques. Certain sites of moulds were labelled using 199Au, 24Na sodium carbonate solution, and technetium solution produced in a technetium generator on a 99Mo/99Tc elution column. Evaluations were made by means of activity measurements and autoradiography. The locations and causes of casting defects can be determined by error analysis. The surface defects of castings resulting from the moulding materials and from the blacking can be detected by technetium, while the subsurface defects are located by gold.

  17. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures, sponsored by the Danish Technical Research Council. The planned...... contents and the requirements for the project prior to its start are described together with the results obtained during the 3 year period of the project. The project was mainly carried out as a Ph.D project by the first author from September 1994 to August 1997 in cooperation with associate professor Rune
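    A minimal sketch of the random decrement idea (averaging many response segments that start at a common trigger condition, so that the random excitation cancels and a free-decay-like signature remains) is given below; the simulated oscillator and trigger level are invented for illustration and are not from the project:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ambient response of a lightly damped SDOF oscillator excited by
# white noise: x'' + 2*zeta*w0*x' + w0^2*x = noise (semi-implicit Euler).
fs, f0, zeta = 100.0, 2.0, 0.02
dt, w0, n = 1.0 / fs, 2 * np.pi * 2.0, 60000
x, v = 0.0, 0.0
y = np.empty(n)
for i in range(n):
    a = -2 * zeta * w0 * v - w0**2 * x + rng.standard_normal()
    v += a * dt
    x += v * dt
    y[i] = x

def random_decrement(y, level, m):
    """Average m-sample segments starting wherever y up-crosses 'level';
    the mean (the RD signature) approximates the system's free decay."""
    idx = np.where((y[:-1] < level) & (y[1:] >= level))[0] + 1
    idx = idx[idx + m <= len(y)]
    segs = np.stack([y[i:i + m] for i in idx])
    return segs.mean(axis=0)

sig = random_decrement(y, level=y.std(), m=400)
# 'sig' starts near the trigger level and decays like the impulse response;
# natural frequency and damping can then be fitted to it.
```

    In practice the trigger condition and segment length are tuned to the structure's lowest modes, and the fitted signature feeds a conventional modal identification.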

  18. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  19. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    Science.gov (United States)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  20. Golden glazes analysis by PIGE and PIXE techniques

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, M., E-mail: mmfonseca@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Luis, H., E-mail: heliofluis@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Franco, N., E-mail: nfranco@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Reis, M.A., E-mail: mareis@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Chaves, P.C., E-mail: cchaves@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Taborda, A., E-mail: galaviz@cii.fc.ul.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Cruz, J., E-mail: jdc@fct.unl.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Galaviz, D., E-mail: ataborda@itn.pt [Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Dept. Fisica, Faculdade de Ciencias, Universidade de Lisboa, Lisboa (Portugal); and others

    2011-12-15

    We present the analysis performed on the chemical composition of two golden glazes available on the market using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis on thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  1. Analysis of Social Work Theory Progression Published in 2004

    OpenAIRE

    2007-01-01

    The authors reviewed 67 articles that discussed and/or tested human behavior theories from social work journals published in 2004 in order to assess the level and quality of theory progression. The articles were further sorted into Council on Social Work Education (CSWE) Educational Policy and Accreditation Standards (EPAS) Foundation Curriculum content areas of HBSE, practice, policy, field education, values & ethics, diversity, populations-at-risk/social and economic justice, and rese...

  2. Human Capital Investment and an Analysis of Its Progressive Profit

    Institute of Scientific and Technical Information of China (English)

    张德平; 孙诚

    2004-01-01

    A skilled labor force, cultivated by investing funds and time in education, is undoubtedly essential for operating sophisticated machines in production, but also for creating new ideas and methods in production and other economic activities, and ultimately for promoting the progressive increase of material capital. Thus strengthening the investment in human capital and enriching the stock of human capital is of primary importance, especially for China, in the 21st century.

  3. A Comparative Analysis of Enlisted Career Progression Systems.

    Science.gov (United States)

    1980-06-01

    [OCR fragments from the scanned report:] "Promotional Requirements to Airframe Foreman ... 119"; Fig. 1-1, "Pyramid of Proposed Research -- USAF Enlisted Career Progression System"; "3. An examination of alternate ..."; McIntire focuses on two of the experts in the field of job motivation and enrichment, Dr. Abraham Maslow and Frederick Herzberg.

  4. BaTMAn: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
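    A one-dimensional toy version of the merging criterion described above (neighbouring elements combined as long as their signals are statistically consistent within the errors) can be sketched as follows; this greedy scalar sketch, with an invented k-sigma compatibility test and inverse-variance weighting, is not the BaTMAn algorithm itself:

```python
import numpy as np

def merge_compatible(values, errors, k=3.0):
    """Greedy 1-D sketch of error-aware segmentation: merge neighbouring
    elements while their signals agree within k sigma, propagating errors
    with inverse-variance weighting."""
    segs = [[v, e, 1] for v, e in zip(values, errors)]  # [mean, err, n]
    merged = True
    while merged:
        merged = False
        out = [segs[0]]
        for m, e, n in segs[1:]:
            m0, e0, n0 = out[-1]
            if abs(m - m0) < k * np.hypot(e, e0):       # statistically identical
                w0, w = 1 / e0**2, 1 / e**2
                out[-1] = [(w0 * m0 + w * m) / (w0 + w),  # weighted mean
                           1 / np.sqrt(w0 + w), n0 + n]
                merged = True
            else:
                out.append([m, e, n])
        segs = out
    return segs

vals = np.array([1.0, 1.1, 0.9, 5.0, 5.2, 4.9])
errs = np.full(6, 0.2)
print([round(s[0], 2) for s in merge_compatible(vals, errs)])  # [1.0, 5.03]
```

    The noisy plateau around 1 and the one around 5 each collapse into a single segment, while the statistically significant jump between them is preserved, mirroring how the segmentation adapts to the underlying structure rather than to the noise.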

  5. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the instantaneous granite rockburst process. Based on the PIV (Particle Image Velocimetry) technique, a quantitative analysis of a rockburst is possible: images of the tracer particles and the displacement and strain fields can be obtained, and the debris trajectory described. According to observations from on-site tests, a dynamic rockburst is actually a gas–solid high-speed flow process caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high-speed video and PIV images, the granite rockburst failure process is decomposed into six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.

  6. Stalked protozoa identification by image analysis and multivariable statistical techniques.

    Science.gov (United States)

    Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2008-06-01

    Protozoa are considered good indicators of treatment quality in activated sludge systems, as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups with several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determining geometrical, morphological and signature data with subsequent processing by discriminant analysis and neural network techniques. Geometrical descriptors were found to provide the best identification ability, and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence in establishing their presence in wastewater treatment plants.

  7. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, both at the individual and the organizational level, is always accompanied by a search for others' opinions. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve as a boon to markets if its semantic orientations are deliberated. Opinion mining and sentiment analysis are the formalization of studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for the use of the huge volume of opinionated data recorded. This paper is an attempt to review and evaluate the various techniques used for opinion mining and sentiment analysis.
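
    The simplest of the techniques such surveys cover is lexicon-based scoring. The sketch below is a minimal illustration only, not a method from the paper; the lexicon, the one-word negation rule and the example reviews are all invented:

```python
# Minimal lexicon-based sentiment scorer (illustrative lexicon and rules).
POSITIVE = {"good", "great", "excellent", "love", "fast"}
NEGATIVE = {"bad", "poor", "terrible", "hate", "slow"}
NEGATORS = {"not", "never", "no"}

def sentiment(text):
    """Sum word polarities, flipping polarity after a negator."""
    score = 0
    words = [w.strip(".,!?").lower() for w in text.split()]
    for i, w in enumerate(words):
        polarity = (w in POSITIVE) - (w in NEGATIVE)
        if i > 0 and words[i - 1] in NEGATORS:
            polarity = -polarity        # simple one-word negation handling
        score += polarity
    return score

reviews = ["The battery life is great and the screen is excellent.",
           "Not good at all, the shipping was terrible."]
scores = [sentiment(r) for r in reviews]   # [2, -2]
```

    Real systems extend this with phrase-level features, machine-learned classifiers and domain-adapted lexicons, which is the landscape such a review covers.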

  8. Recovering prehistoric woodworking skills using spatial analysis techniques

    Science.gov (United States)

    Kovács, K.; Hanke, K.

    2015-08-01

    Recovering ancient woodworking skills can be achieved through the simultaneous documentation and analysis of tangible evidence such as the geometric parameters of prehistoric hand tools or the fine morphological characteristics of well-preserved wooden archaeological finds. During this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed within this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from the spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these tools was also inferred, and these woodworking skills could be quantified in the case of a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for related intangible heritage interpretations in the future.

  9. Pressure transient analysis for long homogeneous reservoirs using TDS technique

    Energy Technology Data Exchange (ETDEWEB)

    Escobar, Freddy Humberto [Universidad Surcolombiana, Av. Pastrana - Cra. 1, Neiva, Huila (Colombia); Hernandez, Yuly Andrea [Hocol S.A., Cra. 7 No 114-43, Floor 16, Bogota (Colombia); Hernandez, Claudia Marcela [Weatherford, Cra. 7 No 81-90, Neiva, Huila (Colombia)

    2007-08-15

    A significant number of well pressure tests are conducted in long, narrow reservoirs with closed and open extreme boundaries. It is desirable not only to identify these types of systems appropriately but also to develop an adequate and practical interpretation technique to determine their parameters and, when possible, their size. An accurate understanding of how the reservoir produces and of the magnitude of the producible reserves can lead to competent decisions and adequate reservoir management. So far, studies on the identification and determination of parameters for such systems rely on conventional techniques (semilog analysis) and on semilog and log-log type-curve matching of pressure versus time. Type-curve matching is basically a trial-and-error procedure which may provide inaccurate results; besides, the limited number of available type curves plays a negative role. In this paper, a detailed analysis of the pressure derivative behavior of a vertical well in linear reservoirs with open and closed extreme boundaries is presented for the case of constant-rate production. Each flow regime was studied independently, especially the linear flow regime, since it is the most characteristic 'fingerprint' of these systems. We found that when the well is located at one of the extremes of the reservoir, a single linear flow regime develops once radial flow and/or wellbore storage effects have ended. When the well is located at a given distance from both extreme boundaries, the pressure derivative permits the identification of two linear flows toward the well, which has been called the 'dual-linear flow regime'. This is characterized by an increase of the intercept of the 1/2-slope line from π^0.5 to π, with a consequent transition between these two straight lines. The identification of intersection points, lines, and characteristic slopes allows us to develop an interpretation technique without employing type-curve matching. This technique uses
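
    The 'fingerprint' idea can be illustrated numerically. The sketch below (constants are illustrative, not taken from the paper) computes the log derivative t·dp/dt of a synthetic linear-flow response Δp ∝ √t and recovers the characteristic half slope on log-log axes:

```python
import math

def log_derivative(t, p):
    """Central-difference t*dp/dt (= dp/d ln t) for log-spaced times."""
    return [(p[i + 1] - p[i - 1]) / (math.log(t[i + 1]) - math.log(t[i - 1]))
            for i in range(1, len(t) - 1)]

t = [10 ** (k / 10) for k in range(-10, 31)]   # 0.1 to 1000 hours
p = [25.0 * math.sqrt(ti) for ti in t]         # linear-flow pressure drop

d = log_derivative(t, p)
# slope of the derivative curve on log-log axes (0.5 for linear flow)
slope = (math.log(d[-1]) - math.log(d[0])) / (math.log(t[-2]) - math.log(t[1]))
```

    A dual-linear regime would show two such half-slope segments with intercepts in the ratio π to π^0.5, which is exactly the feature the interpretation technique exploits.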

  10. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have in the past relied on mathematical analysis that was capable of phenomenological prediction but was not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained starting from the established equations of transport phenomena. Microscopic- and molecular-level modeling techniques are then described using porous media theory and chemical kinetic theory, and applied to cerebrospinal fluid (CSF) dynamics. Using the techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.
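
    In its simplest steady one-dimensional form, the porous-media description invoked above reduces to Darcy's law, Q = (kA/μ)·ΔP/L. The sketch below only illustrates the bookkeeping; the parameter values are round illustrative numbers, not measured CSF properties:

```python
def darcy_flow(k, area, mu, dp, length):
    """Volumetric flow rate (m^3/s) through a porous slab, Darcy's law."""
    return k * area / mu * dp / length

k = 1e-14       # permeability, m^2 (illustrative)
area = 1e-4     # cross-sectional area, m^2 (illustrative)
mu = 7e-4       # dynamic viscosity, Pa*s (roughly water at body temperature)
dp = 500.0      # pressure difference, Pa (illustrative)
length = 1e-3   # flow path length, m (illustrative)

Q = darcy_flow(k, area, mu, dp, length)   # volumetric flow, m^3/s
```

    Real tissue-level models add storage terms, exchange between compartments and tortuosity corrections, which is where the quantifiable parameters mentioned above enter.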

  11. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    Science.gov (United States)

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded speckled intensity distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. From each analytical signal, a two-dimensional pseudophase map is then generated in which the vortices are located and characterized in terms of their topological charges and the structural properties of their cores. The second part of the procedure allows Young's interference fringes to be obtained by Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture of the set is a rectangular hole that coincides in both shape and size with a pixel of the recorded images. We show that the fringe analysis can be conducted as in speckle photography over an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.

  12. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique with different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as the Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. In the analysis of highly irregular time series we find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) with the Gaussian kernel method than with the linear interpolation scheme. For the cross-correlation function (CCF) the RMSE is lower by 60 %. The Lomb-Scargle technique gave results comparable to the kernel methods in the univariate case, but poorer results in the bivariate case. In particular the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performance of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross-correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to the other techniques, and is suitable for large-scale application to paleo-data.
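
    A minimal version of the Gaussian-kernel correlation estimator compared here can be sketched as follows. This is an illustration of the idea only: the bandwidth, persistence time and sampling scheme are invented, not the benchmark settings of the paper. Pairs of observations are weighted by a Gaussian kernel of the mismatch between their time separation and the requested lag:

```python
import math, random

def gaussian_kernel_acf(t, x, lag, bandwidth):
    """Kernel-weighted autocorrelation of an irregular series at one lag."""
    n = len(t)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x) / n
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            d = (t[j] - t[i]) - lag          # mismatch to the requested lag
            w = math.exp(-0.5 * (d / bandwidth) ** 2)
            num += w * (x[i] - mean) * (x[j] - mean)
            den += w
    return num / (den * var)

# toy data: an AR(1)-like process (persistence time 5) on irregular times
random.seed(1)
t = sorted(random.uniform(0, 100) for _ in range(200))
x = [0.0]
for k in range(1, len(t)):
    phi = math.exp(-(t[k] - t[k - 1]) / 5.0)
    x.append(phi * x[-1] + math.sqrt(1 - phi ** 2) * random.gauss(0, 1))

r0 = gaussian_kernel_acf(t, x, 0.0, 0.5)   # close to 1
r5 = gaussian_kernel_acf(t, x, 5.0, 0.5)   # decayed with lag
```

    Interpolating to a regular grid and applying the standard ACF, by contrast, inflates the apparent persistence, which is the bias quantified above.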

  13. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. It is intended to assist utility engineers, PV system designers, and project managers in establishing an objective and then, through a logical series of topics, to facilitate selection and design of a DAS that meets the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: a five-year assessment of EMTs; validation of the Kerman 500-kW grid-support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; and experience with power conditioning units and power quality.

  14. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real Integral-Field Spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with a low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of two. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BATMAN is not to be used as a `black box' to improve the signal-to-noise ratio, but as a new approach to characterizing spatially resolved data prior to their analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
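
    The merging criterion can be illustrated with a one-dimensional toy version (a sketch of the general idea only, not the BATMAN implementation; the data and the 2σ acceptance threshold are invented). Adjacent elements are absorbed into the current segment while they agree with its inverse-variance-weighted mean within the combined errors:

```python
def merge_consistent(values, errors, nsigma=2.0):
    """Greedy 1-D merging of elements consistent within their errors.

    Returns a list of (mean, variance_of_mean) segment tuples."""
    segments = []
    for v, e in zip(values, errors):
        if segments:
            m, var = segments[-1]
            if abs(v - m) < nsigma * (var + e * e) ** 0.5:
                # merge: inverse-variance weighted combination
                w1, w2 = 1.0 / var, 1.0 / (e * e)
                segments[-1] = ((w1 * m + w2 * v) / (w1 + w2),
                                1.0 / (w1 + w2))
                continue
        segments.append((v, e * e))       # start a new segment
    return segments

vals = [1.0, 1.1, 0.9, 5.0, 5.2, 4.9]   # two flat patches plus noise
errs = [0.2] * 6
segs = merge_consistent(vals, errs)      # two segments survive
```

    In two dimensions the same acceptance test decides whether neighbouring tessellation elements merge, which is how the segmentation adapts to the underlying spatial structure.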

  15. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator", developed particularly for analyzing individual nanoparticles in transmission electron microscopy (TEM) images. This paper describes the usefulness and efficiency of the program for nanoparticle analysis and, at the same time, compares it to more conventional nanoparticle analysis techniques: TEM combined with different image analysis methods, and X-ray diffraction techniques. The developed program proved a good addition to the field of particle analysis techniques, since traditional image analysis programs suffer from an inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information on the particles than the manual technique. However, particle shapes that deviate strongly from spherical proved problematic for the novel program as well. Compared to the X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the averaged data it provides from a very large number of particles; however, SAXS does not provide any data about the shape or appearance of the sample.
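
    The baseline step underlying the image-analysis routes compared here is labelling the connected foreground pixels of a binarised TEM image; touching particles then share one label, which is precisely the agglomerate problem described above. A minimal 4-connectivity flood-fill sketch (the tiny test image is invented):

```python
def label_components(image):
    """Label 4-connected foreground regions; returns (labels, count)."""
    rows, cols = len(image), len(image[0])
    labels = [[0] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] and not labels[r][c]:
                count += 1                  # new region found
                stack = [(r, c)]
                while stack:                # iterative flood fill
                    i, j = stack.pop()
                    if (0 <= i < rows and 0 <= j < cols
                            and image[i][j] and not labels[i][j]):
                        labels[i][j] = count
                        stack += [(i + 1, j), (i - 1, j),
                                  (i, j + 1), (i, j - 1)]
    return labels, count

img = [[0, 1, 1, 0, 0],
       [0, 1, 1, 0, 0],
       [0, 0, 0, 0, 1],
       [0, 1, 0, 1, 1]]
labels, n = label_components(img)   # n == 3 separate "particles"
```

    Splitting the merged labels of touching particles is the extra step a program like the one introduced here has to add on top of this baseline.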

  16. Progress on radiochemical analysis for nuclear waste management in decommissioning

    Energy Technology Data Exchange (ETDEWEB)

    Hou, X. (Technical Univ. of Denmark. Center for Nuclear Technologies (NuTech), Roskilde (Denmark))

    2012-01-15

    This report summarizes the progress in the development and improvement of radioanalytical methods for decommissioning and waste management completed in the NKS-B RadWaste 2011 project. Based on the overview of analytical methods in Nordic laboratories and the requirements from the nuclear industry provided in the first phase of the RadWaste project (2010), several methods were improved and developed. A method was developed for the efficient separation of Nb from nuclear waste, especially metals, for the measurement of long-lived 94Nb by gamma spectrometry. Through a systematic investigation of the behaviour of technetium during sample treatment and chromatographic separation, an effective method was developed for the determination of low-level 99Tc in waste samples. An AMS approach was investigated for measuring ultra-low-level 237Np using 242Pu for AMS normalization; the preliminary results show the high potential of this method. Some progress on the characterization of waste for the decommissioning of the Danish reactor DR3 is also presented. (Author)

  17. An Archetypal Analysis on The Pilgrim’s Progress

    Institute of Scientific and Technical Information of China (English)

    杨洛琪

    2014-01-01

    John Bunyan (1628-1688) is one of the most remarkable figures in 17th-century English literature. He is famous for his authorship of The Pilgrim's Progress and has become one of the world's most widely read Christian writers. This thesis attempts to use archetypal theories to analyze the archetypes of Christian culture in The Pilgrim's Progress. According to the theory of archetype, Bunyan's use of biblical images and themes can be called archetypes. These archetypes involve two aspects of the work: its biblical imagery (water) and its themes. Therefore, this thesis tries to explore the underlying archetypal elements so as to represent the work's literary treasures by resorting to the theory of archetypal criticism.

  18. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
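
    One of the stability measures named above, the maximal Lyapunov exponent, can be sketched on a system with a known answer. The example below is not a movement analysis; it uses the fully chaotic logistic map, whose exact exponent is ln 2, to show the underlying average-log-stretching idea:

```python
import math

def lyapunov_logistic(x0, n=100000, burn=1000):
    """Maximal Lyapunov exponent of x -> 4x(1-x), estimated as the time
    average of log |f'(x)| along the orbit (exact value: ln 2)."""
    x, total = x0, 0.0
    for i in range(n + burn):
        if i >= burn:
            total += math.log(abs(4.0 - 8.0 * x))   # |f'(x)| for 4x(1-x)
        x = 4.0 * x * (1.0 - x)
    return total / n

lam = lyapunov_logistic(0.2)   # approximately 0.693
```

    For measured kinematic series one instead reconstructs the state space by delay embedding and tracks the divergence of nearby trajectories (e.g. the Rosenstein approach), but the quantity being estimated is the same.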

  19. Techniques of DNA methylation analysis with nutritional applications.

    Science.gov (United States)

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from the single-nucleotide to the global level, depending on the study goal and scope. In addition, this study highlights the major principles of and methods for DNA methylation analysis with an emphasis on nutritional applications. Recent developments in epigenetic technologies are showing promising results, measuring DNA methylation levels at single-base resolution and providing the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation, such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profiling by microarray- or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the method best suited to their nutritional research interests.

  20. Methods for Progressive Collapse Analysis of Building Structures Under Blast and Impact Loads

    Institute of Scientific and Technical Information of China (English)

    LI Zhongxian; SHI Yanchao

    2008-01-01

    Progressive collapse of building structures under blast and impact loads has attracted great attention all over the world. Progressive collapse analysis is essential for an economic and safe design of building structures against progressive collapse under blast and impact loads. Because of the catastrophic nature of progressive collapse and the potentially high cost of constructing or retrofitting buildings to resist it, it is imperative that progressive collapse analysis methods be reliable. For engineers, the methodology for carrying out a progressive collapse evaluation needs to be not only accurate and concise but also easy to use and fast. Thus, many researchers have recently spent considerable effort on developing reliable, efficient and straightforward progressive collapse analysis methods. In the present paper, the progressive collapse analysis methods available in the literature are reviewed, and their suitability, applicability and reliability are discussed. Our recently proposed method for progressive collapse analysis of reinforced concrete frames under blast loads is also introduced.

  1. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    Directory of Open Access Journals (Sweden)

    T. Subramani

    2014-06-01

    Full Text Available This paper presents an overview of the present state of base isolation techniques, with special emphasis on their application and a brief review of other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained, the provisions of FEMA 450 for base-isolated structures are highlighted, and the effects of base isolation on structures located on soft soils and near active faults are given in brief. A simple case study on natural base isolation using naturally available soils is presented, and future areas of research are indicated. Earthquakes are one of nature's greatest hazards; throughout historic time they have caused significant loss of life and severe damage to property, especially to man-made structures. On the other hand, earthquakes provide architects and engineers with a number of important design criteria foreign to the normal design process. From well-established procedures reviewed by many researchers, seismic isolation may be used to provide an effective solution for a wide range of seismic design problems. The application of base isolation techniques to protect structures against damage from earthquake attacks has been considered one of the most effective approaches and has gained increasing acceptance during the last two decades. This is because base isolation limits the effects of an earthquake: a flexible base largely decouples the structure from the ground motion, and the structural response accelerations are usually less than the ground acceleration. In general, additional viscous damping in the structure may further reduce its displacement and acceleration responses. This study also seeks to evaluate the effects of additional damping on the seismic response when compared with structures without additional damping for different ground motions.
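
    The decoupling argument can be made quantitative with the classical steady-state transmissibility of a single-degree-of-freedom system under harmonic ground motion. The periods and damping ratios below are typical illustrative values, not data from the paper:

```python
import math

def transmissibility(t_n, zeta, t_g):
    """Peak absolute acceleration / peak ground acceleration of a SDOF
    system (natural period t_n, damping ratio zeta) under harmonic
    ground motion of period t_g."""
    r = t_n / t_g                       # forcing/natural frequency ratio
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2
    return math.sqrt(num / den)

t_ground = 0.5                                    # dominant ground period, s
fixed = transmissibility(0.3, 0.05, t_ground)     # stiff fixed-base frame
isolated = transmissibility(2.5, 0.15, t_ground)  # isolated, added damping
```

    Lengthening the period pushes the frequency ratio far past resonance, so the structure rides out the shaking largely decoupled, while the added damping controls the displacement across the isolators.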

  2. 蓝莓深加工技术研究进展%Progress of Blueberry Deep Processing Technique Research

    Institute of Scientific and Technical Information of China (English)

    叶春苗

    2015-01-01

    Blueberry has high nutritional and economic value, and deep processing is an effective way to address the difficulty of storing blueberries. This article reviews the progress of blueberry deep-processing research in the areas of blueberry dairy product processing, blueberry juice and fruit wine processing, blueberry jam processing, and blueberry preserved-fruit processing, in order to provide a reference for the development of the blueberry industry.

  3. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Directory of Open Access Journals (Sweden)

    Mahmoud I. Al-Kadi

    2013-05-01

    Full Text Available Biosignal analysis is one of the most important topics that researchers have developed during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades, and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate, so as to produce a flexible and reliable detection device.
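
    One of the noise-removal steps mentioned above, suppression of power-line interference, can be sketched with a moving-average comb over exactly one mains cycle. The sampling rate, mains frequency and toy signal are illustrative assumptions:

```python
import math

FS = 500                 # sampling rate, Hz (assumed)
MAINS = 50               # power-line frequency, Hz (assumed)
PERIOD = FS // MAINS     # samples per mains cycle

def remove_mains(signal):
    """Average over one mains cycle: 50 Hz and its harmonics cancel."""
    return [sum(signal[i:i + PERIOD]) / PERIOD
            for i in range(len(signal) - PERIOD + 1)]

# toy trace: a 5 Hz rhythm buried under strong 50 Hz interference
t = [n / FS for n in range(1000)]
eeg = [math.sin(2 * math.pi * 5 * ti) + 2.0 * math.sin(2 * math.pi * 50 * ti)
       for ti in t]
clean = remove_mains(eeg)   # interference removed, slow rhythm kept
```

    Practical pipelines instead use notch and band-pass filters plus artifact-specific methods (e.g. for ocular and muscle noise), which is the territory this review surveys.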

  4. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    Science.gov (United States)

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have developed during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques that provide an electrical representation of biosignals reflecting changes in the activity of the human brain. Monitoring the level of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades, and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate, so as to produce a flexible and reliable detection device.

  5. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  6. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H. [Center for Experimental Nuclear Physics and Astrophysics, and Department of Physics, University of Washington, Seattle, WA (United States); Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L. [Pacific Northwest National Laboratory, Richland, WA (United States); Avignone, F. T. [Department of Physics and Astronomy, University of South Carolina, Columbia, SC (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States); Baldenegro-Barrera, C. X.; Bertrand, F. E. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  7. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system for smartphones and portable devices. Its programmable and open nature attracts attackers seeking to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases the security risk. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes, and it gives strategic direction for making the Android operating system more secure.

  8. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
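The "all planets searched at once" idea can be illustrated with a much simpler stand-in: a greedy matching-pursuit over a trial frequency grid, picking the strongest periodic signal and subtracting it before searching again. This is a sketch only, not the published method (which uses compressed sensing with a Gaussian process noise model); the grid, frequencies and data are invented for illustration.

```python
import math

def power_at(t, y, f):
    """Least-squares power of a sinusoid at trial frequency f (Hz)."""
    c = [math.cos(2 * math.pi * f * ti) for ti in t]
    s = [math.sin(2 * math.pi * f * ti) for ti in t]
    cc = sum(ci * ci for ci in c)
    ss = sum(si * si for si in s)
    yc = sum(yi * ci for yi, ci in zip(y, c))
    ys = sum(yi * si for yi, si in zip(y, s))
    return (yc * yc / cc if cc else 0.0) + (ys * ys / ss if ss else 0.0)

def greedy_search(t, y, freqs, n_signals):
    """Matching-pursuit style search: repeatedly pick the strongest trial
    frequency, then subtract its best-fit sinusoid from the residual."""
    residual = list(y)
    found = []
    for _ in range(n_signals):
        best = max(freqs, key=lambda f: power_at(t, residual, f))
        c = [math.cos(2 * math.pi * best * ti) for ti in t]
        s = [math.sin(2 * math.pi * best * ti) for ti in t]
        # Best-fit cosine/sine amplitudes at the chosen frequency.
        a = sum(r * ci for r, ci in zip(residual, c)) / sum(ci * ci for ci in c)
        b = sum(r * si for r, si in zip(residual, s)) / sum(si * si for si in s)
        residual = [r - a * ci - b * si for r, ci, si in zip(residual, c, s)]
        found.append(best)
    return found
```

Unlike a plain periodogram, subtracting each detected signal before the next search suppresses the aliases and leakage of already-found planets, which is the qualitative behaviour the abstract describes.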

  9. Radial velocity data analysis with compressed sensing techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2017-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  10. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  11. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an introduction and 11 independent chapters, which are devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to theoretical aspects while others present practical aspects and the...

  12. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  13. A review of signal processing techniques for heart sound analysis in clinical diagnosis.

    Science.gov (United States)

    Emmanuel, Babatunde S

    2012-08-01

    This paper presents an overview of approaches to the analysis of heart sound signals. The paper reviews the milestones in the development of phonocardiogram (PCG) signal analysis. It describes the various stages involved in the analysis of heart sounds and the discrete wavelet transform as a preferred method for bio-signal processing. In addition, the gaps that still exist between contemporary methods of heart sound signal analysis and their applications for clinical diagnosis are reviewed. A lot of progress has been made, but crucial gaps still exist. The findings of this review are as follows: there is a lack of consensus in research outputs; inter-patient adaptability of signal processing algorithms is still problematic; the process of clinical validation of analysis techniques was not sufficiently rigorous in most of the reviewed literature, and as such data integrity and measurement are still in doubt, which often led to inaccurate interpretation of results. In addition, the existing diagnostic systems are too complex and expensive. The paper concludes that the ability to correctly acquire, analyse and interpret heart sound signals for improved clinical diagnostic processes has become a priority.
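As a minimal illustration of the wavelet machinery such reviews discuss, here is a single-level (and multi-level) Haar discrete wavelet transform, the simplest DWT. This is a sketch only; clinical PCG pipelines typically use higher-order wavelets such as Daubechies families, which this toy does not implement.

```python
import math

def haar_dwt(signal):
    """One level of the Haar DWT: returns (approximation, detail)
    coefficient lists, each half the input length."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / math.sqrt(2))  # low-pass pair average
        detail.append((a - b) / math.sqrt(2))  # high-pass pair difference
    return approx, detail

def haar_idwt(approx, detail):
    """Inverse of one Haar level, reconstructing the original samples."""
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out

def multilevel(signal, levels):
    """Multi-level decomposition: final approximation plus detail bands,
    coarsest detail last."""
    details = []
    cur = list(signal)
    for _ in range(levels):
        cur, d = haar_dwt(cur)
        details.append(d)
    return cur, details
```

The detail bands localize transient events (such as murmurs between S1 and S2) in both time and frequency, which is why the DWT is favoured over the plain Fourier transform for heart sounds.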

  14. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. 
If inferences are to be made concerning food texture from acoustical measures of mastication

  15. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  16. Pattern recognition software and techniques for biological image analysis.

    Science.gov (United States)

    Shamir, Lior; Delaney, John D; Orlov, Nikita; Eckley, D Mark; Goldberg, Ilya G

    2010-11-24

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  17. Efficient geometric rectification techniques for spectral analysis algorithm

    Science.gov (United States)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near-real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented in iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.
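The two one-dimensional resampling steps can be sketched as separable linear-interpolation passes, first along each row, then along each column. This is a simplification: the real algorithm resamples along iso-range and iso-Doppler coordinates that vary with the Doppler parameters, and the per-row coordinate arrays below merely stand in for that curved-grid geometry.

```python
def interp1d(xs, ys, xq):
    """Piecewise-linear interpolation of samples (xs, ys) at query points xq.
    xs must be increasing; queries outside the range are clamped."""
    out = []
    for x in xq:
        if x <= xs[0]:
            out.append(ys[0]); continue
        if x >= xs[-1]:
            out.append(ys[-1]); continue
        lo = 0
        while xs[lo + 1] < x:  # find the bracketing interval
            lo += 1
        t = (x - xs[lo]) / (xs[lo + 1] - xs[lo])
        out.append(ys[lo] * (1 - t) + ys[lo + 1] * t)
    return out

def rectify(image, row_coords, col_coords, row_grid, col_grid):
    """Two separable 1-D resampling passes: first along each row (range
    direction, with per-row curved-grid coordinates), then along each
    column (azimuth direction), onto a uniform rectangular grid."""
    pass1 = [interp1d(row_coords[i], row, row_grid) for i, row in enumerate(image)]
    cols = list(zip(*pass1))
    pass2 = [interp1d(col_coords, list(col), col_grid) for col in cols]
    return [list(r) for r in zip(*pass2)]
```

Splitting the 2-D resampling into two 1-D passes is what makes the rectification cheap enough for near-real-time processing.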

  18. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    Science.gov (United States)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S environments and infrastructure.

  19. Development Progress of Segmented Gamma Scanning Analysis Equipment

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The measurement technology of segmented gamma scanning (SGS) has wide application in the fields of nuclear material and nuclear waste because of its advantage of non-destructive analysis of non-uniform objects.

  20. Research progress in nonlinear analysis of heart electric activities

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Nonlinear science research is a hot topic worldwide. It has deepened our understanding of determinism and randomness, simplicity and complexity, noise and order, and it will profoundly influence the progress of natural science, including life science. Life is the most complex nonlinear system, and the heart is the core of the life-cycle system. Over the last 20-odd years, nonlinear research on cardiac electrical activity has made considerable headway. The commonly used parameters are based on chaos and fractal theory, such as the correlation dimension, Lyapunov exponent, Kolmogorov entropy and multifractal singularity spectrum. This paper summarizes the methods commonly used in the nonlinear study of cardiac electrical signals. Then, considering the shortcomings of the above traditional nonlinear parameters, we mainly introduce results on short-term heart rate variability (HRV) signals (500 R-R intervals) and HFECG signals (1-2 s). Finally, we point out that it is worthwhile to emphasize the study of sensitive nonlinearity parameters of short-term cardiac electrical signals, their dynamic character and their clinical effectiveness.
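One of the chaos-theoretic parameters mentioned, the correlation dimension, can be estimated with a bare-bones Grassberger-Procaccia sketch: time-delay embedding followed by a two-point slope of log C(r) versus log r. This is illustrative only; the embedding and radius choices are assumptions, and a logistic-map series stands in for an R-R interval sequence.

```python
import math

def embed(series, dim, tau):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    n = len(series) - (dim - 1) * tau
    return [[series[i + k * tau] for k in range(dim)] for i in range(n)]

def correlation_sum(points, r):
    """C(r): fraction of point pairs closer than radius r (Euclidean)."""
    n, close = len(points), 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < r:
                close += 1
    return 2 * close / (n * (n - 1))

def correlation_dimension(series, dim=3, tau=1, r1=0.05, r2=0.2):
    """Two-point slope estimate of log C(r) vs log r, since C(r) ~ r^D."""
    pts = embed(series, dim, tau)
    c1, c2 = correlation_sum(pts, r1), correlation_sum(pts, r2)
    return (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))
```

In practice one fits the slope over a whole scaling region of radii, not just two points, and short records (such as 500 R-R intervals) make that fit the delicate part of the analysis.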

  1. Emerging techniques for soil analysis via mid-infrared spectroscopy

    Science.gov (United States)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, in particular:

    1. Attenuated total reflectance (ATR). Attenuated total reflectance is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range, as well as the absorbance of some soil constituents (e.g., calcium carbonate), interferes with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species.

    2. Photo-acoustic spectroscopy. Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are

  2. Demonstration Technology Application and Analysis on the Scientific and Technological Progress

    OpenAIRE

    Qingzhu Qi; Zhixiao Jiang

    2013-01-01

    This paper takes Tianjin as an example and analyzes the development trend of scientific and technological progress in Tianjin. From five aspects, namely 'environment of scientific and technological progress', 'input of scientific and technological activities', 'output of scientific and technological activities', 'high-tech industrialization', and 'science and technology for economic and social development', the paper analyzes the correlation between GDP and scientific and technological progress. Research...

  3. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (~0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.
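The core VISAR reduction step, recovering a phase map (and hence a velocity map) from phase-stepped interferograms, can be sketched as follows. This uses a generic four-step quadrature scheme; the actual 2d-VISAR push-pull optics and the velocity-per-fringe (VPF) constant are instrument-specific assumptions here.

```python
import math

def phase_map(i0, i90, i180, i270):
    """Per-pixel phase from four phase-stepped interferogram images
    (lists of rows): phi = atan2(I90 - I270, I0 - I180)."""
    return [[math.atan2(a - c, b - d)
             for a, b, c, d in zip(r90, r0, r270, r180)]
            for r90, r0, r270, r180 in zip(i90, i0, i270, i180)]

def velocity_map(phase, ref_phase, vpf):
    """Per-pixel velocity from the phase change relative to a reference
    (static-target) image; vpf is velocity per fringe, one fringe being
    2*pi of phase.  Assumes phase changes below one fringe (no unwrapping)."""
    return [[vpf * (p - p0) / (2 * math.pi)
             for p, p0 in zip(row, row0)]
            for row, row0 in zip(phase, ref_phase)]
```

A full analysis must also unwrap phase jumps larger than one fringe and correct for the instrument's spatial distortions, which is where most of the practical effort goes.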

  4. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: A technique that helps management to reduce costs and improve quality is ‘lean supply chain management’, which focuses on the elimination of all wastes in every stage of the supply chain and is derived from ‘agile production’. This research aims to assess and rank the suppliers in an auto industry, based upon the concept of ‘production leanness’. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We have examined the literature about leanness, and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA) technique.

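Ranking items by their first principal component score, as done here for suppliers, can be sketched with a small power-iteration PCA. The data shape and iteration count are illustrative assumptions, and the usual PCA sign ambiguity of the component applies.

```python
import math

def first_principal_component(rows):
    """Dominant eigenvector of the sample covariance matrix via power
    iteration.  rows: list of observation vectors (suppliers x criteria).
    Returns (component, per-row scores along that component)."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    centered = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(c[i] * c[j] for c in centered) / (n - 1) for j in range(d)]
           for i in range(d)]
    v = [1.0] * d
    for _ in range(200):  # power iteration converges to the top eigenvector
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    scores = [sum(c[j] * v[j] for j in range(d)) for c in centered]
    return v, scores
```

Sorting suppliers by their score along the first component then gives a one-dimensional "leanness" ranking; with 76 factors one would normally standardize each column first and inspect how much variance the component explains.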

  5. Total RNA Sequencing Analysis of DCIS Progressing to Invasive Breast Cancer

    Science.gov (United States)

    2015-09-01

    AWARD NUMBER: W81XWH-14-1-0080 TITLE: Total RNA Sequencing Analysis of DCIS Progressing to Invasive Breast Cancer. PRINCIPAL INVESTIGATOR...extracts. All samples have undergone a comprehensive DNA methylome analysis using the Illumina 450K CpG arrays, with excellent call rates, the

  6. Molten metal analysis by laser produced plasmas. Technical progress report

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yong W.

    1994-02-01

    A new method of molten metal analysis, based on time- and space-resolved spectroscopy of a laser-produced plasma (LPP) plume of a molten metal surface, has been implemented in the form of a prototype LPP sensor-probe, allowing in-situ analysis in less than 1 minute. The research at Lehigh University has been structured in 3 phases: laboratory verification of concept, comparison of LPP method with conventional analysis of solid specimens and field trials of prototype sensor-probe in small-scale metal shops, and design/production/installation of two sensor-probes in metal production shops. Accomplishments in the first 2 phases are reported. 6 tabs, 3 figs.

  7. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants." The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  8. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants." The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  9. Progress in Conceptual Design and Analysis of Advanced Rotorcraft

    Science.gov (United States)

    Yamauchi, Gloria K.

    2012-01-01

    This presentation will give information on Multi-Disciplinary Analysis and Technology Development, including its objectives and how they will be met. In addition, it will present recent highlights, including the Lift-Offset Civil Design and its study conclusions, as well as the LCTR2 Propulsion Concept's study conclusions. Recent and future publications will also be discussed.

  10. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    Energy Technology Data Exchange (ETDEWEB)

    Amato, G. [ISTI-CNR, Area della Ricerca, Via Moruzzi 1, 56124, Pisa (Italy); Cristoforetti, G.; Legnaioli, S.; Lorenzetti, G.; Palleschi, V. [IPCF-CNR, Area della Ricerca, Via Moruzzi 1, 56124, Pisa (Italy); Sorrentino, F., E-mail: sorrentino@fi.infn.i [Dipartimento di Fisica e astronomia, Universita di Firenze, Polo Scientifico, via Sansone 1, 50019 Sesto Fiorentino (Italy); Istituto di Cibernetica CNR, via Campi Flegrei 34, 80078 Pozzuoli (Italy); Marwan Technology, c/o Dipartimento di Fisica ' E. Fermi' , Largo Pontecorvo 3, 56127 Pisa (Italy); Tognoni, E. [INO-CNR, Area della Ricerca, Via Moruzzi 1, 56124 Pisa (Italy)

    2010-08-15

    In this communication, we will illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval techniques. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We suppose we have a database containing the peaks of all elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.
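The peak-vector ranking described above can be sketched directly: weight each peak by intensity over the number of database peaks sharing its wavelength bin (an IDF-like factor), then rank elements by cosine similarity to the sample vector. The wavelength-bin database below is invented for illustration; real LIBS databases index thousands of lines with tolerance windows rather than exact bins.

```python
import math
from collections import Counter

def weight_peaks(peaks, neighbour_counts):
    """Weight of each peak: intensity divided by the number of database
    peaks falling in the same wavelength bin (the IDF-like factor)."""
    return {wl: inten / neighbour_counts[wl] for wl, inten in peaks.items()}

def cosine(u, v):
    """Cosine similarity of two sparse vectors held as dicts."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in keys)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def rank_elements(sample_peaks, element_db):
    """element_db: {element: {wavelength_bin: relative_intensity}}.
    Rank elements by similarity between weighted peak vectors."""
    counts = Counter(wl for peaks in element_db.values() for wl in peaks)
    sample_vec = {wl: i / counts.get(wl, 1) for wl, i in sample_peaks.items()}
    scored = {el: cosine(sample_vec, weight_peaks(peaks, counts))
              for el, peaks in element_db.items()}
    return sorted(scored, key=scored.get, reverse=True)
```

The neighbourhood count plays exactly the role of inverse document frequency in text retrieval: a peak in a crowded spectral region is weak evidence for any one element, just as a common word is weak evidence for any one document.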

  11. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    Science.gov (United States)

    Amato, G.; Cristoforetti, G.; Legnaioli, S.; Lorenzetti, G.; Palleschi, V.; Sorrentino, F.; Tognoni, E.

    2010-08-01

    In this communication, we will illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval techniques. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We suppose we have a database containing the peaks of all elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.

  12. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level of preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  13. Sentiment Analysis of Twitter tweets using supervised classification technique

    Directory of Open Access Journals (Sweden)

    Pranav Waykar

    2016-05-01

    Full Text Available Making use of social media for analyzing the perceptions of the masses over a product, event or a person has gained momentum in recent times. Out of a wide array of social networks, we chose Twitter for our analysis as the opinions expressed there are concise and bear a distinctive polarity. Here, we collect the most recent tweets on users' area of interest and analyze them. The extracted tweets are then segregated as positive, negative and neutral. We do the classification in the following manner: collect the tweets using the Twitter API; then process the collected tweets to convert all letters to lowercase, eliminate special characters, etc., which makes the classification more efficient; finally, classify the processed tweets using a supervised classification technique. We make use of a Naive Bayes classifier to segregate the tweets as positive, negative and neutral. We use a set of sample tweets to train the classifier. The percentage of tweets in each category is then computed and the result is represented graphically. The result can be used further to gain an insight into the views of the people using Twitter about a particular topic that is being searched by the user. It can help corporate houses devise strategies on the basis of the popularity of their product among the masses. It may help consumers to make informed choices based on the general sentiment expressed by Twitter users on a product.
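
A minimal sketch of the pipeline described above (collect, preprocess, classify with Naive Bayes, tally percentages), using scikit-learn in place of the paper's unspecified implementation; the tiny training set and query tweets are invented for illustration only:

```python
import re
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

def preprocess(tweet):
    """Lowercase and strip special characters, as the abstract describes."""
    return re.sub(r"[^a-z0-9\s]", "", tweet.lower())

# Hypothetical labeled sample tweets used to train the classifier
train_tweets = [
    "I love this phone, great battery!", "Awesome match today :)",
    "Worst service ever, totally disappointed", "I hate the new update",
    "The event starts at 5pm", "Meeting rescheduled to Monday",
]
train_labels = ["positive", "positive", "negative", "negative", "neutral", "neutral"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(preprocess(t) for t in train_tweets)
clf = MultinomialNB().fit(X, train_labels)

# Newly "collected" tweets to classify
new_tweets = ["I love the battery", "totally disappointed with service"]
pred = clf.predict(vectorizer.transform(preprocess(t) for t in new_tweets))

# Percentage of tweets per category, as in the paper's graphical summary
counts = {c: list(pred).count(c) / len(pred) * 100 for c in set(pred)}
```

In practice the training set would come from manually labeled tweets fetched via the Twitter API rather than the toy strings above.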

  14. An evaluation of wind turbine blade cross section analysis techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Paquette, Joshua A.; Griffith, Daniel Todd; Laird, Daniel L.; Resor, Brian Ray

    2010-03-01

    The blades of a modern wind turbine are critical components central to capturing and transmitting most of the load experienced by the system. They are complex structural items composed of many layers of fiber and resin composite material and, typically, one or more shear webs. Large turbine blades being developed today are beyond the point of effective trial-and-error design of the past, and design for reliability is always extremely important. Section analysis tools are used to reduce the three-dimensional continuum blade structure to a simpler beam representation for use in system response calculations to support full system design and certification. One model simplification approach is to analyze the two-dimensional blade cross sections to determine the properties for the beam. Another technique is to determine beam properties using static deflections of a full three-dimensional finite element model of a blade. This paper provides insight into discrepancies observed in outputs from each approach. Simple two-dimensional geometries and three-dimensional blade models are analyzed in this investigation. Finally, a subset of computational and experimental section properties for a full turbine blade are compared.

  15. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin) sponsored by NRC and FA (Fragility Analysis) sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed.

  16. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in the thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
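
The correlation metric at the heart of this record can be sketched in a few lines: ink-density profiles measured from the blanket on paper are compared via the Pearson correlation coefficient, with high correlation read as consistency. The density values below are hypothetical stand-ins, not the paper's measurements:

```python
import numpy as np

def consistency(profile_a, profile_b):
    """Pearson correlation coefficient between two ink-density profiles."""
    return float(np.corrcoef(profile_a, profile_b)[0, 1])

rng = np.random.default_rng(0)
baseline = np.linspace(0.9, 1.4, 50)             # density ramp on a test strip
normal = baseline + rng.normal(0, 0.005, 50)     # normal printing: small noise
smashed = baseline + rng.normal(0, 0.08, 50)     # after smash: larger deviation

r_normal = consistency(baseline, normal)    # near 1: consistent reproduction
r_smashed = consistency(baseline, smashed)  # lower: inconsistency after smash
```

A blanket that has recovered well after smash would show `r_smashed` climbing back toward `r_normal` across repeated prints.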

  17. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  18. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    Science.gov (United States)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be budgeted. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
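
The statistical budgeting idea can be sketched as a Monte Carlo sum of loss terms: each mechanism contributes an attenuation in dB drawn from an assumed distribution, and the confidence level is the fraction of samples meeting the requirement. All mechanism names, means and sigmas below are hypothetical, not SIM's actual budget:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical loss mechanisms: (mean dB, sigma dB) for each
mechanisms = {
    "design efficiency": (1.0, 0.10),
    "material attenuation": (0.5, 0.05),
    "element misalignment": (0.8, 0.30),
    "diffraction": (0.3, 0.05),
    "coupling efficiency": (1.2, 0.40),
}
# Losses in dB add, so the total is a sum of per-mechanism samples
total_loss_db = sum(rng.normal(mu, sigma, n) for mu, sigma in mechanisms.values())

source_dbm = 0.0      # hypothetical input optical power
required_dbm = -5.0   # hypothetical minimum delivered power
delivered = source_dbm - total_loss_db

# Numerical confidence level of having sufficient optical power
confidence = float((delivered >= required_dbm).mean())
```

The same machinery accommodates degradation over time by widening (or shifting) the per-mechanism distributions and re-reading the confidence level.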

  19. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
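
The stratification idea itself can be illustrated outside any transport code: instead of drawing all source points i.i.d., the source domain is split into equal strata that each receive the same number of samples, which guarantees coverage of every stratum and reduces estimator variance. The toy sketch below uses an arbitrary stand-in response function, not a neutronics model:

```python
import numpy as np

def f(x):
    return np.exp(-3 * x)  # stand-in "response" of a source at position x

rng = np.random.default_rng(7)
n, strata, reps = 1000, 10, 200

# Conventional sampling: n i.i.d. uniform source points per batch
plain_est = [f(rng.random(n)).mean() for _ in range(reps)]

# Stratified sampling: n/strata points drawn inside each of the strata bins
strat_est = []
edges = np.arange(strata) / strata
for _ in range(reps):
    x = (edges[:, None] + rng.random((strata, n // strata)) / strata).ravel()
    strat_est.append(f(x).mean())

var_plain = float(np.var(plain_est))
var_strat = float(np.var(strat_est))  # much smaller than var_plain
```

In the eigenvalue setting the strata would be the loosely-coupled units of the configuration, ensuring no unit is starved of source particles between generations.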

  20. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory]; Barefield, James E [Los Alamos National Laboratory]; Wiens, Roger C [Los Alamos National Laboratory]; Sklute, Elizabeth [MT HOLYOKE COLLEGE]; Dyare, Melinda D [MT HOLYOKE COLLEGE]

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.

  1. Progress in SIFT-MS: breath analysis and other applications.

    Science.gov (United States)

    Spaněl, Patrik; Smith, David

    2011-01-01

    The development of selected ion flow tube mass spectrometry, SIFT-MS, is described from its inception as the modified very large SIFT instruments used to demonstrate the feasibility of SIFT-MS as an analytical technique, towards the smaller but bulky transportable instruments and finally to the current smallest Profile 3 instruments that have been located in various places, including hospitals and schools to obtain on-line breath analyses. The essential physics and engineering principles are discussed, which must be appreciated to design and construct a SIFT-MS instrument. The versatility and sensitivity of the Profile 3 instrument is illustrated by typical mass spectra obtained using the three precursor ions H3O+, NO+ and O2+·, and the need to account for differential ionic diffusion and mass discrimination in the analytical algorithms is emphasized to obtain accurate trace gas analyses. The performance of the Profile 3 instrument is illustrated by the results of several pilot studies, including (i) on-line real time quantification of several breath metabolites for cohorts of healthy adults and children, which have provided representative concentration/population distributions, and the comparative analyses of breath exhaled via the mouth and nose that identify systemic and orally-generated compounds, (ii) the enhancement of breath metabolites by drug ingestion, (iii) the identification of HCN as a marker of Pseudomonas colonization of the airways and (iv) emission of volatile compounds from urine, especially ketone bodies, and from skin. Some very recent developments are discussed, including the quantification of carbon dioxide in breath and the combination of SIFT-MS with GC and ATD, and their significance. Finally, prospects for future SIFT-MS developments are alluded to.

  2. Comparative Analysis of Vehicle Make and Model Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Faiza Ayub Syed

    2014-03-01

    Full Text Available Vehicle Make and Model Recognition (VMMR) has emerged as a significant element of vision-based systems because of its application in access control systems, traffic control and monitoring systems, security systems and surveillance systems, etc. So far a number of techniques have been developed for vehicle recognition. Each technique follows a different methodology and classification approach. The evaluation results highlight the recognition technique with the highest accuracy level. In this paper we have pointed out the working of various vehicle make and model recognition techniques and compared these techniques on the basis of methodology, principles, classification approach, classifier and level of recognition. After comparing these factors we concluded that Locally Normalized Harris Corner Strengths (LNHS) performs best as compared to other techniques. LNHS uses Bayes and K-NN classification approaches for vehicle classification. It extracts information from the frontal view of vehicles for vehicle make and model recognition.
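
The classification stage of the best-performing technique can be sketched with k-NN from scikit-learn. The 16-dimensional feature vectors below are random stand-ins for corner-strength descriptors extracted from frontal views; the make/model labels and cluster centers are hypothetical:

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(3)

# Hypothetical descriptors: each make/model clusters around its own centroid
centroids = {"make_a_model_x": 0.0, "make_b_model_y": 5.0}
X, y = [], []
for label, mu in centroids.items():
    X.append(rng.normal(mu, 0.5, (30, 16)))  # 30 training vehicles, 16-D each
    y += [label] * 30
X = np.vstack(X)

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# Descriptor from an unseen frontal view of a make_b_model_y vehicle
query = rng.normal(5.0, 0.5, (1, 16))
predicted = knn.predict(query)[0]
```

In a real VMMR pipeline the descriptors would come from normalized Harris corner responses on registered frontal-view images, not from the Gaussian clusters used here.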

  3. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600-750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis and a fluorometric discrimination technique for determining phytoplankton populations was developed. For the laboratory mixed samples prepared from the 43 algal species (the algae of one division accounting for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the division level were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for the samples mixed from 32 red tide algal species (the dominant species accounting for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the genus level were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the division and genus levels, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the developing of an in situ algae fluorescence auto-analyzer for

  4. A computerised morphometric technique for the analysis of intimal hyperplasia.

    OpenAIRE

    Tennant, M; McGeachie, J K

    1991-01-01

    The aim of this study was to design, develop and employ a method for the acquisition of a significant data base of thickness measurements. The integration of standard histological techniques (step serial sectioning), modern computer technology and a personally developed software package (specifically designed for thickness measurement) produced a novel technique suitable for the task. The technique allowed the elucidation of a larger data set from tissue samples. Thus a detailed and accurate ...

  5. Biomechanical analysis of cross-country skiing techniques.

    Science.gov (United States)

    Smith, G A

    1992-09-01

    The development of new techniques for cross-country skiing based on skating movements has stimulated biomechanical research aimed at understanding the various movement patterns, the forces driving the motions, and the mechanical factors affecting performance. Research methods have evolved from two-dimensional kinematic descriptions of classic ski techniques to three-dimensional analyses involving measurement of the forces and energy relations of skating. While numerous skiing projects have been completed, most have focused on either the diagonal stride or the V1 skating technique on uphill terrain. Current understanding of skiing mechanics is not sufficiently complete to adequately assess and optimize an individual skier's technique.

  6. COMPARATIVE ANALYSIS OF SATELLITE IMAGE PRE-PROCESSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Sree Sharmila

    2013-01-01

    Full Text Available Satellite images are corrupted by noise in their acquisition and transmission. The removal of noise from the image by attenuating the high-frequency image components removes some important details as well. In order to retain the useful information and improve the visual appearance, effective denoising and resolution enhancement techniques are required. In this research, a Hybrid Directional Lifting (HDL) technique is proposed to retain the important details of the image and improve the visual appearance. A Discrete Wavelet Transform (DWT) based interpolation technique is developed for enhancing the resolution of the denoised image. The performance of the proposed techniques is tested on Land Remote-Sensing Satellite (LANDSAT) images, using the quantitative performance measure Peak Signal to Noise Ratio (PSNR) and computation time to show the significance of the proposed techniques. The HDL technique increases PSNR by 1.02 dB compared with the standard denoising technique, and the DWT-based interpolation technique increases it by 3.94 dB. The experimental results reveal that the newly developed image denoising and resolution enhancement techniques improve the image visual quality with rich textures.
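
The PSNR figure of merit used in this record is easy to sketch for 8-bit images; the arrays below are synthetic stand-ins for a LANDSAT band and two denoised estimates of it:

```python
import numpy as np

def psnr(reference, estimate, peak=255.0):
    """Peak Signal-to-Noise Ratio in dB between two images."""
    mse = np.mean((reference.astype(float) - estimate.astype(float)) ** 2)
    return float(10 * np.log10(peak ** 2 / mse))

rng = np.random.default_rng(5)
clean = rng.integers(0, 256, (64, 64)).astype(float)   # stand-in "truth"
noisy = np.clip(clean + rng.normal(0, 10, clean.shape), 0, 255)   # weak denoiser
better = np.clip(clean + rng.normal(0, 5, clean.shape), 0, 255)   # better denoiser

gain_db = psnr(clean, better) - psnr(clean, noisy)  # improvement, in dB
```

A gain of 1.02 dB, as reported for HDL, corresponds to roughly a 21% reduction in mean squared error against the reference image.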

  7. Progress on Radiochemical Analysis for Nuclear Waste Management in Decommissioning

    DEFF Research Database (Denmark)

    Hou, Xiaolin; Qiao, Jixin; Shi, Keliang

    to these activities, the pure beta and alpha emitters have to be chemically separated from the matrix and other radionuclides before measurement. Although much effort has been carried out, the accurate determination of them is still a major challenge because of the complex matrix and high requirement in radiochemical...... separation of radionuclides. In order to improve and maintain the Nordic competence in analysis of radionuclides in waste samples, a NKS B project on this topic was launched in 2009. During the first phase of the NKS-B RadWaste project (2009-2010), good achievement has been reached on establishment...... for determination of long-lived 94Nb in the nuclear waste; (2) development of a sensitive method for measurement of 237Np using AMS; (3) improvement of analytical method for determination of 99Tc using ICP-MS; (4) improvement of method for 14C measurement using LSC; and (5) characterization of steel samples from

  8. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  9. Progress of Vascular Cast Technique

    Institute of Scientific and Technical Information of China (English)

    潘雪梅; 周军

    2012-01-01

    Objective: To explore the preparation of vascular casts and their application in anatomy and clinical medicine. Methods: 'Vascular, cast, anatomy, vein and artery' were used as key words to search the CNKI and PubMed full-text database retrieval systems for the period from Jan. 1991 to Dec. 2011. A total of 2,046 Chinese papers and 197 English papers were obtained. The related literature was read and the progress of the vascular cast technique was summarized. Results: Vascular casting is an established method of anatomical preparation that has produced models showing the artery and vein networks of diverse organs. It displays the three-dimensional morphology of vessels and provides a basis for artery-preserving surgery and endovascular interventional treatment. Conclusions: Vascular casting has proven to be an excellent method for the examination of vessels. After about 30 years of effort it has matured, and it has recently revived with the development of anatomy and clinical medicine.

  10. Accident progression event tree analysis for postulated severe accidents at N Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M. (Sandia National Labs., Albuquerque, NM (USA)); Medford, G.T. (Science Applications International Corp., Albuquerque, NM (USA))

    1990-06-01

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
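
Latin Hypercube sampling, the uncertainty-propagation scheme named above, can be sketched in a few lines: each of d uncertain parameters is split into n equiprobable bins, and each bin is sampled exactly once, guaranteeing even coverage of every marginal distribution:

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n samples in [0,1)^d with exactly one sample per bin in every dimension."""
    samples = np.empty((n, d))
    for j in range(d):
        # one uniform draw inside each of the n bins, in shuffled bin order
        perm = rng.permutation(n)
        samples[:, j] = (perm + rng.random(n)) / n
    return samples

rng = np.random.default_rng(11)
pts = latin_hypercube(100, 3, rng)

# Each 1/n-wide bin of each dimension contains exactly one sample
bins = np.floor(pts * 100).astype(int)
```

In an application like the one above, each column would be mapped through the inverse CDF of one uncertain phenomenological parameter before being fed to the accident progression model.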

  11. Error Analysis for the Airborne Direct Georeferencing Technique

    Science.gov (United States)

    Elsharkawy, Ahmed S.; Habib, Ayman F.

    2016-10-01

    Direct georeferencing was shown to be an important alternative to standard indirect image orientation using classical or GPS-supported aerial triangulation. Since direct georeferencing without ground control relies on an extrapolation process only, particular focus has to be laid on the overall system calibration procedure. The accuracy performance of integrated GPS/inertial systems for direct georeferencing in airborne photogrammetric environments has been tested extensively in recent years. In this approach, the limiting factor is a correct overall system calibration including the GPS/inertial component as well as the imaging sensor itself. Therefore, remaining errors in the system calibration will significantly decrease the quality of object point determination. This research paper presents an error analysis for the airborne direct georeferencing technique, in which integrated GPS/IMU positioning and navigation systems are used in conjunction with aerial cameras for airborne mapping, compared with GPS/INS-supported AT, through the implementation of a certain amount of error on the EOP and boresight parameters and a study of the effect of these errors on the final ground coordinates. The data set is a block of 32 images distributed over six flight lines; the interior orientation parameters (IOP) are known through a careful camera calibration procedure, and 37 ground control points are known through a terrestrial surveying procedure. The exact location of the camera station at the time of exposure, the exterior orientation parameters (EOP), is known through the GPS/INS integration process. The preliminary results show that, firstly, DG and GPS-supported AT have similar accuracy, and compared with the conventional aerial photography method the two technologies reduce the dependence on ground control (used only for quality control purposes). Secondly, in DG, correct overall system calibration, including the GPS/inertial component as well as the imaging sensor itself, is essential.

  12. Advanced patch-clamp techniques and single-channel analysis

    NARCIS (Netherlands)

    Biskup, B; Elzenga, JTM; Homann, U; Thiel, G; Wissing, F; Maathuis, FJM

    1999-01-01

    Much of our knowledge of ion-transport mechanisms in plant cell membranes comes from experiments using voltage-clamp. This technique allows the measurement of ionic currents across the membrane, whilst the voltage is held under experimental control. The patch-clamp technique was developed to study t

  13. Progression Analysis and Stage Discovery in Continuous Physiological Processes Using Image Computing

    Directory of Open Access Journals (Sweden)

    Ferrucci Luigi

    2010-01-01

    Full Text Available We propose an image computing-based method for quantitative analysis of continuous physiological processes that can be sensed by medical imaging and demonstrate its application to the analysis of morphological alterations of the bone structure, which correlate with the progression of osteoarthritis (OA. The purpose of the analysis is to quantitatively estimate OA progression in a fashion that can assist in understanding the pathophysiology of the disease. Ultimately, the texture analysis will be able to provide an alternative OA scoring method, which can potentially reflect the progression of the disease in a more direct fashion compared to the existing clinically utilized classification schemes based on radiology. This method can be useful not just for studying the nature of OA, but also for developing and testing the effect of drugs and treatments. While in this paper we demonstrate the application of the method to osteoarthritis, its generality makes it suitable for the analysis of other progressive clinical conditions that can be diagnosed and prognosed by using medical imaging.

  14. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available Information provided by any application system in an organization is vital in order to reach a decision. Due to this factor, the quality of data provided by a Data Warehouse (DW) is really important for an organization to produce the best solutions for the company to move forward. DWs are complex systems that have to deliver highly-aggregated, high quality data from heterogeneous sources to decision makers. They involve a lot of integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems. DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in a DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques perform comparisons between target values and current values gained from the systems. A prototype using PHP was developed to support the Base Analysis Technique. Then a sample schema from an Oracle database was used to study the differences between applying the framework and not. The prototype was demonstrated to the selected organizations to identify whether it would help to reduce DQ problems. Questionnaires were given to respondents. Results: The results show users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: The implementation of the suggested framework in a real situation needs to be conducted to obtain more accurate results.
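
The two checks can be sketched in code: metadata analysis compares declared target values against values measured from the warehouse, while base analysis flags individual rows that violate base rules. The schema, rules and rows below are hypothetical (the paper's prototype is in PHP; Python is used here only for brevity):

```python
# Hypothetical warehouse rows
rows = [
    {"id": 1, "age": 34, "email": "a@x.com"},
    {"id": 2, "age": -5, "email": "b@x.com"},  # violates the age rule
    {"id": 3, "age": 51, "email": None},       # missing value
]

# Metadata analysis: target values declared for the table vs current values
target = {"row_count": 3, "non_null_email": 3}
current = {
    "row_count": len(rows),
    "non_null_email": sum(r["email"] is not None for r in rows),
}
metadata_issues = {k: (target[k], current[k])
                   for k in target if target[k] != current[k]}

# Base analysis: per-row rule checks against the raw data
base_issues = [r["id"] for r in rows
               if not (r["age"] is not None and 0 <= r["age"] <= 120)]
```

Here metadata analysis catches the missing email at the aggregate level, while base analysis pinpoints the row with an impossible age.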

  15. Single cell and single molecule techniques for the analysis of the epigenome

    Science.gov (United States)

    Wallin, Christopher Benjamin

    Epigenetic regulation is a critical biological process for the health and development of a cell. Epigenetic regulation is facilitated by covalent modifications to the underlying DNA and chromatin proteins. A fundamental understanding of these epigenetic modifications and their associated interactions at the molecular scale is necessary to explain phenomena including cellular identity, stem cell plasticity, and neoplastic transformation. It is widely known that abnormal epigenetic profiles have been linked to many diseases, most notably cancer. While the field of epigenetics has progressed rapidly with conventional techniques, significant advances remain to be made with respect to combinatoric analysis of epigenetic marks and single cell epigenetics. Therefore, in this dissertation, I will discuss our development of devices and methodologies to address these pertinent issues. First, we designed a preparatory polydimethylsiloxane (PDMS) microdevice for the extraction, purification, and stretching of human chromosomal DNA and chromatin from small cell populations down to a single cell. The valveless device captures cells by size exclusion within the micropillars, entraps the DNA or chromatin in the micropillars after cell lysis, purifies away the cellular debris, and fluorescently labels the DNA and/or chromatin all within a single reaction chamber. With the device, we achieve nearly 100% extraction efficiency of the DNA. The device is also used for in-channel immunostaining of chromatin followed by downstream single molecule chromatin analysis in nanochannels (SCAN). Second, using multi-color, time-correlated single molecule measurements in nanochannels, simultaneous coincidence detection of 2 epigenetic marks is demonstrated. Coincidence detection of 3 epigenetic marks is also established using a pulsed interleaved excitation scheme. With these two promising results, genome-wide quantification of epigenetic marks was pursued. 
Unfortunately, quantitative SCAN never

  16. Multidimensional scaling technique for analysis of magnetic storms at Indian observatories

    Indian Academy of Sciences (India)

    M Sridharan; A M S Ramasamy

    2002-12-01

    Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
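
    A minimal sketch of the classical (Torgerson) multidimensional scaling computation such an analysis relies on, applied to a toy distance matrix rather than the paper's geomagnetic H-component data (the station positions below are invented for illustration):

```python
import numpy as np

def classical_mds(D, k=1):
    """Classical (Torgerson) multidimensional scaling.
    D: (n, n) matrix of pairwise distances; returns (n, k) coordinates."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered squared distances
    w, V = np.linalg.eigh(B)                 # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]            # keep the top-k eigenpairs
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

# Three hypothetical stations at positions 0, 1, 3 along one dimension:
# MDS should recover a configuration with the same pairwise distances.
pos = np.array([0.0, 1.0, 3.0])
D = np.abs(pos[:, None] - pos[None, :])
X = classical_mds(D, k=1).ravel()
recovered = np.abs(X[:, None] - X[None, :])
print(np.allclose(recovered, D))  # → True
```

    The recovered coordinates match the original configuration up to sign and translation, which is all MDS promises.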

  17. In Vivo Imaging Techniques: A New Era for Histochemical Analysis

    Science.gov (United States)

    Busato, A.; Feruglio, P. Fumene; Parnigotto, P.P.; Marzola, P.; Sbarbati, A.

    2016-01-01

    In vivo imaging techniques can be integrated with classical histochemistry to create an actual histochemistry of water. In particular, Magnetic Resonance Imaging (MRI), an imaging technique primarily used as a diagnostic tool in clinical/preclinical research, has excellent anatomical resolution, unlimited penetration depth and intrinsic soft tissue contrast. Thanks to technological development, MRI is capable of providing not only morphological information but also, more interestingly, functional, biophysical and molecular information. In this paper we describe the main features of several advanced imaging techniques, such as MRI microscopy, Magnetic Resonance Spectroscopy, functional MRI, Diffusion Tensor Imaging and contrast-enhanced MRI, as a useful support to classical histochemistry. PMID:28076937

  18. Image analysis techniques associated with automatic data base generation.

    Science.gov (United States)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  19. Cepstrum Analysis: An Advanced Technique in Vibration Analysis of Defects in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    M. Satyam

    1994-01-01

    Full Text Available Conventional frequency analysis of machinery vibration is not adequate to accurately find defects in gears, bearings and blades where sidebands and harmonics are present, and such an approach is also dependent on the transmission path. Cepstrum analysis, on the other hand, accurately identifies harmonic and sideband families and is a better technique for fault diagnosis in gears, bearings and turbine blades of ships and submarines. The cepstrum represents the global power content of a whole family of harmonics and sidebands, even when more than one family of sidebands is present at the same time. It is also insensitive to transmission path effects, since source and transmission path effects are additive and can be separated in the cepstrum. The concept, underlying theory, and the measurement and analysis involved in using the technique are briefly outlined. Two cases are presented to demonstrate the advantage of the cepstrum technique over spectrum analysis: an LP compressor was chosen to study the transmission path effects, and a marine gearbox having two sets of sideband families was studied to diagnose the problematic sideband and its severity.
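
    The cepstrum described above is the inverse Fourier transform of the log magnitude spectrum; a whole family of harmonics collapses into a single peak at the corresponding quefrency. A self-contained sketch (the signal and period are chosen for illustration, not taken from the LP compressor or gearbox cases):

```python
import numpy as np

def real_cepstrum(x, eps=1e-12):
    """Real cepstrum: inverse FFT of the log magnitude spectrum.
    A peak at quefrency q indicates a family of components spaced fs/q apart."""
    spectrum = np.abs(np.fft.fft(x))
    return np.fft.ifft(np.log(spectrum + eps)).real

# An impulse train with a 20-sample period produces a comb of harmonics in
# the spectrum; the cepstrum collapses that family into peaks at multiples
# of quefrency 20.
N, period = 400, 20
x = np.zeros(N)
x[::period] = 1.0
c = real_cepstrum(x)
q = np.argmax(c[1 : N // 2]) + 1   # ignore the quefrency-0 term
print(q)  # peak at a multiple of the 20-sample period
```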

  20. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and preferred uses.

  1. RAPD analysis: a rapid technique for differentiation of spoilage yeasts

    NARCIS (Netherlands)

    Baleiras Couto, M.M.; Vossen, J.M.B.M. van der; Hofstra, H.; Huis in 't Veld, J.H.J.

    1994-01-01

    Techniques for the identification of the spoilage yeasts Saccharomyces cerevisiae and members of the Zygosaccharomyces genus from food and beverage sources were evaluated. The use of identification systems based on physiological characteristics often resulted in incomplete identification or misidentification.

  2. Reticle defect sizing of optical proximity correction defects using SEM imaging and image analysis techniques

    Science.gov (United States)

    Zurbrick, Larry S.; Wang, Lantian; Konicek, Paul; Laird, Ellen R.

    2000-07-01

    Sizing of programmed defects on optical proximity correction (OPC) features is addressed using high resolution scanning electron microscope (SEM) images and image analysis techniques. A comparison and analysis of different sizing methods are made. This paper addresses the issues of OPC defect definition and discusses the experimental measurement results obtained by SEM in combination with image analysis techniques.

  3. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Full Text Available Abstract Background The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of obtained peptide-spectrum matches (PSMs) also needs to be evaluated by algorithms, as a manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. In addition, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. 
Conclusion Our approach not only enhances the computational performance, and

  4. A quantitative analysis of rotary, ultrasonic and manual techniques to treat proximally flattened root canals

    Directory of Open Access Journals (Sweden)

    Fabiana Soares Grecca

    2007-04-01

    Full Text Available OBJECTIVE: The efficiency of rotary, manual and ultrasonic root canal instrumentation techniques was investigated in proximally flattened root canals. MATERIAL AND METHODS: Forty human mandibular left and right central incisors, lateral incisors and premolars were used. The pulp tissue was removed and the root canals were filled with red dye. Teeth were instrumented using three techniques: (i) K3 and ProTaper rotary systems; (ii) an ultrasonic crown-down technique; and (iii) a progressive manual technique. Roots were bisected longitudinally in a buccolingual direction. The instrumented canal walls were digitally captured and the images obtained were analyzed using the Sigma Scan software. Canal walls were evaluated for total canal wall area versus non-instrumented area on which dye remained. RESULTS: No statistically significant difference was found between the instrumentation techniques studied (p<0.05). CONCLUSION: The findings of this study showed that no instrumentation technique was 100% efficient in removing the dye.
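
    The area measurement described (total canal wall area versus non-instrumented area on which dye remained) reduces to a pixel-count ratio on binary masks. A toy sketch with an invented mask, not the Sigma Scan workflow itself:

```python
import numpy as np

def uninstrumented_percent(canal_mask, dye_mask):
    """Percentage of the canal wall still covered by dye, i.e. area the
    instrument never touched. Both inputs are boolean images:
    canal_mask marks the canal wall, dye_mask the remaining dye."""
    total = canal_mask.sum()
    remaining = (canal_mask & dye_mask).sum()
    return 100.0 * remaining / total

# Toy 4x5 example: 10 wall pixels, 2 still carrying dye → 20% untouched
canal = np.array([[1, 1, 1, 1, 1],
                  [1, 1, 1, 1, 1],
                  [0, 0, 0, 0, 0],
                  [0, 0, 0, 0, 0]], dtype=bool)
dye = np.zeros_like(canal)
dye[0, 0] = dye[0, 1] = True
print(uninstrumented_percent(canal, dye))  # → 20.0
```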

  5. Analysis Of Machine Learning Techniques By Using Blogger Data

    Directory of Open Access Journals (Sweden)

    Gowsalya.R,

    2014-04-01

    Full Text Available Blogs are a recent, fast-progressing medium that depends on information systems and technological advancement. In developing countries the mass media is not well developed and government schemes are framed in governmental terms, so blogs provide a channel for sharing knowledge and ideas. This article performs simulations on the obtained information, 100 instances of bloggers, using the Weka 3.6 tool, applying many machine learning algorithms and analyzing the values of accuracy, precision, recall and F-measure in order to anticipate users' future tendency toward blogging and its use in strategic areas.

  6. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  7. Comparative study of Authorship Identification Techniques for Cyber Forensics Analysis

    Directory of Open Access Journals (Sweden)

    Smita Nirkhi

    2013-06-01

    Full Text Available Authorship identification techniques are used to identify the most likely author of online messages from a group of potential suspects and to find evidence supporting that conclusion. Cybercriminals misuse online communication to send blackmail or spam email and then attempt to hide their true identities to avoid detection. Authorship identification of online messages is a contemporary research issue for identity tracing in cyber forensics. It is a highly interdisciplinary area, drawing on machine learning, information retrieval and natural language processing. In this paper, a study of recent techniques and automated approaches to attributing authorship of online messages is presented. The focus of this review is to summarize the existing authorship identification techniques used in the literature to identify authors of online messages. It also discusses evaluation criteria and parameters for authorship attribution studies and lists open questions that will attract future work in this area.

  8. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique for analyzing fire risk from a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and thereby to effectively construct a fuzzy relationship between fire and its surroundings. Using winter fire data from Shanghai, we show how to use the technique to analyze fire risk.
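
    A minimal sketch of normal information diffusion on a small sample: each crisp observation is spread over a set of monitoring points with a Gaussian kernel, and the accumulated information is normalized into a probability estimate. The fire counts and kernel width below are invented for illustration, not the Shanghai data:

```python
import numpy as np

def information_diffusion(sample, grid, h):
    """Normal information diffusion: spread each crisp observation over the
    monitoring points `grid` with a Gaussian kernel of width h, normalising
    each observation's contribution so it carries unit information."""
    q = np.exp(-(sample[:, None] - grid[None, :]) ** 2 / (2 * h ** 2))
    q /= q.sum(axis=1, keepdims=True)      # each observation sums to 1
    return q.sum(axis=0)                   # information gained at each point

# Hypothetical small sample: yearly counts of winter fires in one district.
sample = np.array([3.0, 4.0, 4.0, 6.0, 9.0])
grid = np.arange(0.0, 13.0)               # monitoring points: 0..12 fires
info = information_diffusion(sample, grid, h=1.2)
p = info / info.sum()                     # estimated probability distribution
risk = p[grid >= 6].sum()                 # P(at least 6 fires in a winter)
print(round(risk, 3))
```

    With only five observations a histogram would be nearly useless; diffusion fills the gaps between observed values, which is the point of the technique for small samples.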

  9. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behavior have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behavior; however, data mining methods have disadvantages as well as advantages, so it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to show how suitable data mining methods can improve on conventional methods. Moreover, in an experiment, association rule mining is employed to mine rules for trusted customers using sales data from the supermarket industry.

  10. New techniques for positron emission tomography in the study of human neurological disorders. Progress report, June 1990--June 1993

    Energy Technology Data Exchange (ETDEWEB)

    Kuhl, D.E.

    1993-06-01

    This progress report describes accomplishments of four programs: (1) Faster, simpler processing of positron-emitting precursors: new physicochemical approaches; (2) Novel solid phase reagents and methods to improve radiosynthesis and isotope production; (3) Quantitative evaluation of the extraction of information from PET images; and (4) Optimization of tracer kinetic methods for radioligand studies in PET.

  11. Progress and application of DNA barcoding technique in plants

    Institute of Scientific and Technical Information of China (English)

    刘宇婧; 刘越; 黄耀江; 龙春林

    2011-01-01

    Based on a summary and analysis of the development of DNA barcoding, this review describes the research progress of DNA barcoding techniques in plants, their workflow and analysis methods, the factors influencing identification accuracy, and their current applications and controversies in plant taxonomic study, and outlines future development trends and application prospects. Research examples indicate that combining plant DNA barcoding with traditional botanical knowledge can serve as one research approach for ethnobotany. Commonly used plant DNA barcodes follow two modes, single fragments and multi-fragment combinations, each with its own advantages and disadvantages. Common DNA sequences include matK, trnH-psbA, rbcL and ITS, but all have certain limitations, so different DNA barcoding standards should be selected for different application aims. Factors influencing identification accuracy include the type and number of species, the method used to construct the phylogenetic tree, hybridization and gene introgression, and variance in species origin times and molecular evolution rates. The current focus of plant DNA barcoding is how to select suitable DNA fragments and evaluate their value.

  12. Tape Stripping Technique for Stratum Corneum Protein Analysis

    DEFF Research Database (Denmark)

    Clausen, Maja-Lisa; Slotved, H-C; Krogfelt, Karen A

    2016-01-01

    The aim of this study was to investigate the amount of protein in the stratum corneum of atopic dermatitis (AD) patients and healthy controls using the tape stripping technique, and furthermore to compare two different methods for protein assessment. Tape stripping was performed in AD patients and healthy

  13. Analysis on Poe's Unique Techniques to Achieve Aestheticism

    Institute of Scientific and Technical Information of China (English)

    孔佳鸣

    2008-01-01

    Edgar Allan Poe was one of the most important poets in American poetic history for his unremitting pursuit of 'ideal beauty'. This essay shows, through various examples chosen from his poems, that his aestheticism is evident in his versification techniques. His poetic theory and practice set an enduring example for the development of English poetry.

  14. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente;

    2005-01-01

    Ultra performance liquid chromatography is a recent development of the HPLC separation technique that allows separations on column materials with particle diameters of 1.7 μm at pressures up to 10^8 Pa. This increases the efficiency, the resolution and the speed of the separation. Four aque

  15. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is calibrated against measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique and the IR Contrast feature imaging application, which are based on the models provided in this paper.
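
    The half-max width estimation mentioned above can be sketched as follows: find where a single-peaked contrast profile crosses half of its peak value and interpolate the two crossings. The Gaussian profile here is purely illustrative, not flash-thermography data:

```python
import numpy as np

def half_max_width(x, y):
    """Half-max edge detection: width of an indication measured between the
    points where the contrast profile crosses half of its peak value.
    Uses linear interpolation between samples; assumes a single-peaked
    profile that falls below half-maximum at both ends."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]
    # interpolate the left and right half-maximum crossings
    left = x[i - 1] + (half - y[i - 1]) / (y[i] - y[i - 1]) * (x[i] - x[i - 1])
    right = x[j] + (half - y[j]) / (y[j + 1] - y[j]) * (x[j + 1] - x[j])
    return right - left

# Gaussian profile: the FWHM should be sigma * 2*sqrt(2*ln 2) ≈ 2.355*sigma
x = np.linspace(-10, 10, 2001)
y = np.exp(-x**2 / (2 * 2.0**2))          # sigma = 2 → FWHM ≈ 4.71
print(round(half_max_width(x, y), 2))     # → 4.71
```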

  16. Rates of progression in diabetic retinopathy during different time periods: a systematic review and meta-analysis

    DEFF Research Database (Denmark)

    Wong, Tien Y; Mwamburi, Mkaya; Klein, Ronald;

    2009-01-01

    This meta-analysis reviews rates of progression of diabetic retinopathy to proliferative diabetic retinopathy (PDR) and/or severe visual loss (SVL) and temporal trends.

  17. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    [Search-result snippet only; recoverable items: Table 3, HAZOP guide words for software or system interface analysis; Table 4, example system-of-systems architecture table; HAZOP planning steps (establish HAZOP analysis goals, definitions, worksheets, schedule and process; divide the system for analysis); an example deviation, "Subtle Incorrect" (the output's value is wrong but cannot be detected).]

  18. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  19. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  20. Recent Progresses in Analysis of Tongue Manifestation for Traditional Chinese Medicine

    Institute of Scientific and Technical Information of China (English)

    WEI Bao-guo; CAI Yi-heng; ZHANG Xin-feng; SHEN Lan-sun

    2005-01-01

    Tongue diagnosis is one of the most precious and widely used diagnostic methods in Traditional Chinese Medicine (TCM). However, due to its subjective, qualitative and experience-dependent nature, studies on tongue characterization have received wide emphasis. This paper surveys recent progress in the analysis of tongue manifestation. The new developments include cross-network and cross-media color reproduction of tongue images, knowledge-based automatic segmentation of the tongue body, automatic analysis of the curdiness and griminess of the tongue fur, and automatic analysis of the plumpness, wryness and dot-thorn of the tongue body. Clinical experiments verify the validity of these new methods.

  1. User-defined Material Model for Thermo-mechanical Progressive Failure Analysis

    Science.gov (United States)

    Knight, Norman F., Jr.

    2008-01-01

    Previously a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.
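
    The thermal strain calculation mentioned above can be sketched as a secant-CTE evaluation with temperature-dependent properties. The property table below is hypothetical, and this is an illustration of the kind of computation involved, not the report's actual user-defined material model:

```python
import numpy as np

def thermal_strain(T, T_stress_free, temps, alphas):
    """Thermal strain with a temperature-dependent expansion coefficient.
    alpha(T) is linearly interpolated from the tabulated (temps, alphas)
    pairs, then applied over the offset from the stress-free temperature."""
    alpha = np.interp(T, temps, alphas)
    return alpha * (T - T_stress_free)

# Hypothetical CTE table for one material direction of an orthotropic ply (1/degC)
temps  = np.array([20.0, 100.0, 200.0])
alphas = np.array([2.0e-6, 2.4e-6, 3.0e-6])
eps = thermal_strain(120.0, 20.0, temps, alphas)
print(round(eps, 8))  # → 0.000252
```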

  2. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's disease, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences in cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across-class similarity. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems-level.
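
    The core hypothesis (greater within-class than across-class similarity on a pathway's SNPs) can be illustrated with a toy statistic. This is an illustrative sketch of the idea only, not the published PoDA statistic, and the genotype matrix is invented:

```python
import numpy as np

def distinction_score(G, labels):
    """Toy within-vs-across-class similarity score: for each sample, compare
    its mean Hamming distance to cases versus controls (leaving itself out).
    A positive score means the sample sits closer to the cases.
    Illustrative only; not the published PoDA definition."""
    n = G.shape[0]
    D = (G[:, None, :] != G[None, :, :]).sum(axis=2).astype(float)
    scores = np.empty(n)
    for i in range(n):
        mask = np.arange(n) != i
        d_case = D[i, mask & (labels == 1)].mean()
        d_ctrl = D[i, mask & (labels == 0)].mean()
        scores[i] = d_ctrl - d_case
    return scores

# Hypothetical pathway with 4 SNPs (genotypes coded 0/1/2); cases share a pattern.
G = np.array([[2, 2, 1, 0], [2, 1, 1, 0], [2, 2, 1, 1],   # cases
              [0, 0, 0, 2], [0, 1, 0, 2], [1, 0, 0, 2]])  # controls
labels = np.array([1, 1, 1, 0, 0, 0])
s = distinction_score(G, labels)
print((s[labels == 1] > 0).all() and (s[labels == 0] < 0).all())  # → True
```

    When a pathway carries no disease signal, the scores scatter around zero; a consistently signed separation like the one above is what flags a pathway of distinction.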

  3. Assessing Progress towards Public Health, Human Rights, and International Development Goals Using Frontier Analysis.

    Science.gov (United States)

    Luh, Jeanne; Cronk, Ryan; Bartram, Jamie

    2016-01-01

    Indicators to measure progress towards achieving public health, human rights, and international development targets, such as 100% access to improved drinking water or zero maternal mortality ratio, generally focus on status (i.e., level of attainment or coverage) or trends in status (i.e., rates of change). However, these indicators do not account for different levels of development that countries experience, thus making it difficult to compare progress between countries. We describe a recently developed new use of frontier analysis and apply this method to calculate country performance indices in three areas: maternal mortality ratio, poverty headcount ratio, and primary school completion rate. Frontier analysis is used to identify the maximum achievable rates of change, defined by the historically best-performing countries, as a function of coverage level. Performance indices are calculated by comparing a country's rate of change against the maximum achievable rate at the same coverage level. A country's performance can be positive or negative, corresponding to progression or regression, respectively. The calculated performance indices allow countries to be compared against each other regardless of whether they have only begun to make progress or whether they have almost achieved the target. This paper is the first to use frontier analysis to determine the maximum achievable rates as a function of coverage level and to calculate performance indices for public health, human rights, and international development indicators. The method can be applied to multiple fields and settings, for example health targets such as cessation in smoking or specific vaccine immunizations, and offers both a new approach to analyze existing data and a new data source for consideration when assessing progress achieved.
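
    The performance index described above (a country's observed rate of change divided by the maximum achievable rate at the same coverage level) can be sketched as follows. The frontier values are illustrative placeholders, not the ones fitted from the historically best-performing countries in the paper:

```python
def performance_index(coverage, observed_rate, frontier):
    """Performance index: observed annual rate of change divided by the
    maximum achievable rate at the same coverage level. The frontier is a
    piecewise-constant table of (coverage upper bound, max rate %/yr)."""
    for threshold, max_rate in frontier:
        if coverage <= threshold:
            return observed_rate / max_rate
    return observed_rate / frontier[-1][1]

# Illustrative frontier: near-universal coverage leaves little headroom,
# so the same absolute progress scores higher at 95% coverage than at 50%.
frontier = [(0.50, 3.0), (0.75, 2.0), (0.90, 1.0), (1.00, 0.5)]
print(performance_index(0.50, 1.5, frontier))  # → 0.5
print(performance_index(0.95, 0.4, frontier))  # → 0.8
```

    Dividing by the coverage-dependent frontier is exactly what makes countries at different development levels comparable, which a raw rate of change does not.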

  4. A COMPARISON OF SOME STATISTICAL TECHNIQUES FOR ROAD ACCIDENT ANALYSIS

    NARCIS (Netherlands)

    OPPE, S INST ROAD SAFETY RES, SWOV

    1992-01-01

    At the TRRL/SWOV Workshop on Accident Analysis Methodology, held in Amsterdam in 1988, the need to establish a methodology for the analysis of road accidents was firmly stated by all participants. Data from different countries cannot be compared because there is no agreement on research methodology,

  5. A Survey of Techniques for Security Architecture Analysis

    Science.gov (United States)

    2003-05-01

    [Search-result snippet only; recoverable items: an abbreviation list (FPG: Failure Propagation Graph; FTA: Fault Tree Analysis; HAZOP: Hazard and Operability studies; IATF: Information Assurance Technical Framework); a fragment describing logical places within an information system where people perform their work by means of software acting on their behalf; and a fragment describing the resources used to support the DIE (including, for example, hardware, software, communication networks, applications and qualified staff).]

  6. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

    Full Text Available Sentiment analysis is concerned with the analysis of emotions and opinions in text, and is also referred to as opinion mining. It identifies and justifies the sentiment of a person with respect to a given source of content. Social media contain a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this mass-generated data is very useful for gauging the opinion of the masses. Twitter sentiment analysis is tricky compared to broad sentiment analysis because of slang words, misspellings and repeated characters; moreover, each tweet is limited to 140 characters, so it is very important to identify the correct sentiment of each word. In our project we propose a highly accurate model for sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of feature vectors and classifiers such as Support Vector Machine and Naïve Bayes, we classify these tweets as positive, negative or neutral to give the sentiment of each tweet.
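
    A minimal multinomial Naive Bayes classifier of the kind the abstract describes, trained on invented movie-review tweets (not the authors' feature vectors or data):

```python
from collections import Counter
import math

def train_nb(docs):
    """Multinomial Naive Bayes: count per-class word frequencies and priors
    from (tokens, label) training pairs."""
    counts, totals, priors, vocab = {}, Counter(), Counter(), set()
    for tokens, label in docs:
        priors[label] += 1
        counts.setdefault(label, Counter()).update(tokens)
        totals[label] += len(tokens)
        vocab.update(tokens)
    return counts, totals, priors, vocab

def classify(tokens, model):
    """Pick the label maximising log prior + sum of log word likelihoods,
    with Laplace smoothing over the vocabulary."""
    counts, totals, priors, vocab = model
    n = sum(priors.values())
    best, best_lp = None, -math.inf
    for label in priors:
        lp = math.log(priors[label] / n)
        for t in tokens:
            lp += math.log((counts[label][t] + 1) / (totals[label] + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

train = [(["awesome", "movie", "loved", "it"], "positive"),
         (["great", "acting", "awesome", "plot"], "positive"),
         (["boring", "movie", "waste", "of", "time"], "negative"),
         (["terrible", "plot", "boring", "acting"], "negative")]
model = train_nb(train)
print(classify(["awesome", "plot"], model))   # → positive
print(classify(["boring", "waste"], model))   # → negative
```

    A real tweet classifier would add the preprocessing the abstract alludes to (normalising repeated characters, handling slang and misspellings) before tokenisation.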

  7. Facilitating the analysis of immunological data with visual analytic techniques.

    Science.gov (United States)

    Shih, David C; Ho, Kevin C; Melnick, Kyle M; Rensink, Ronald A; Kollmann, Tobias R; Fortuno, Edgardo S

    2011-01-02

    Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual displays. We demonstrate the utility and flexibility of a VA approach in the analysis of biological datasets; examples of such datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores analysis power to the hands of the analyst by allowing real-time data exploration. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.

  8. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software is comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we review the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes best practices and shows how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. Finally, the study highlights the importance of security requirements: although they are usually treated as non-functional requirements, they are fundamental to secure software development.

  9. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent, present naturally in groundwater due to some minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may occur from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water using high-tech instruments such as the atomic absorption spectrometer (hydride generation). Because arsenic concentrations as low as 1 ppb cannot be determined easily with a simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to arsenic concentrations of 1 ppb.
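    A calibration curve is the usual way to turn such spectrophotometric readings into concentrations. The sketch below fits a Beer-Lambert-style line to arsenic standards and inverts it for an unknown sample; the absorbance values and the `concentration` helper are purely illustrative, not the paper's data:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Hypothetical calibration standards (ppb) and absorbance readings.
standards_ppb = [0.0, 1.0, 2.0, 5.0, 10.0]
absorbance = [0.002, 0.013, 0.024, 0.057, 0.112]

slope, intercept = linear_fit(standards_ppb, absorbance)

def concentration(abs_reading):
    """Invert the calibration line to estimate arsenic concentration (ppb)."""
    return (abs_reading - intercept) / slope
```

    In practice the curve would be re-established with each batch of standards, since reagent and instrument drift change the slope.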

  10. EMPIRICAL ANALYSIS OF DATA MINING TECHNIQUES FOR SOCIAL NETWORK WEBSITES

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2015-11-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques may not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the amount of data available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  11. Empirical Analysis of Data Mining Techniques for Social Network Websites

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2014-02-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of items of unprocessed raw data, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques may not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the amount of data available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  12. Magnetic resonance elastography (MRE) in cancer: Technique, analysis, and applications

    Science.gov (United States)

    Pepin, Kay M.; Ehman, Richard L.; McGee, Kiaran P.

    2015-01-01

    Tissue mechanical properties are significantly altered with the development of cancer. Magnetic resonance elastography (MRE) is a noninvasive technique capable of quantifying tissue mechanical properties in vivo. This review describes the basic principles of MRE and introduces some of the many promising MRE methods that have been developed for the detection and characterization of cancer, evaluation of response to therapy, and investigation of the underlying mechanical mechanisms associated with malignancy. PMID:26592944

  13. Analysis of Acoustic Emission Signals using WaveletTransformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Full Text Available Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to detect the occurrence of any crack-growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to these phenomena. With various filtering/thresholding techniques, it was found that the original signals were filtered out along with the noise. The wavelet transformation technique is found to be more appropriate for analysing AE signals under such situations, and is used to de-noise the AE data. The de-noised signal is then classified to identify a signature based on the type of phenomenon. Defence Science Journal, 2008, 58(4), pp. 559-564, DOI: http://dx.doi.org/10.14429/dsj.58.1677
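    Wavelet de-noising of this kind typically decomposes the signal, shrinks the small detail coefficients that carry broadband noise, and reconstructs. A minimal single-level Haar sketch with soft thresholding (the paper does not specify which wavelet or threshold rule was used, so both are assumptions here):

```python
import math

def haar_dwt(signal):
    """One-level Haar transform: approximation and detail coefficients."""
    a = [(signal[2*i] + signal[2*i+1]) / math.sqrt(2) for i in range(len(signal)//2)]
    d = [(signal[2*i] - signal[2*i+1]) / math.sqrt(2) for i in range(len(signal)//2)]
    return a, d

def haar_idwt(a, d):
    """Invert the one-level Haar transform."""
    out = []
    for ai, di in zip(a, d):
        out.append((ai + di) / math.sqrt(2))
        out.append((ai - di) / math.sqrt(2))
    return out

def soft_threshold(coeffs, t):
    """Shrink coefficients toward zero; small (noise-like) ones vanish."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def denoise(signal, t):
    """De-noise by thresholding only the detail (high-frequency) band."""
    a, d = haar_dwt(signal)
    return haar_idwt(a, soft_threshold(d, t))
```

    A multi-level decomposition (as used in practice for AE data) simply repeats the same split on the approximation coefficients.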

  14. Infrared Spectroscopy of Explosives Residues: Measurement Techniques and Spectral Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Mark C.; Bernacki, Bruce E.

    2015-03-11

    Infrared laser spectroscopy of explosives is a promising technique for standoff and non-contact detection applications. However, the interpretation of spectra obtained in typical standoff measurement configurations presents numerous challenges. Understanding the variability in observed spectra from explosives residues and particles is crucial for design and implementation of detection algorithms with high detection confidence and low false alarm probability. We discuss a series of infrared spectroscopic techniques applied toward measuring and interpreting the reflectance spectra obtained from explosives particles and residues. These techniques utilize the high spectral radiance, broad tuning range, rapid wavelength tuning, high scan reproducibility, and low noise of an external cavity quantum cascade laser (ECQCL) system developed at Pacific Northwest National Laboratory. The ECQCL source permits measurements in configurations which would be either impractical or overly time-consuming with broadband, incoherent infrared sources, and enables a combination of rapid measurement speed and high detection sensitivity. The spectroscopic methods employed include standoff hyperspectral reflectance imaging, quantitative measurements of diffuse reflectance spectra, reflection-absorption infrared spectroscopy, microscopic imaging and spectroscopy, and nano-scale imaging and spectroscopy. Measurements of explosives particles and residues reveal important factors affecting observed reflectance spectra, including measurement geometry, substrate on which the explosives are deposited, and morphological effects such as particle shape, size, orientation, and crystal structure.

  15. Quality Analysis of Pear Fruit of Shah Miveh variety Using Nondestructive Ultrasonic Technique

    Directory of Open Access Journals (Sweden)

    R Meamar Dastjerdi

    2014-09-01

    Full Text Available The development of ultrasound techniques for evaluating the internal quality of fruits has not progressed as fast as for processed foods. In this research, for quality assessment of pear fruit (Shah Miveh variety), an ultrasonic measurement system was constructed to transmit and receive ultrasonic waves. The apparatus included a pulser-receiver, a pair of 75 kHz ultrasonic transducers with exponential horns, and a computer system for data acquisition and analysis. Several mechanical and chemical properties, including firmness, TSS, acidity, elastic modulus, pH and total dry matter, were measured for destructive quality assessment. The velocity and attenuation of ultrasonic waves were also measured for the nondestructive tests. The fruit quality levels for the experiment were unripe, ripe and overripe. The results showed that firmness was the best parameter for measuring fruit quality, as it decreased significantly with ripeness. The effect of ripeness on the velocity and attenuation of ultrasonic waves was also significant. Investigation showed a positive linear relationship between fruit firmness and wave velocity (R² = 0.81). Furthermore, the relationship between fruit firmness and attenuation was exponential, and wave attenuation decreased with increasing fruit firmness (R² = 0.895). The relationship between ultrasonic properties and the fruit's modulus of elasticity showed that wave velocity increased and attenuation decreased with increasing elasticity. It can be concluded that an ultrasonic instrument equipped with exponential horns can effectively be used for pear quality assessment based on measurement of wave velocity and attenuation.
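    The reported linear firmness-velocity relationship (R² = 0.81) is an ordinary least-squares fit. A small sketch of such a fit with its coefficient of determination, using invented sample data rather than the paper's measurements:

```python
def linfit_r2(xs, ys):
    """Least-squares line y = a*x + b plus the coefficient of determination R^2."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Hypothetical wave velocities (m/s) and firmness values (N) for illustration.
velocity = [300.0, 320.0, 345.0, 360.0, 380.0]
firmness = [12.0, 18.0, 25.0, 29.0, 36.0]
slope, offset, r2 = linfit_r2(velocity, firmness)
```

    An R² near the paper's 0.81 would indicate that most, but not all, of the firmness variation is explained by velocity; the residual scatter is what motivates also measuring attenuation.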

  16. Progress on the Clinical Application of the Intraosseous Infusion Technique

    Institute of Scientific and Technical Information of China (English)

    王宗华; 陈陵; 任辉; 李向红; 刘迎春

    2012-01-01

    This article reviews the application and current state of development of the intraosseous infusion technique in China and abroad, as well as its operation procedure and nursing considerations.

  17. Automated image analysis techniques for cardiovascular magnetic resonance imaging

    NARCIS (Netherlands)

    Geest, Robertus Jacobus van der

    2011-01-01

    The introductory chapter provides an overview of various aspects related to quantitative analysis of cardiovascular MR (CMR) imaging studies. Subsequently, the thesis describes several automated methods for quantitative assessment of left ventricular function from CMR imaging studies. Several novel

  18. Plasma Exchange for Renal Vasculitis and Idiopathic Rapidly Progressive Glomerulonephritis: A Meta-analysis

    DEFF Research Database (Denmark)

    Walsh, Michael; Catapano, Fausta; Szpirt, Wladimir;

    2010-01-01

    BACKGROUND: Plasma exchange may be effective adjunctive treatment for renal vasculitis. We performed a systematic review and meta-analysis of randomized controlled trials of plasma exchange for renal vasculitis. STUDY DESIGN: Systematic review and meta-analysis of articles identified from electronic databases, bibliographies, and studies identified by experts. Data were abstracted in parallel by 2 reviewers. SETTING & POPULATION: Adults with idiopathic renal vasculitis or rapidly progressive glomerulonephritis. SELECTION CRITERIA FOR STUDIES: Randomized controlled trials that compared standard care with standard care plus adjuvant plasma exchange in adult patients with either renal vasculitis or idiopathic rapidly progressive glomerulonephritis. INTERVENTION: Adjuvant plasma exchange. OUTCOME: Composite of end-stage renal disease or death. RESULTS: We identified 9 trials including...
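    A common way to pool such binary-outcome trials is fixed-effect inverse-variance weighting of log risk ratios. The abstract does not state the exact model used, so the following is a generic sketch with hypothetical trial counts, not the review's data:

```python
import math

def pooled_log_rr(trials):
    """Fixed-effect inverse-variance pooling of log risk ratios.

    trials: list of (events_treatment, n_treatment, events_control, n_control).
    Returns (pooled log risk ratio, its standard error).
    """
    num = den = 0.0
    for et, nt, ec, nc in trials:
        log_rr = math.log((et / nt) / (ec / nc))
        var = 1/et - 1/nt + 1/ec - 1/nc   # standard variance of a log RR
        w = 1.0 / var                      # inverse-variance weight
        num += w * log_rr
        den += w
    return num / den, math.sqrt(1.0 / den)

# Hypothetical example: two small trials of the composite ESRD-or-death outcome.
estimate, se = pooled_log_rr([(10, 100, 20, 100), (8, 80, 14, 80)])
```

    A random-effects model would additionally widen `se` by the between-trial variance; for heterogeneous vasculitis trials that choice matters.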

  19. Nonlinear analysis of the progressive collapse of reinforced concrete plane frames using a multilayered beam formulation

    Directory of Open Access Journals (Sweden)

    C. E. M. Oliveira

    Full Text Available This work investigates the response of two reinforced concrete (RC) plane frames after the loss of a column, and their potential resistance to progressive collapse. Nonlinear dynamic analysis is performed using a multilayered Euler/Bernoulli beam element, including elasto-viscoplastic effects. The material nonlinearity is represented using one-dimensional constitutive laws in the material layers, while geometrical nonlinearities are incorporated within a corotational beam formulation. The frames were designed in accordance with the minimum requirements proposed by the reinforced concrete design/building codes of Europe (fib [1-2], Eurocode 2 [3]) and Brazil (NBR 6118 [4]). The load combinations considered for progressive collapse analysis follow the prescriptions of DoD [5]. The work verifies whether the minimum requirements of the considered codes are sufficient for ensuring structural safety and robustness, and also points out the major differences in progressive collapse potential between the corresponding designed structures.

  20. Automated Techniques for Rapid Analysis of Momentum Exchange Devices

    Science.gov (United States)

    2013-12-01

    Contiguousness. At this point, it is necessary to introduce the concept of contiguousness. In this thesis, a state space analysis representation is... The concept of contiguousness was established to ensure that the results of the analysis would allow the CMGs to reach every state in the defined... forces at the attachment points of the RWs and CMGs throughout a spacecraft maneuver. Current pedagogy on this topic focuses on the transfer of...

  1. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made in recent years in the field of electrophysiological data analysis. Most of the work has been done at Starlab Barcelona S.L., and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) ...

  2. Analysis of the changes in keratoplasty indications and preferred techniques.

    Directory of Open Access Journals (Sweden)

    Stefan J Lang

    Full Text Available Recently, novel techniques introduced to the field of corneal surgery, e.g. Descemet membrane endothelial keratoplasty (DMEK) and corneal crosslinking, have extended the therapeutic options. Additionally, contact lens fitting has developed new alternatives. We herein investigated whether these techniques have affected the volume and spectrum of indications for keratoplasty in both a center more specialized in treating Fuchs' dystrophy (center 1) and a second center more specialized in treating keratoconus (center 2). We retrospectively reviewed the waiting lists at both centers for indication, transplantation technique, and the patients' travel distances to the hospital. We reviewed a total of 3778 procedures. Fuchs' dystrophy increased from 17% (42) to 44% (150) at center 1 and from 13% (27) to 23% (62) at center 2. In center 1, DMEK increased from zero percent in 2010 to 51% in 2013. In center 2, DMEK was not performed until 2013. The percentage of patients with keratoconus decreased slightly at center 1, from 15% (36) in 2009 to 12% (40) in 2013; the respective percentages at center 2 were 28% (57) and 19% (51). In both centers, the patients' travel distances increased. The results from center 1 suggest that DMEK might increase the total number of keratoplasties. The increase in travel distance suggests that this cannot be fully attributed to recruiting less advanced patients from the hospital's proximity; the increase is rather due to more referrals from other regions. The decrease in keratoconus patients at both centers is surprising and may be attributed to optimized contact lens fitting or even to the effect of the corneal crosslinking procedure.

  3. Improved Tandem Measurement Techniques for Aerosol Particle Analysis

    Science.gov (United States)

    Rawat, Vivek Kumar

    Non-spherical, chemically inhomogeneous (complex) nanoparticles are encountered in a number of natural and engineered environments, including combustion systems (which produces highly non-spherical aggregates), reactors used in gas-phase materials synthesis of doped or multicomponent materials, and in ambient air. These nanoparticles are often highly diverse in size, composition and shape, and hence require determination of property distribution functions for accurate characterization. This thesis focuses on development of tandem mobility-mass measurement techniques coupled with appropriate data inversion routines to facilitate measurement of two dimensional size-mass distribution functions while correcting for the non-idealities of the instruments. Chapter 1 provides the detailed background and motivation for the studies performed in this thesis. In chapter 2, the development of an inversion routine is described which is employed to determine two dimensional size-mass distribution functions from Differential Mobility Analyzer-Aerosol Particle Mass analyzer tandem measurements. Chapter 3 demonstrates the application of the two dimensional distribution function to compute cumulative mass distribution function and also evaluates the validity of this technique by comparing the calculated total mass concentrations to measured values for a variety of aerosols. In Chapter 4, this tandem measurement technique with the inversion routine is employed to analyze colloidal suspensions. Chapter 5 focuses on application of a transverse modulation ion mobility spectrometer coupled with a mass spectrometer to study the effect of vapor dopants on the mobility shifts of sub 2 nm peptide ion clusters. These mobility shifts are then compared to models based on vapor uptake theories. Finally, in Chapter 6, a conclusion of all the studies performed in this thesis is provided and future avenues of research are discussed.
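    One quantity that tandem DMA-APM measurements make accessible is the particle effective density, conventionally computed from the mobility diameter and measured mass as rho_eff = 6m/(pi d^3). A minimal sketch of that relation (the thesis's actual two-dimensional inversion routine is far more involved):

```python
import math

def effective_density(mass_kg, mobility_diameter_m):
    """Effective density from a DMA-selected mobility diameter and an
    APM-measured particle mass: rho_eff = 6 m / (pi d^3).

    For a compact sphere this equals the material density; for a fractal
    soot aggregate it falls well below it, which is why the quantity is a
    useful shape/morphology indicator.
    """
    return 6.0 * mass_kg / (math.pi * mobility_diameter_m ** 3)
```

    The two-dimensional size-mass distribution described in the abstract generalizes this: instead of one (d, m) pair per particle class, it assigns a number concentration to every (d, m) cell.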

  4. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  5. Combined Technique Analysis of Punic Make-up Materials

    Energy Technology Data Exchange (ETDEWEB)

    Huq,A.; Stephens, P.; Ayed, N.; Binous, H.; Burgio, L.; Clark, R.; Pantos, E.

    2006-01-01

    Ten archaeological Punic make-up samples from Tunisia dating from the 4th to the 1st centuries BC were analyzed by several techniques including Raman microscopy and synchrotron X-ray diffraction in order to determine their compositions. Eight samples were red and found to contain either quartz and cinnabar or quartz and haematite. The remaining two samples were pink, the main diffracting phase in them being quartz. Examination of these two samples by optical microscopy and by illumination under a UV lamp suggest that the pink dye is madder. These findings reveal the identities of the materials used by Carthaginians for cosmetic and/or ritual make-up purposes.

  6. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...

  7. Comparative Analysis of Data Mining Techniques for Malaysian Rainfall Prediction

    Directory of Open Access Journals (Sweden)

    Suhaila Zainudin

    2016-12-01

    Full Text Available Climate change prediction analyses the behaviour of weather over a specific time. Rainfall forecasting is a climate prediction task in which specific features such as humidity and wind are used to predict rainfall at specific locations. Rainfall prediction can be achieved using the classification task in data mining. Different techniques lead to different performance depending on the rainfall data representation, including representations for long-term (monthly) patterns and short-term (daily) patterns. Selecting an appropriate technique for a specific duration of rainfall is a challenging task. This study analyses multiple classifiers, namely Naïve Bayes, Support Vector Machine, Decision Tree, Neural Network and Random Forest, for rainfall prediction using Malaysian data. The dataset was collected from multiple stations in Selangor, Malaysia. Several pre-processing tasks were applied in order to resolve missing values and eliminate noise. The experimental results show that with small training data (10% of 1581 instances), Random Forest correctly classified 1043 instances. This is the strength of an ensemble of trees in Random Forest, where a group of classifiers can jointly beat a single classifier.
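    The closing claim, that a group of classifiers can jointly beat a single classifier, can be illustrated with a toy majority-vote simulation: independent weak voters, each correct with probability 0.6, outperform any one voter. This is a hedged illustration of the ensemble principle behind Random Forest, not the paper's rainfall experiment:

```python
import random

random.seed(0)

def weak_vote(p_correct=0.6):
    """A simulated weak classifier: 1 if it predicts the true label, else 0."""
    return 1 if random.random() < p_correct else 0

def majority(n_voters=25, p_correct=0.6):
    """Majority vote of n independent weak classifiers."""
    votes = sum(weak_vote(p_correct) for _ in range(n_voters))
    return 1 if votes > n_voters / 2 else 0

trials = 2000
single_acc = sum(weak_vote() for _ in range(trials)) / trials
ensemble_acc = sum(majority() for _ in range(trials)) / trials
```

    Random Forest adds bootstrap sampling and random feature selection on top of this voting scheme, precisely to keep the trees' errors close to independent.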

  8. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume high amounts of electrical power, and the dangers involved have thus necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper were developed in the process industries, where they have been shown to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  9. Microscopy Techniques for Analysis of Nickel Metal Hydride Batteries Constituents.

    Science.gov (United States)

    Carpenter, Graham J C; Wronski, Zbigniew

    2015-12-01

    With the need for improvements in the performance of rechargeable batteries has come the necessity to better characterize cell electrodes and their component materials. Electron microscopy has been shown to reveal many important features of microstructure that are becoming increasingly important for understanding the behavior of the components during the many charge/discharge cycles required in modern applications. The aim of this paper is to present an overview of how the full suite of techniques available using transmission electron microscopy (TEM) and scanning transmission electron microscopy was applied to the case of materials for the positive electrode in nickel metal hydride rechargeable battery electrodes. Embedding and sectioning of battery-grade powders with an ultramicrotome was used to produce specimens that could be readily characterized by TEM. Complete electrodes were embedded after drying, and also after dehydration from the original wet state, for examination by optical microscopy and using focused ion beam techniques. Results of these studies are summarized to illustrate the significance of the microstructural information obtained.

  10. Research Progress of Hyphenated Chromatographic Techniques

    Institute of Scientific and Technical Information of China (English)

    杨洁; 索习东

    2012-01-01

    This article summarizes the development of hyphenated chromatographic techniques and their applications and recent advances in various fields of research. It also indicates that, with the continuous development of hyphenated chromatographic techniques, new types of chromatographic techniques will have greater room for development and broader application prospects in modern research.

  11. Techniques of EMG signal analysis: detection, processing, classification and applications

    Science.gov (United States)

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, evolvable hardware chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis, providing efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
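    Typical time-domain EMG processing starts from a few window features such as mean absolute value, root mean square, and zero crossings; classifiers for prosthetic control are then trained on these features. A minimal sketch (these feature names are standard in the EMG literature, but the exact feature set any given method uses varies):

```python
import math

def emg_features(window):
    """Common time-domain features for one window of EMG samples."""
    n = len(window)
    mav = sum(abs(x) for x in window) / n            # mean absolute value
    rms = math.sqrt(sum(x * x for x in window) / n)  # root mean square
    # Zero crossings: count sign changes between consecutive samples.
    zc = sum(1 for a, b in zip(window, window[1:]) if a * b < 0)
    return {"MAV": mav, "RMS": rms, "ZC": zc}
```

    In a full pipeline the raw signal would first be band-pass filtered and segmented into overlapping windows before these features are extracted.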

  12. Finite Element Modeling Techniques for Analysis of VIIP

    Science.gov (United States)

    Feola, Andrew J.; Raykin, J.; Gleason, R.; Mulugeta, Lealem; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.; Ethier, C. Ross

    2015-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build a FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP.

  13. Recent Progress in the Application of UV-Vis Spectrophotometric Techniques

    Institute of Scientific and Technical Information of China (English)

    王海军; 宁新霞

    2012-01-01

    This review presents the recent progress in the application of UV-Vis spectrophotometric techniques, covering mainly the last several years and relating especially to instrument components, the determination of multi-component systems, the application of new technologies, and hyphenation with other techniques (33 references cited).

  14. The Analysis of a Phobic Child: Some Problems of Theory and Technique in Child Analysis.

    Science.gov (United States)

    Bornstein, Berta

    2014-01-01

    This paper attempts to clarify some theoretical and technical aspects of child analysis by correlating the course of treatment, the structure of the neurosis, and the technique employed in the case of a phobic boy who was in analysis over a period of three years. The case was chosen for presentation: (1) because of the discrepancy between the clinical simplicity of the symptom and the complicated ego structure behind it; (2) because of the unusual clearness with which the patient brought to the fore the variegated patterns of his libidinal demands; (3) because of the patient's attempts at transitory solutions, oscillations between perversions and symptoms, and processes of new symptom formation; (4) because the vicissitudes and stabilization of character traits could be clearly traced; (5) and finally, because of the rare opportunity to witness during treatment the change from grappling with reality by means of pathological mechanisms, to dealing with reality in a relatively conflict-free fashion.

  15. Instrumental Neutron Activation Analysis Technique using Subsecond Radionuclides

    DEFF Research Database (Denmark)

    Nielsen, H.K.; Schmidt, J.O.

    1987-01-01

    The fast irradiation facility Mach-1, installed at the Danish DR 3 reactor, has been used for boron determinations by means of instrumental neutron activation analysis using ¹²B with its 20-ms half-life. The performance characteristics of the system are presented, along with boron determinations of NBS standard
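    Counting a 20-ms half-life nuclide such as ¹²B is governed by the standard activation equation A = N·σ·φ·(1 − e^(−λ·t_irr))·e^(−λ·t_d): saturation is reached within a fraction of a second of irradiation, and the activity halves every 20 ms afterwards, which is why a fast transfer facility is needed. A sketch with illustrative (not measured) numbers:

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux_cm2_s, t_irr_s, t_decay_s,
                     half_life_s):
    """Activity (Bq) after irradiating for t_irr and decaying for t_decay:
    A = N * sigma * phi * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_decay).
    """
    lam = math.log(2) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr_s)
    return n_atoms * sigma_cm2 * flux_cm2_s * saturation * math.exp(-lam * t_decay_s)

# Illustrative values only: atom count, cross-section, and flux are assumptions.
activity_at_end = induced_activity(1e20, 5e-27, 1e13, 1.0, 0.0, 0.02)
```

    The practical consequence for subsecond radionuclides is that longer irradiation buys nothing once saturation is reached; only shorter transfer and counting delays improve the signal.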

  16. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... of the offeror's cost trends, on the basis of current and historical cost or pricing data; (C... the FAR looseleaf edition), Cost Accounting Standards. (v) Review to determine whether any cost data... required. (2) Price analysis shall be used when certified cost or pricing data are not required...

  17. Sentiment analysis of Arabic tweets using text mining techniques

    Science.gov (United States)

    Al-Horaibi, Lamia; Khan, Muhammad Badruddin

    2016-07-01

    Sentiment analysis has become a flourishing field of text mining and natural language processing. Sentiment analysis aims to determine whether a text is written to express positive, negative, or neutral emotions about a certain domain. Most sentiment analysis researchers focus on English texts, with very limited resources available for other complex languages, such as Arabic. In this study, the target was to develop an initial model that performs satisfactorily and measures Arabic Twitter sentiment using a machine learning approach, with Naïve Bayes and Decision Tree as classification algorithms. The dataset used contains more than 2,000 Arabic tweets collected from Twitter. We performed several experiments to check the performance of the two classifiers using different combinations of text-processing functions. We found that the available facilities for Arabic text processing need to be built from scratch or improved in order to develop accurate classifiers. The small functionalities we developed in a Python environment helped improve the results and showed that sentiment analysis in the Arabic domain still needs a lot of work on the lexicon side.
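    The Naïve Bayes approach used in the study can be sketched as a multinomial model with Laplace smoothing over word counts. The tiny training set below is invented for illustration and is not the paper's tweet corpus:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (tokens, label). Returns per-class priors and
    Laplace-smoothed word log-likelihoods."""
    class_counts = Counter(label for _, label in docs)
    word_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        word_counts[label].update(tokens)
        vocab.update(tokens)
    model = {}
    for label in class_counts:
        total = sum(word_counts[label].values())
        model[label] = {
            "prior": math.log(class_counts[label] / len(docs)),
            "loglik": {w: math.log((word_counts[label][w] + 1) / (total + len(vocab)))
                       for w in vocab},
            "unseen": math.log(1.0 / (total + len(vocab))),
        }
    return model

def classify(model, tokens):
    """Pick the label maximizing log prior + summed word log-likelihoods."""
    def score(label):
        m = model[label]
        return m["prior"] + sum(m["loglik"].get(w, m["unseen"]) for w in tokens)
    return max(model, key=score)

# Hypothetical toy corpus (English stand-ins for tokenized tweets).
model = train_nb([(["good", "great"], "pos"), (["bad", "awful"], "neg"),
                  (["good", "nice"], "pos"), (["bad", "poor"], "neg")])
```

    For Arabic, the tokenization step hidden here (normalization, stemming, handling of diacritics) is exactly where the paper reports the most missing infrastructure.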

  18. Radio & Optical Interferometry: Basic Observing Techniques and Data Analysis

    CERN Document Server

    Monnier, John D

    2012-01-01

    Astronomers usually need the highest angular resolution possible, but the blurring effect of diffraction imposes a fundamental limit on the image quality from any single telescope. Interferometry allows light collected at widely-separated telescopes to be combined in order to synthesize an aperture much larger than an individual telescope thereby improving angular resolution by orders of magnitude. Radio and millimeter wave astronomers depend on interferometry to achieve image quality on par with conventional visible and infrared telescopes. Interferometers at visible and infrared wavelengths extend angular resolution below the milli-arcsecond level to open up unique research areas in imaging stellar surfaces and circumstellar environments. In this chapter the basic principles of interferometry are reviewed with an emphasis on the common features for radio and optical observing. While many techniques are common to interferometers of all wavelengths, crucial differences are identified that will help new practi...

  19. Analysis of ultrasonic techniques for monitoring milk coagulation during cheesemaking

    Science.gov (United States)

    Budelli, E.; Pérez, N.; Lema, P.; Negreira, C.

    2012-12-01

    Experimental determination of time of flight and attenuation has been proposed in the literature as an alternative for monitoring the evolution of milk coagulation during cheese manufacturing. However, only laboratory-scale procedures have been described. In this work, the use of ultrasonic time of flight and attenuation to determine cutting time, and the feasibility of applying them at industrial scale, were analyzed. Limitations of implementing these techniques at industrial scale are shown experimentally. The main limitation of time of flight is its strong dependence on temperature. Attenuation monitoring is affected by a thin layer of milk skin covering the transducer, which modifies the signal in a non-repeatable way. The results of this work can be used to develop alternative ultrasonic systems suitable for application in the dairy industry.
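    Because time of flight depends strongly on temperature, a practical system must separate temperature effects from coagulation effects before declaring a cutting time. The sketch below assumes a simple linear temperature coefficient `k` and a velocity-shift threshold; both are illustrative assumptions, not values from the paper:

```python
def wave_velocity(path_m, tof_s):
    """Ultrasonic wave velocity from the propagation path and time of flight."""
    return path_m / tof_s

def corrected_velocity(v_m_s, temp_c, ref_temp_c=30.0, k=2.0):
    """Remove an assumed linear temperature trend (k m/s per deg C) so that
    velocity changes reflect coagulation rather than temperature drift."""
    return v_m_s - k * (temp_c - ref_temp_c)

def cutting_time(times_s, velocities_m_s, temps_c, dv_threshold=5.0):
    """First time at which the temperature-corrected velocity has shifted
    by at least dv_threshold m/s relative to the start of coagulation."""
    base = corrected_velocity(velocities_m_s[0], temps_c[0])
    for t, v, temp in zip(times_s, velocities_m_s, temps_c):
        if corrected_velocity(v, temp) - base >= dv_threshold:
            return t
    return None
```

    The paper's point is that in an industrial vat neither `k` nor the attenuation baseline stays constant, which is what makes the laboratory procedure hard to transfer.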

  20. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    Science.gov (United States)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activity. One prominent approach to drift measurement relies on dedicated instrumentation, e.g. an ionosonde. Drift estimation with an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even over Europe, is not dense enough to produce a global or continental drift map. In order to overcome these difficulties, we propose a technique for ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. The result is a close-to-real electron density profile in which ray tracing can be performed. These profiles can be constructed periodically, with a period as short as 30 seconds. By processing two consecutive snapshots and calculating the propagation paths, we estimate the drift over any coordinate of interest. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.

  1. A genetic analysis of Adh1 regulation. Progress report, June 1991--February 1992

    Energy Technology Data Exchange (ETDEWEB)

    Freeling, M.

    1992-03-01

    The overall goal of our research proposal is to understand the meaning of the various cis-acting sites responsible for Adh1 expression in the entire maize plant. Progress is reported in the following areas: studies on the TATA box and analysis of revertants of the Adh1-3F1124 allele; screening for additional mutants that affect Adh1 expression differentially; studies on cis-acting sequences required for root-specific Adh1 expression; refinement of the use of the particle gun; and functional analysis of a non-glycolytic anaerobic protein.

  2. Radiative neutron capture as a counting technique at pulsed spallation neutron sources: a review of current progress

    Science.gov (United States)

    Schooneveld, E. M.; Pietropaolo, A.; Andreani, C.; Perelli Cippo, E.; Rhodes, N. J.; Senesi, R.; Tardocchi, M.; Gorini, G.

    2016-09-01

    Neutron scattering techniques are attracting increasing interest from scientists in various research fields, ranging from physics and chemistry to biology and archaeometry. The success of these neutron scattering applications is stimulated by the development of higher-performance instrumentation. The development of new techniques and concepts, including radiative-capture-based neutron detection, is therefore a key issue to be addressed. Radiative-capture-based neutron detectors utilize the emission of prompt gamma rays after neutron absorption in a suitable isotope and the detection of those gammas by a photon counter. They can be used as simple counters in the thermal region and (simultaneously) as energy selectors and counters for neutrons in the eV energy region. Several years of extensive development have made eV neutron spectrometers operating in the so-called resonance detector spectrometer (RDS) configuration outperform their conventional counterparts. In fact, the VESUVIO spectrometer, a flagship instrument at ISIS serving a continuous user programme for eV inelastic neutron spectroscopy measurements, has been operating in the RDS configuration since 2007. In this review, we discuss the physical mechanism underlying the RDS configuration and the development of associated instrumentation. A few successful neutron scattering experiments that utilize radiative capture counting techniques are presented, together with the potential of this technique for thermal neutron diffraction measurements. We also outline possible improvements and future perspectives for radiative-capture-based neutron detectors in neutron scattering applications at pulsed neutron sources.

  3. Progress in the development of deposition prevention and cleaning techniques of in-vessel optics in ITER

    Science.gov (United States)

    Mukhin, E.; Vukolov, K.; Semenov, V.; Tolstyakov, S.; Kochergin, M.; Kurskiev, G.; Podushnikova, K.; Razdobarin, A.; Gorodetsky, A.; Zalavutdinov, R.; Bukhovets, V.; Zakharov, A.; Bulovich, S.; Veiko, V.; Shakshno, E.

    2009-08-01

    The lifetime of front optical components unprotected from reactor-grade plasmas may be very short due to intensive contamination with carbon- and beryllium-based materials eroded by the plasma from beryllium walls and carbon tiles. Deposits result in a significant reduction and spectral alteration of optical transmission. In addition, even rather thin and transparent deposits can dramatically change the shape of reflectance spectra, especially for mirrors with rather low reflectivity, such as W or Mo. The distortion of data obtained with various optical diagnostics may affect the safe operation of ITER. Therefore, the development of optics-cleaning and deposition-mitigating techniques is a key factor in the construction and operation of optical diagnostics in ITER. The problem is of particular concern for optical elements positioned in the divertor region. The latest achievements in the protection of in-vessel optics are presented using the example of deposition prevention/cleaning techniques for in-machine components of the Thomson scattering system in the divertor. Careful consideration of well-known and novel protection approaches shows that none of them alone provides guaranteed survivability of the first in-vessel optics in the divertor. Only a set of complementary prevention/cleaning techniques, which include special materials for mirrors and inhibition additives for plasma, is able to manage the challenging task. The essential issue, which needs to be addressed in the immediate future, is an extensive development of techniques tested under experimental conditions (exposure time and contamination fluxes) similar to those expected in ITER.

  4. Determination of Volatile Organic Compounds in the Atmosphere Using Two Complementary Analysis Techniques.

    Science.gov (United States)

    Alonso, L; Durana, N; Navazo, M; García, J A; Ilardia, J L

    1999-08-01

    During a preliminary field campaign of volatile organic compound (VOC) measurements carried out in an urban area, two complementary analysis techniques were applied to establish the technical and scientific bases for a strategy to monitor and control VOCs and photochemical oxidants in the Autonomous Community of the Basque Country. Integrated sampling was conducted using Tenax sorbent tubes with laboratory analysis by gas chromatography, and grab sampling with in situ analysis was also conducted using a portable gas chromatograph. With the first technique, monocyclic aromatic hydrocarbons appeared as the compounds with the highest mean concentrations. The second technique allowed the systematic analysis of eight chlorinated and aromatic hydrocarbons. Results of comparing both techniques, as well as the additional information obtained with the second technique, are included.

  5. Increased oil production and reserves from improved completion techniques in the Bluebell Field, Uinta Basin, Utah. Seventh quarterly technical progress report, April 1, 1995--June 30, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, C.D.

    1995-09-01

    The objective of this project is to increase oil production and reserves in the Uinta Basin by demonstrating improved completion techniques. Low productivity of Uinta Basin wells is caused by gross production intervals of several thousand feet that contain perforated thief zones, water-bearing zones, and unperforated oil-bearing intervals. Geologic and engineering characterization and computer simulation of the Green River and Wasatch formations in the Bluebell field will determine reservoir heterogeneities related to fractures and depositional trends. This will be followed by drilling and recompletion of several wells to demonstrate improved completion techniques based on the reservoir characterization. Transfer of the project results will be an ongoing component of the project. Technical progress for this quarter is discussed for the subsurface and engineering studies.

  6. POC-scale testing of an advanced fine coal dewatering equipment/technique. Quarterly technical progress report 2, January 1995--March 1995

    Energy Technology Data Exchange (ETDEWEB)

    Groppo, J.G.; Parekh, B.K.

    1995-05-05

    Froth flotation is an effective and efficient process for recovering ultra-fine (minus 74 {mu}m) clean coal. Economical dewatering of an ultra-fine clean coal product to 20 percent moisture will be an important step in the successful implementation of advanced cleaning processes. This project is a step in the Department of Energy's program to show that ultra-clean coal can be effectively dewatered to 20 percent or lower moisture using either conventional or advanced dewatering techniques. The cost-sharing contract effort is for 36 months beginning September 30, 1994. This report discusses technical progress made during the quarter from January 1 to March 31, 1995.

  7. Transient analysis techniques in performing impact and crash dynamic studies

    Science.gov (United States)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that grew out of a finite element program for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  8. Biomechanical analysis of the technique of choreographic movements (on the example of "grand battman jete")

    Directory of Open Access Journals (Sweden)

    Batieieva N.P.

    2015-04-01

    Full Text Available Purpose: biomechanical analysis of the execution of the choreographic movement "grand battman jete". Material: the study involved students (n = 7) of the department of classical choreography of the faculty of choreography. Results: a biomechanical analysis of the choreographic movement "grand battman jete" (a classical exercise) was performed, and the kinematic characteristics (path, velocity, acceleration, force) of the centre of mass (CM) of the performer's body segments (foot, shin, thigh) were obtained. A biokinematic (phase) model was built, together with the energy characteristics - mechanical work and kinetic energy of the leg segments - during performance of the movement. Conclusions: it was found that the ability of an athlete and coach-choreographer to analyze the biomechanics of movement has a positive effect on the improvement of choreographic training of qualified athletes in gymnastics (sport and artistic), figure skating and dance sports.
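    Kinematic characteristics like those listed (velocity, acceleration, and force of a segment's centre of mass) can be recovered from sampled marker positions by numerical differentiation. A minimal sketch, with an invented 100 Hz trajectory and an assumed segment mass rather than the study's measured data:

```python
import numpy as np

fs = 100.0                            # assumed motion-capture rate, Hz
t = np.arange(0, 1, 1 / fs)
y = 0.9 * np.sin(np.pi * t)           # invented vertical path of the foot CM, m

velocity = np.gradient(y, 1 / fs)             # first derivative, m/s
acceleration = np.gradient(velocity, 1 / fs)  # second derivative, m/s^2

segment_mass = 1.1                    # assumed foot-segment mass, kg
force = segment_mass * acceleration   # segment force via Newton's second law, N
```

    `np.gradient` uses central differences in the interior and one-sided differences at the endpoints, which keeps the velocity and acceleration arrays the same length as the position samples.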

  9. Elemental analysis of silver coins by PIXE technique

    Energy Technology Data Exchange (ETDEWEB)

    Tripathy, B.B. [Department of Physics, Silicon Institute of Technology, Patia, Bhubaneswar 751 024 (India); Rautray, Tapash R. [Department of Dental Biomaterials, School of Dentistry, Kyungpook National University, 2-188-1 Samduk -dong, Jung-gu, Daegu 700 412 (Korea, Republic of); ARASMIN, G. Udayagiri, Kandhamal, Orissa 762 100 (India)], E-mail: tapash.rautray@gmail.com; Rautray, A.C. [ARASMIN, G. Udayagiri, Kandhamal, Orissa 762 100 (India); Vijayan, V. [Praveen Institute of Radiation Technology, Flat No. 9A, Avvai Street, New Perungalathur, Chennai 600 063 (India)

    2010-03-15

    Elemental analysis of nine Indian silver coins from the period of British rule was carried out by proton induced X-ray emission spectroscopy. Eight elements, namely Cr, Fe, Ni, Cu, Zn, As, Ag and Pb, were determined in the present study. Ag and Cu were found to be the major elements, Zn was the only minor element, and all other elements were present at trace level. The variation of the elemental concentrations may be due to the use of different ores for making the coins.

  10. Stalked protozoa identification by image analysis and multivariable statistical techniques

    OpenAIRE

    Amaral, A.L.; Ginoris, Y. P.; Nicolau, Ana; M.A.Z. Coelho; Ferreira, E. C.

    2008-01-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups with several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determinin...

  11. Impact of HIV type 1 DNA levels on spontaneous disease progression: a meta-analysis.

    Science.gov (United States)

    Tsiara, Chrissa G; Nikolopoulos, Georgios K; Bagos, Pantelis G; Goujard, Cecile; Katzenstein, Terese L; Minga, Albert K; Rouzioux, Christine; Hatzakis, Angelos

    2012-04-01

    Several studies have reported the prognostic strength of HIV-1 DNA, albeit with variable results. The aims of the current study were to estimate more accurately the ability of HIV-1 DNA to predict progression of HIV-1 disease toward acquired immunodeficiency syndrome (AIDS) or death, and to compare the prognostic information obtained by HIV-1 DNA with that derived from plasma HIV-1 RNA. Eligible articles were identified through a comprehensive search of Medline, ISI Web of Science, Scopus, and Google Scholar. The analysis included univariate and bivariate random-effects models. The univariate meta-analysis of six studies involving 1074 participants showed that HIV-1 DNA was a strong predictive marker of AIDS [relative risk (RR): 3.01, 95% confidence interval (CI): 1.88-4.82] and of all-cause mortality (RR: 3.49, 95% CI: 2.06-5.89). The bivariate model using the crude estimates of primary studies indicated that HIV-1 DNA was a significantly better predictor than HIV-1 RNA of either AIDS alone (ratio of RRs=1.47, 95% CI: 1.05-2.07) or of combined (AIDS or death) progression outcomes (ratio of RRs=1.51, 95% CI: 1.11-2.05). HIV-1 DNA is a strong predictor of HIV-1 disease progression. Moreover, there is some evidence that HIV-1 DNA might have better predictive value than plasma HIV-1 RNA.
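    The pooling step behind figures like these can be sketched with a standard DerSimonian-Laird random-effects model on log relative risks. The per-study RRs and confidence intervals below are invented for illustration and are not the meta-analysis data:

```python
import numpy as np

def pooled_rr(rr, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of relative risks,
    given per-study point estimates and 95% confidence intervals."""
    y = np.log(rr)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # SE from CI width
    w = 1 / se**2                                # fixed-effect weights
    ybar = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - ybar) ** 2)              # Cochran's Q heterogeneity
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_star = 1 / (se**2 + tau2)                  # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se_pooled = np.sqrt(1 / np.sum(w_star))
    return (np.exp(pooled),
            np.exp(pooled - 1.96 * se_pooled),
            np.exp(pooled + 1.96 * se_pooled))

# invented study-level estimates, not those of the six primary studies
rr, lo, hi = pooled_rr([2.5, 3.4, 2.9], [1.5, 2.0, 1.6], [4.2, 5.8, 5.3])
```

    When Cochran's Q falls below its degrees of freedom, tau-squared truncates to zero and the estimate collapses to the fixed-effect (inverse-variance) pooled RR.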

  12. Skills and Vacancy Analysis with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Izabela A. Wowczko

    2015-11-01

    Full Text Available Recognizing the importance of a qualified workforce, skills research has become one of the focal points in economics, sociology, and education. Great effort is dedicated to analyzing labor demand and supply, and actions are taken at many levels to match one with the other. In this work we concentrate on skills needs, a dynamic variable dependent on many aspects such as geography, time, or the type of industry. Historically, skills in demand were easy to evaluate since transitions in that area were fairly slow, gradual, and easy to adjust to. In contrast, current changes are occurring rapidly and might take an unexpected turn. Therefore, we introduce a relatively simple yet effective method of monitoring skills needs straight from the source—as expressed by potential employers in their job advertisements. We employ open source tools such as RapidMiner and R as well as easily accessible online vacancy data. We demonstrate selected techniques, namely classification with k-NN and information extraction from a textual dataset, to determine effective ways of discovering knowledge from a given collection of vacancies.
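    The k-NN classification of vacancy text can be sketched with a toy bag-of-words nearest-neighbour classifier. The vacancy snippets and labels below are invented; a real pipeline such as the RapidMiner workflow described would add proper tokenization, stop-word removal, and term weighting:

```python
import numpy as np
from collections import Counter

# toy labelled vacancy snippets (hypothetical, not the paper's dataset)
train = [
    ("java spring sql developer", "software"),
    ("python machine learning analyst", "software"),
    ("forklift warehouse operative", "logistics"),
    ("hgv driver logistics depot", "logistics"),
]
vocab = sorted({w for text, _ in train for w in text.split()})

def vectorize(text):
    """Bag-of-words count vector over the training vocabulary."""
    counts = Counter(text.split())
    return np.array([counts[w] for w in vocab], dtype=float)

X = np.array([vectorize(t) for t, _ in train])
y = [label for _, label in train]

def knn_predict(text, k=3):
    """Majority vote among the k nearest training vacancies."""
    d = np.linalg.norm(X - vectorize(text), axis=1)
    nearest = [y[i] for i in np.argsort(d)[:k]]
    return Counter(nearest).most_common(1)[0][0]

print(knn_predict("sql developer python"))  # "software"
```

    Words absent from the training vocabulary are simply ignored here, which is the usual behaviour of a fixed bag-of-words representation.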

  13. Ionospheric Behaviour Analysis over Thailand Using Radio Occultation Technique.

    Directory of Open Access Journals (Sweden)

    Ahmed Wasiu Akande

    2015-11-01

    Full Text Available With advances in science and technology in the field of space and atmospheric science, and the need for accurate results, the radio occultation technique is used here to investigate the electron density and Total Electron Content (TEC) present in the equatorial region, particularly over Thailand. In this research, radio occultation data obtained from UCAR/CDAAC were used to observe daily, monthly, seasonal and full-year 2013 ionospheric TEC and electron density variations due to changes and instability of solar activity. It was observed that TEC was high (the ionosphere was more disturbed) in May and spread over a wide range of altitudes, and that summer had the highest TEC value for 2013, meaning that GNSS measurements were more prone to error during this period. Ionospheric variations and fluctuations were at a maximum between 200 km and 450 km altitude. The results of the study show that ionospheric perturbation effects and irregularities depend on season and solar activity.

  14. Manure management and greenhouse gas mitigation techniques : a comparative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Langmead, C.

    2003-09-03

    Alberta is the second largest agricultural producer in Canada, ranking just behind Ontario. Approximately 62 per cent of the province's farm cash receipts are attributable to the livestock industry. Farmers today maintain large numbers of a single animal type. The drivers for more advanced manure management systems include: the trend towards confined feeding operations (CFO) is creating large, concentrated quantities of manure; public perception of CFO; implementation of provincial legislation regulating the expansion and construction of CFO; ratification of the Kyoto Protocol raised interest in the development of improved manure management systems capable of reducing greenhouse gas (GHG) emissions; and rising energy costs. The highest methane emissions factors are found with liquid manure management systems. They contribute more than 80 per cent of the total methane emissions from livestock manure in Alberta. The author identified and analyzed three manure management techniques to mitigate GHG emissions. They were: bio-digesters, gasification systems, and composting. Three recommendations were made to establish a strategy to support emissions offsets and maximize the reduction of methane emissions from the livestock industry. The implementation of bio-digesters, especially for the swine industry, was recommended. It was suggested that a gasification pilot project for poultry manure should be pursued by Climate Change Central. Public outreach programs promoting composting of cattle manure for beef feedlots and older style dairy barns should also be established. 19 refs., 11 tabs., 3 figs.

  15. Chromatographic finger print analysis of Naringi crenulata by HPTLC technique

    Institute of Scientific and Technical Information of China (English)

    Subramanian Sampathkumar; Ramakrishnan N

    2011-01-01

    Objective: To establish the fingerprint profile of Naringi crenulata (N. crenulata) (Roxb.) Nicols. using high performance thin layer chromatography (HPTLC) technique. Methods: Preliminary phytochemical screening was done and HPTLC studies were carried out. CAMAG HPTLC system equipped with Linomat V applicator, TLC scanner 3, Reprostar 3 and WIN CATS-4 software was used. Results: The results of preliminary phytochemical studies confirmed the presence of protein, lipid, carbohydrate, reducing sugar, phenol, tannin, flavonoid, saponin, triterpenoid, alkaloid, anthraquinone and quinone. HPTLC fingerprinting of the ethanolic extract of stem revealed 10 spots with Rf values in the range of 0.08 to 0.65; bark showed 8 peaks with Rf values in the range of 0.07 to 0.63; and the ethanol extract of leaf revealed 8 peaks with Rf values in the range of 0.09 to 0.49. The purity of the sample was confirmed by comparing the absorption spectra at the start, middle and end position of the band. Conclusions: It can be concluded that HPTLC fingerprinting of N. crenulata may be useful in differentiating the species from adulterants and may act as a biochemical marker for this medicinally important plant in the pharmaceutical industry and in plant systematic studies.

  16. Comparative Analysis of Automatic Vehicle Classification Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Kanwal Yousaf

    2012-09-01

    Full Text Available Vehicle classification has emerged as a significant field of study because of its importance in a variety of applications, such as surveillance, security systems, traffic congestion avoidance and accident prevention. So far numerous algorithms have been implemented for classifying vehicles, each following a different procedure for detecting vehicles in videos. By evaluating some of the commonly used techniques, we highlight the most beneficial methodology for classifying vehicles. In this paper we describe the working of several video-based vehicle classification algorithms and compare them on the basis of different performance metrics, such as classifier, classification methodology or principle, and vehicle detection ratio. After comparing these parameters we conclude that the Hybrid Dynamic Bayesian Network (HDBN) classification algorithm is far better than the other algorithms due to its nature of estimating the simplest features of vehicles from different videos. HDBN detects vehicles by following the important stages of feature extraction, selection and classification. It extracts the rear-view information of vehicles rather than other information, such as the distance between the wheels and the height of the wheels.

  17. Preliminary Analysis of ULPC Light Curves Using Fourier Decomposition Technique

    CERN Document Server

    Ngeow, Chow-Choong; Kanbur, Shashi; Barrett, Brittany; Lin, Bin

    2013-01-01

    Recent work on Ultra Long Period Cepheids (ULPCs) has suggested their usefulness as a distance indicator, but has not commented on their relationship to other types of variable stars. In this work, we use Fourier analysis to quantify the structure of ULPC light curves and compare them to Classical Cepheids and Mira variables. Our preliminary results suggest that the low-order Fourier parameters of ULPCs follow the continuous trend defined by Classical Cepheids after the resonance around 10 days. However, their Fourier parameters also overlap with those of Miras, which makes the classification of long-period variable stars difficult based on light curve information alone.
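    Low-order Fourier parameters such as the amplitude ratios are obtained by least-squares fitting harmonics to a phased light curve. A minimal sketch on a synthetic two-harmonic curve (conventions for the phase differences vary between authors, so the ones here are only one common choice):

```python
import numpy as np

def fourier_params(phase, mag, order=4):
    """Least-squares Fourier decomposition of a phased light curve:
    mag ~ a0 + sum_k [a_k cos(2*pi*k*phase) + b_k sin(2*pi*k*phase)].
    Returns amplitude ratios R_k1 and phase differences phi_k1."""
    cols = [np.ones_like(phase)]
    for k in range(1, order + 1):
        cols += [np.cos(2 * np.pi * k * phase), np.sin(2 * np.pi * k * phase)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), mag, rcond=None)
    a, b = coef[1::2], coef[2::2]          # cosine and sine coefficients
    amp = np.hypot(a, b)                   # harmonic amplitudes A_k
    phi = np.arctan2(-b, a)                # one common phase convention
    r = amp[1:] / amp[0]                   # R21, R31, R41
    dphi = (phi[1:] - np.arange(2, order + 1) * phi[0]) % (2 * np.pi)
    return r, dphi

# synthetic two-harmonic light curve with A2/A1 = 0.3
phase = np.linspace(0, 1, 200, endpoint=False)
mag = 15 + np.sin(2 * np.pi * phase) + 0.3 * np.sin(4 * np.pi * phase)
r, dphi = fourier_params(phase, mag)       # r[0] recovers R21 = 0.3
```

    It is parameters like R21 and phi21, plotted against period, that let one ask whether ULPCs extend the Classical Cepheid sequence or mix with the Miras.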

  18. [THE COMPARATIVE ANALYSIS OF TECHNIQUES OF IDENTIFICATION OF CORYNEBACTERIUM NON DIPHTHERIAE].

    Science.gov (United States)

    Kharseeva, G G; Voronina, N A; Mironov, A Yu; Alutina, E L

    2015-12-01

    A comparative analysis was carried out of the effectiveness of three techniques for the identification of Corynebacterium non diphtheriae: bacteriological, molecular genetic (16S rRNA sequencing) and mass-spectrometric (MALDI-ToF MS). The analysis covered 49 strains of Corynebacterium non diphtheriae (C. pseudodiphtheriticum, C. amycolatum, C. propinquum, C. falsenii) and 2 strains of Corynebacterium diphtheriae isolated under various pathologies from the urogenital tract and upper respiratory ways. The corynebacteria were identified using the bacteriological technique, 16S rRNA sequencing and the mass-spectrometric technique (MALDI-ToF MS). Full concordance of the results of species identification was observed in 26 (51%) strains of Corynebacterium non diphtheriae when using all three techniques; in 43 (84.3%) strains when comparing the bacteriological technique with 16S rRNA sequencing; and in 29 (57%) when comparing mass-spectrometric analysis with 16S rRNA sequencing. The bacteriological technique is effective for the identification of Corynebacterium diphtheriae. For precise establishment of the species of corynebacteria with variable biochemical characteristics, the molecular genetic technique should be applied. The mass-spectrometric technique (MALDI-ToF MS) requires further updating of its databases to identify a larger spectrum of representatives of the genus Corynebacterium.

  19. Multi-platform genome-wide analysis of melanoma progression to brain metastasis

    Directory of Open Access Journals (Sweden)

    Diego M. Marzese

    2014-12-01

    Full Text Available Melanoma has a high tendency to metastasize to brain tissue. Understanding of the molecular alterations in early-stage melanoma progression to brain metastasis (MBM) is very limited. Identifying MBM-specific genomic and epigenomic alterations is a key initial step in understanding its aggressive nature and identifying specific novel druggable targets. Here, we describe a multi-platform dataset generated from different stages of melanoma progression to MBM. These data include genome-wide DNA methylation (Illumina HM450K BeadChip), gene expression (Affymetrix HuEx 1.0 ST array), and single nucleotide polymorphism (SNP) and copy number variation (CNV; Affymetrix SNP 6.0 array) analyses of melanocyte cells (MNCs), primary melanoma tumors (PRMs), lymph node metastases (LNMs) and MBMs. The analysis of these data has been reported in our recently published study (Marzese et al., 2014).

  20. Progress of a cross-correlation based optical strain measurement technique for detecting radial growth on a rotating disk

    Science.gov (United States)

    Clem, Michelle M.; Woike, Mark R.; Abdul-Aziz, Ali

    2014-04-01

    The Aeronautical Sciences Project under NASA's Fundamental Aeronautics Program is interested in the development of novel measurement technologies, such as optical surface measurements, for the in situ health monitoring of critical constituents of the internal flow path. In situ health monitoring has the potential to detect flaws, i.e. cracks in key components such as engine turbine disks, before the flaws lead to catastrophic failure. The present study aims to further validate and develop an optical strain measurement technique to measure the radial growth and strain field of an already cracked disk, mimicking the geometry of a sub-scale turbine engine disk, under loaded conditions in the NASA Glenn Research Center's High Precision Rotordynamics Laboratory. The technique offers potential fault detection by imaging an applied high-contrast random speckle pattern under unloaded and loaded conditions with a CCD camera. Spinning the cracked disk at high speed (loaded conditions) induces an external load, resulting in a radial growth of the disk of approximately 50.0 μm in the flawed region and hence a localized strain field. When imaged under static conditions the disk is undistorted; during rotation, however, the cracked region grows radially, causing the applied particle pattern to be `shifted'. The resulting particle displacements between the two images are measured using the two-dimensional cross-correlation algorithms implemented in standard Particle Image Velocimetry (PIV) software to track the disk growth, which facilitates calculation of the localized strain field. A random particle distribution is adhered onto the surface of the cracked disk and two bench-top experiments are carried out to evaluate the technique's ability to measure the induced particle displacements.
The disk is shifted manually using a translation stage equipped with a fine micrometer and a hotplate is used to induce thermal growth of the disk, causing the
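    The core displacement estimate can be sketched as an FFT-based two-dimensional cross-correlation, the same operation PIV codes apply per interrogation window. The speckle images here are synthetic and the recovery is integer-pixel only; real PIV software adds sub-pixel peak interpolation:

```python
import numpy as np

def shift_2d(ref, moved):
    """Integer-pixel displacement between two images, found from the
    peak of their circular cross-correlation computed via FFTs."""
    c = np.fft.ifft2(np.fft.fft2(moved) * np.conj(np.fft.fft2(ref))).real
    peak = np.unravel_index(np.argmax(c), c.shape)
    # map wrapped indices back to signed shifts
    return tuple(p if p < s // 2 else p - s for p, s in zip(peak, c.shape))

rng = np.random.default_rng(0)
ref = rng.random((64, 64))                   # synthetic speckle pattern
moved = np.roll(ref, (3, -2), axis=(0, 1))   # simulated in-plane motion
print(shift_2d(ref, moved))                  # (3, -2)
```

    Mapping such per-window displacements across the disk surface is what allows the localized strain field around the crack to be computed by differentiating the displacement field.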

  1. HPLC-MS technique for radiopharmaceuticals analysis and quality control

    Science.gov (United States)

    Macášek, F.; Búriová, E.; Brúder, P.; Vera-Ruiz, H.

    2003-01-01

    Potentialities of liquid chromatography with a mass spectrometric detector (MSD) were investigated with the objective of quality control of radiopharmaceuticals, 2-deoxy-2-[18F]fluoro-D-glucose (FDG) being an example. Screening of suitable MSD analytical lines is presented. Mass-spectrometric monitoring of the acetonitrile-aqueous ammonium formate eluant by negatively charged FDG·HCO2- ions enables isotope analysis (specific activity) of the radiopharmaceutical at m/z 227 and 226. Kryptofix® 222 provides an intense MSD signal of the positive ion associated with NH4+ at m/z 394. Expired FDG injection samples contain decomposition products, of which at least one is labelled with 18F and characterised by a signal of negative ions at m/z 207; it does not correspond to FDG fragments but to C5 decomposition products. A glucose chromatographic peak, characterised by the m/z 225 negative ion, is accompanied by a tail of a component giving a signal at m/z 227, which could belong to [18O]glucose; isobaric sorbitol signals were excluded, but FDG-glucose association occurs in the co-elution of separations of model mixtures. The latter can actually lead to a convoluted chromatographic peak, but the absence of 18F makes this assignment inconsistent. Quantification and validation of the FDG component analysis is under way.

  2. Structural Analysis of Composite Laminates using Analytical and Numerical Techniques

    Directory of Open Access Journals (Sweden)

    Sanghi Divya

    2016-01-01

    Full Text Available A laminated composite material consists of different layers of matrix and fibres. Its properties can vary greatly with each layer's (ply's) orientation, material properties and the number of layers itself. The present paper focuses on a novel approach of incorporating an analytical method to arrive at a preliminary ply layup order of a composite laminate, which acts as feeder data for the further detailed analysis done on FEA tools. The equations used in our MATLAB code are based on analytical study and supply results that are remarkably close to the final optimized layup found through extensive FEA analysis, with a high degree of probability. This reduces significant computing time and saves considerable FEA processing to obtain efficient results quickly. The result output by our method also provides the user with conditions that predict the successive failure sequence of the composite plies, a result option which is not even available in popular FEM tools. The predicted results are further verified by testing the laminates in the laboratory, and the results are found to be in good agreement.

  3. Analysis of compressive fracture in rock using statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.
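    The reported effect of local stress heterogeneity on strength can be illustrated with a much simpler equal-load-sharing fiber-bundle analogue. This is not the paper's lattice model; the uniform strength distribution, bundle size and spread values are invented for illustration:

```python
import numpy as np

def bundle_strength(n=2000, spread=0.5, seed=1):
    """Peak load per site of an equal-load-sharing fiber bundle whose
    site strengths are uniform on [1 - spread, 1 + spread]."""
    rng = np.random.default_rng(seed)
    thresh = np.sort(rng.uniform(1 - spread, 1 + spread, n))
    # after the k weakest sites fail, survivors share the load equally,
    # so the bundle's peak load per site is max_k thresh[k] * (n - k) / n
    return (thresh * (n - np.arange(n)) / n).max()

# strength falls as local heterogeneity grows, echoing the trend of
# decreasing compressive strength with increasing stress heterogeneity
print(bundle_strength(spread=0.2), bundle_strength(spread=0.8))
```

    For small spread the bundle fails near its weakest-site plateau (about 1 - spread), while larger heterogeneity drives failure progressively below it, the qualitative behaviour the parameter-sensitivity analysis reports for the lattice model.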

  4. Structural analysis of irradiated crotoxin by spectroscopic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina C. de; Fucase, Tamara M.; Silva, Ed Carlos S. e; Chagas, Bruno B.; Buchi, Alisson T.; Viala, Vincent L.; Spencer, Patrick J.; Nascimento, Nanci do, E-mail: kcorleto@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Biotecnologia

    2013-07-01

    Snake bites are a serious public health problem, especially in subtropical countries. In Brazil, antivenom serum, the only effective treatment for snake bites, is produced in horses which, despite their large size, have a reduced lifespan due to the high toxicity of the antigen. Ionizing radiation has been successfully employed to attenuate the biological activity of animal toxins. Crotoxin, the main toxic compound from Crotalus durissus terrificus (Cdt), is a heterodimeric protein composed of two subunits: crotapotin and phospholipase A{sub 2}. Previous data indicated that this protein, following the irradiation process, undergoes unfolding and/or aggregation, resulting in a much less toxic antigen. The exact mechanisms and structural modifications involved in the aggregation process are not yet clear. This work investigates the effects of ionizing radiation on crotoxin using infrared spectroscopy, circular dichroism and dynamic light scattering techniques. The infrared spectrum of lyophilized crotoxin showed peaks corresponding to the vibrational spectra of the secondary structure of crotoxin, including β-sheet, random coil, α-helix and β-turns. We calculated the areas of these spectral regions after baseline adjustment and normalization using the amide I band (1590-1700 cm{sup -1}), obtaining the variation of the secondary structures of the toxin following irradiation. The circular dichroism spectra of native and irradiated crotoxin suggest a conformational change within the molecule after the irradiation process. These data indicate structural changes between the samples, apparently from an ordered conformation towards a random coil. Analysis by light scattering indicated that the irradiated crotoxin formed multimers with an average molecular radius 100-fold larger than that of the native toxin. (author)

  5. Progressive failure analysis of slope with strain-softening behaviour based on strength reduction method

    Institute of Scientific and Technical Information of China (English)

    Ke ZHANG; Ping CAO; Rui BAO

    2013-01-01

    Based on the strength reduction method and a strain-softening model, a method for progressive failure analysis of strain-softening slopes is presented in this paper. Because the mutation is more pronounced in strain-softening analysis, the mutation of displacement at the slope crest was taken as the critical failure criterion. An engineering example is provided to demonstrate the validity of the method, which was applied to a cut slope at an industrial site. The results are as follows: (1) The factor of safety and the critical slip surface obtained by the present method lie between those obtained with peak and residual strengths. Analysis with peak strength leads to non-conservative results, while analysis with residual strength tends to be overly conservative. (2) The shear zone obtained when strain-softening behaviour is considered is narrower than in a non-softening analysis. (3) Slope failure is a process of initiation, propagation and connection of the potential failure surface; the strength parameters are mobilized to a non-uniform degree while progressive failure occurs in the slope. (4) The factor of safety increases with increasing residual shear strain threshold and elastic modulus, and the failure mode of the slope changes from shallow slip to deep slip. Poisson's ratio and dilation angle have little effect on the results.

  6. Multidimensional Analysis of Quenching: Comparison of Inverse Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dowding, K.J.

    1998-11-18

    Understanding the surface heat transfer during quenching can be beneficial. Analysis that estimates the surface heat transfer from internal temperature measurements is referred to as the inverse heat conduction problem (IHCP). Function specification and gradient adjoint methods, which use a gradient search coupled with an adjoint operator, are widely used methods to solve the IHCP. In this paper the two methods are presented for the multidimensional case. The focus is not a rigorous comparison of numerical results. Instead, after formulating the multidimensional solutions, issues associated with the numerical implementation and practical application of the methods are discussed. In addition, an experiment that measured the surface heat flux and temperatures for a transient experimental case is analyzed. Transient temperatures are used to estimate the surface heat flux, which is compared to the measured values. The estimated surface fluxes are comparable for the two methods.

  7. Nonlinear systems techniques for dynamical analysis and control

    CERN Document Server

    Lefeber, Erjen; Arteaga, Ines

    2017-01-01

    This treatment of modern topics related to the control of nonlinear systems is a collection of contributions celebrating the work of Professor Henk Nijmeijer and honoring his 60th birthday. It addresses several topics that have been the core of Professor Nijmeijer’s work, namely: the control of nonlinear systems, geometric control theory, synchronization, coordinated control, convergent systems and the control of underactuated systems. The book presents recent advances in these areas, contributed by leading international researchers in systems and control. In addition to the theoretical questions treated in the text, particular attention is paid to a number of applications including (mobile) robotics, marine vehicles, neural dynamics and mechanical systems generally. This volume provides a broad picture of the analysis and control of nonlinear systems for scientists and engineers with an interest in the interdisciplinary field of systems and control theory. The reader will benefit from the expert participan...

  8. A Generalized Lanczos-QR Technique for Structural Analysis

    DEFF Research Database (Denmark)

    Vissing, S.

    Within the field of solid mechanics, such as structural dynamics and linearized as well as non-linear stability, the eigenvalue problem plays an important role. In the class of finite element and finite difference discretized problems, these engineering problems are characterized by large matrix systems with very special properties. Due to the finite discretization the matrices are sparse, and a relatively large number of problems also have real and symmetric matrices. The matrix equation for an undamped vibration contains two matrices describing tangent stiffness and mass distributions. Alternatively, in a stability analysis, tangent stiffness and geometric stiffness matrices are introduced into an eigenvalue problem used to determine possible bifurcation points. The common basis for these types of problems is that the matrix equation describing the problem contains two real, symmetric...

  9. Denial of Service Attack Techniques: Analysis, Implementation and Comparison

    Directory of Open Access Journals (Sweden)

    Khaled Elleithy

    2005-02-01

    Full Text Available A denial of service attack (DOS) is any type of attack on a networking structure intended to disable a server from servicing its clients. Attacks range from sending millions of requests to a server in an attempt to slow it down, to flooding a server with large packets of invalid data, to sending requests with an invalid or spoofed IP address. In this paper we show the implementation and analysis of three main types of attack: Ping of Death, TCP SYN flood, and distributed DOS. The Ping of Death attack is simulated against a Microsoft Windows 95 computer, and the TCP SYN flood attack against a Microsoft Windows 2000 IIS FTP server. Distributed DOS is demonstrated by simulating a distributed zombie program that carries out the Ping of Death attack. This paper demonstrates the potential damage from DOS attacks and analyzes the ramifications of that damage.

  10. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems... in some detail. Finally, we address the problem of where to put the dot and the lines: when all information is ‘on the table’, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose... was to develop a software tool for maintenance supervision of components in a nuclear power plant.

  11. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from eight years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally, we address the problem of where to put the dot and the lines: when all information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  12. Connectomic analysis of brain networks: novel techniques and future directions

    Directory of Open Access Journals (Sweden)

    Leonie Cazemier

    2016-11-01

    Full Text Available Brain networks, localized or brain-wide, exist only at the cellular level, i.e. between specific pre- and postsynaptic neurons, which are connected through functionally diverse synapses located at specific points of their cell membranes. Connectomics is the emerging subfield of neuroanatomy explicitly aimed at elucidating the wiring of brain networks with cellular resolution and quantified accuracy. Such data are indispensable for realistic modeling of brain circuitry and function. A connectomic analysis therefore needs to identify and measure the soma, dendrites, axonal path and branching patterns, together with the synapses and gap junctions, of the neurons involved in any given brain circuit or network. However, because of the submicron caliber, 3D complexity and high packing density of most such structures, as well as the fact that axons frequently extend over long distances to make synapses in remote brain regions, creating connectomic maps is technically challenging and requires multi-scale approaches. Such approaches involve the combination of the most sensitive cell labeling and analysis methods available, as well as the development of new ones able to resolve individual cells and synapses at increasingly high throughput. In this review, we provide an overview of recently introduced high-resolution methods which researchers wanting to enter the field of connectomics may consider. It includes several molecular labeling tools, some of which specifically label synapses, and covers a number of novel imaging tools such as brain clearing protocols and microscopy approaches. Apart from describing the tools, we also provide an assessment of their qualities. The criteria we use assess the qualities that tools need in order to contribute to deciphering the key levels of circuit organization. We conclude with a brief future outlook for neuroanatomic research, computational methods and network modeling, where we also point out several outstanding

  13. Genome sequencing and analysis conferences. Progress report, August 15, 1993--August 15, 1994

    Energy Technology Data Exchange (ETDEWEB)

    Venter, J.C.

    1995-10-01

    The 14 plenary session presentations focused on nematode; yeast; fruit fly; plants; mycobacteria; and man. In addition there were presentations on a variety of technical innovations including database developments and refinements, bioelectronic genesensors, computer-assisted multiplex techniques, and hybridization analysis with DNA chip technology. This document includes only the session schedule.

  14. “LEMPEL-ZIV-WELCH & HUFFMAN” - THE LOSSLESS COMPRESSION TECHNIQUES (IMPLEMENTATION ANALYSIS AND COMPARISON THEREOF)

    OpenAIRE

    Kapil Kapoor*, Dr. Abhay Sharma

    2016-01-01

    This paper is about the implementation analysis and comparison of the lossless compression techniques Lempel-Ziv-Welch and Huffman. The LZW technique assigns fixed-length code words and requires no prior information about the probability of occurrence of the symbols to be encoded. The basic idea of the Huffman technique is that different gray levels occur with different probability (non-uniform histogram). It uses shorter code words for the more common gray levels and longer code words for the l...
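As a rough illustration of the Huffman side of the comparison, a minimal Huffman code builder can be sketched as follows. This is a generic sketch, not the paper's implementation; the tie-breaking scheme and the toy input string are our own choices.

```python
import heapq
from collections import Counter

def huffman_codes(data):
    """Build a Huffman code table for the symbols in `data`: frequent
    symbols get shorter codewords, rare symbols longer ones."""
    freq = Counter(data)
    if len(freq) == 1:                      # degenerate one-symbol input
        return {next(iter(freq)): "0"}
    # heap of (frequency, tiebreak, tree); a tree is a symbol or [left, right]
    heap = [(f, i, s) for i, (s, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)     # two least frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        count += 1
        heapq.heappush(heap, (f1 + f2, count, [t1, t2]))
    codes = {}
    def walk(tree, prefix):
        # assign "0" to the left branch and "1" to the right branch
        if isinstance(tree, list):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

print(huffman_codes("aaaabbc"))
```

For `"aaaabbc"` the most common symbol `a` receives a one-bit codeword while `b` and `c` receive two-bit codewords, which is exactly the "shorter code words for the more common gray levels" behaviour the abstract describes.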

  15. Progress as Compositional Lock-Freedom

    DEFF Research Database (Denmark)

    Carbone, Marco; Dardha, Ornela; Montesi, Fabrizio

    2014-01-01

    such definition to capture a more intuitive notion of context adequacy for checking progress. Interestingly, our new catalysers lead to a novel characterisation of progress in terms of the standard notion of lock-freedom. Guided by this discovery, we also develop a conservative extension of catalysers that does... not depend on types, generalising the notion of progress to untyped session-based processes. We combine our results with existing techniques for lock-freedom, obtaining a new methodology for proving progress. Our methodology captures new processes with respect to previous progress analyses based on session types.

  16. Comparison of qualitative and quantitative analysis of T2-weighted MRI scans in chronic-progressive multiple sclerosis

    Science.gov (United States)

    Adams, Hans-Peter; Wagner, Simone; Koziol, James A.

    1998-06-01

    Magnetic resonance imaging (MRI) is routinely used for the diagnosis of multiple sclerosis (MS), and for objective assessment of the extent of disease as a marker of treatment efficacy in MS clinical trials. The purpose of this study is to compare the evaluation of T2-weighted MRI scans in MS patients using a semi-automated quantitative technique with an independent assessment by a neurologist. Baseline, 6-month, and 12-month T2-weighted MRI scans from 41 chronic progressive MS patients were examined. The lesion volume ranged from 0.50 to 51.56 cm3 (mean: 8.08 cm3). Reproducibility of the quantitative technique was assessed by re-evaluation of a random subset of 20 scans; the coefficient of variation of the replicate determinations was 8.2%. The reproducibility of the neurologist's evaluations was assessed by re-evaluation of a random subset of 10 patients. The rank correlation between the results of the two methods was 0.097, which did not differ significantly from zero. Disease-related activity in T2-weighted MRI scans is a multi-dimensional construct, and is not adequately summarized solely by determination of lesion volume. In this setting, image analysis software should not only support storage and retrieval as sets of pixels, but should also support links to an anatomical dictionary.
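The two agreement statistics quoted above, the coefficient of variation of replicate determinations and the rank correlation between methods, can be sketched as follows. The lesion-volume figures in the example are hypothetical, not the study's data.

```python
from statistics import mean, stdev

def coefficient_of_variation(values):
    """CV as a percentage: replicate spread relative to the mean."""
    return 100.0 * stdev(values) / mean(values)

def rank(values):
    """Average ranks, 1-based, with ties sharing their mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1          # average rank of the tie group
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank(x), rank(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# hypothetical lesion volumes (cm^3): quantitative method vs. neurologist
quant = [8.1, 3.2, 12.5, 0.9, 5.4]
neuro = [7.0, 4.1, 10.2, 5.0, 2.2]
print(round(spearman(quant, neuro), 3))
```

A rank correlation near zero, as the study reports, means the two methods order the patients almost independently of each other.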

  17. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    The reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, the combinational method, and the simulation method. 2. Survey of the characteristics of robot systems that distinguish them from other systems and that are important to the analysis. 3. Survey of the nuclear environmental factors that affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The results of this survey will be applied to improve the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.
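For independent components, several of the surveyed techniques (fault tree analysis, reliability block diagrams) reduce to combining failure probabilities through AND/OR gates. A minimal sketch under that independence assumption, with hypothetical component probabilities for an inspection robot:

```python
def or_gate(probs):
    """Failure probability of an OR gate (any input failure fails the gate),
    assuming independent basic events: 1 - prod(1 - p_i)."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(probs):
    """Failure probability of an AND gate (all inputs must fail),
    assuming independent basic events: prod(p_i)."""
    q = 1.0
    for p in probs:
        q *= p
    return q

# hypothetical top event: camera failure OR both redundant drive motors failing
p_camera = 1e-3
p_motor = 1e-2
p_top = or_gate([p_camera, and_gate([p_motor, p_motor])])
print(p_top)
```

The redundancy shows up directly: the motor pair contributes only 1e-4 to the top-event probability, two orders of magnitude below a single motor.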

  18. The application of emulation techniques in the analysis of highly reliable, guidance and control computer systems

    Science.gov (United States)

    Migneault, Gerard E.

    1987-01-01

    Emulation techniques can be a solution to a difficulty that arises in the analysis of the reliability of guidance and control computer systems for future commercial aircraft: the lack of credibility of reliability estimates obtained by analytical modeling techniques. The difficulty is an unavoidable consequence of the following: (1) a reliability requirement so demanding as to make system evaluation by use testing infeasible; (2) a complex system design technique, fault tolerance; (3) system reliability dominated by errors due to flaws in the system definition; and (4) elaborate analytical modeling techniques whose precise outputs are quite sensitive to errors of approximation in their input data. The use of emulation techniques for pseudo-testing systems to evaluate bounds on the parameter values needed for the analytical techniques is then discussed. Finally, several examples of the application of emulation techniques are described.

  19. Financial planning and analysis techniques of mining firms: a note on Canadian practice

    Energy Technology Data Exchange (ETDEWEB)

    Blanco, H.; Zanibbi, L.R. (Laurentian University, Sudbury, ON (Canada). School of Commerce and Administration)

    1992-06-01

    This paper reports on the results of a survey of the financial planning and analysis techniques in use in the mining industry in Canada. The study was undertaken to determine the current status of these practices within mining firms in Canada and to investigate the extent to which the techniques are grouped together within individual firms. In addition, tests were performed on the relationship between these groups of techniques and both organizational size and price volatility of end product. The results show that a few techniques are widely utilized in this industry but that the techniques used most frequently are not as sophisticated as reported in previous, more broadly based surveys. The results also show that firms tend to use 'bundles' of techniques and that the relative use of some of these groups of techniques is weakly associated with both organizational size and type of end product. 19 refs., 7 tabs.

  20. Research Progress of Hepatic Blood Occlusion Techniques

    Institute of Scientific and Technical Information of China (English)

    董勤

    2012-01-01

    Hepatectomy is still the main method for treating liver tumors, and the various hepatic blood occlusion techniques used during hepatectomy have been a research focus in the field of hepatobiliary surgery. The related literature on hepatic blood occlusion was reviewed and the available methods summarized. There are many hepatic blood occlusion methods; the most frequently used and studied are hemihepatic vascular occlusion and intermittent hepatic inflow occlusion. Selecting a suitable occlusion method based on the patient's condition can reduce blood loss during hepatectomy and ensure the patient's safety.

  1. Research Progress of Freezing Technique in Foods

    Institute of Scientific and Technical Information of China (English)

    张钟; 江潮

    2014-01-01

    Freezing technology has developed rapidly and is being used ever more widely in the food industry; applying it more effectively to foods has become a topic of considerable research interest. This article summarizes theoretical research on the freezing process and its applications in the food industry, and briefly reviews the current status and development trends of food freezing technology at home and abroad.

  2. An Analysis on the Discourse Cohesion Techniques in Children's English Books

    Institute of Scientific and Technical Information of China (English)

    罗春燕

    2014-01-01

    Discourse cohesion analysis attracts much attention both at home and abroad, and many scholars have conducted research in this field; however, few of them focus on children’s English books, which have their own characteristics and cohesion techniques and deserve research.

  3. STUDY ON MODULAR FAULT TREE ANALYSIS TECHNIQUE WITH CUT SETS MATRIX METHOD

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    A new fault tree analysis (FTA) computation method is put forth using a modularization technique in FTA with a cut sets matrix; it can reduce NP (nondeterministic polynomial) difficulty effectively. The software runs on IBM PCs under DOS 3.0 and later. The method provides a theoretical basis and a computation tool for the application of the FTA technique in common engineering systems.
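The paper's matrix computation is not reproduced here, but the underlying operation, expanding a fault tree top-down into minimal cut sets (MOCUS-style), can be sketched as follows. The gate-dictionary format and the small example tree are our own illustrative choices.

```python
def minimal_cut_sets(gates, top):
    """Top-down expansion of a fault tree into minimal cut sets.
    `gates` maps a gate name to ("AND"|"OR", [inputs]); any name not in
    `gates` is treated as a basic event."""
    cuts = [frozenset([top])]
    changed = True
    while changed:
        changed = False
        new_cuts = []
        for cut in cuts:
            gate = next((g for g in cut if g in gates), None)
            if gate is None:
                new_cuts.append(cut)        # only basic events left
                continue
            changed = True
            kind, inputs = gates[gate]
            rest = cut - {gate}
            if kind == "AND":               # replace the gate by all inputs
                new_cuts.append(rest | set(inputs))
            else:                           # OR: one new cut set per input
                new_cuts.extend(rest | {i} for i in inputs)
        cuts = new_cuts
    # discard non-minimal cut sets (proper supersets of another cut set)
    unique = set(cuts)
    minimal = [c for c in unique if not any(o < c for o in unique)]
    return sorted(minimal, key=sorted)

tree = {
    "TOP": ("OR", ["G1", "e3"]),
    "G1": ("AND", ["e1", "e2"]),
}
print(minimal_cut_sets(tree, "TOP"))
```

For this tree the top event fails either through `e3` alone or through `e1` and `e2` together, so the minimal cut sets are `{e3}` and `{e1, e2}`. The combinatorial growth of `cuts` for large trees is exactly the NP difficulty the abstract refers to.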

  4. A comparative study on change vector analysis based change detection techniques

    Indian Academy of Sciences (India)

    Sartajvir Singh; Rajneesh Talwar

    2014-12-01

    Detection of Earth surface changes is essential for monitoring regional climate, snow avalanche hazards and energy balance variations that occur due to air temperature irregularities. Geographic Information Systems (GIS) enable such research to be carried out through change detection analysis. From this viewpoint, different change detection algorithms have been developed for land-use land-cover (LULC) regions. Among the different change detection algorithms, change vector analysis (CVA) has a level-headed capability of extracting maximum information, in terms of overall magnitude of change and direction of change, between multispectral bands from multi-temporal satellite data sets. Over the past two to three decades, many effective CVA-based change detection techniques, e.g., improved change vector analysis (ICVA), modified change vector analysis (MCVA) and change vector analysis in posterior-probability space (CVAPS), have been developed to overcome the difficulties of traditional change vector analysis (CVA). Moreover, many integrated techniques, such as cross correlogram spectral matching (CCSM) based CVA, CVA using enhanced principal component analysis (PCA) and the inverse triangular (IT) function, hyper-spherical direction cosine (HSDC), and median CVA (m-CVA), have emerged as effective LULC change detection tools. This paper comprises a comparative analysis of CVA-based change detection techniques, namely CVA, MCVA, ICVA and CVAPS, and also summarizes the necessary integrated CVA techniques along with their characteristics, features and shortcomings. Based on experimental outcomes, it was evaluated that the CVAPS technique has greater potential than other CVA techniques to evaluate the overall transformed information over three different MODerate resolution Imaging Spectroradiometer (MODIS) satellite data sets of different regions. Results of this study are expected to be potentially useful for more accurate analysis of LULC changes which will, in turn
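The two core CVA quantities, overall magnitude of change and direction of change between dates, can be sketched for a two-band case as follows. The reflectance values and the change threshold are hypothetical, and real CVA operates on whole co-registered images rather than short pixel lists.

```python
import math

def change_vector_analysis(band1_t1, band2_t1, band1_t2, band2_t2):
    """Per-pixel change magnitude and direction between two dates of a
    two-band image: the basic quantities of change vector analysis (CVA)."""
    magnitude, direction = [], []
    for a1, b1, a2, b2 in zip(band1_t1, band2_t1, band1_t2, band2_t2):
        d1, d2 = a2 - a1, b2 - b1                 # change vector components
        magnitude.append(math.hypot(d1, d2))      # overall amount of change
        direction.append(math.degrees(math.atan2(d2, d1)) % 360)  # type of change
    return magnitude, direction

# hypothetical reflectances for three pixels at two dates
mag, ang = change_vector_analysis([0.2, 0.5, 0.3], [0.4, 0.1, 0.3],
                                  [0.2, 0.8, 0.3], [0.1, 0.5, 0.3])
changed = [m > 0.05 for m in mag]   # simple magnitude threshold
print(mag, ang, changed)
```

Thresholding the magnitude decides *whether* a pixel changed, while the direction angle classifies *what kind* of change occurred; the ICVA, MCVA and CVAPS variants discussed in the paper refine these two decisions.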

  5. Using Link Analysis Technique with a Modified Shortest-Path Algorithm to Fight Money Laundering

    Institute of Scientific and Technical Information of China (English)

    CHEN Yunkai; MAI Quanwen; LU Zhengding

    2006-01-01

    Effective link analysis techniques are needed to help law enforcement and intelligence agencies fight money laundering. This paper presents a link analysis technique that uses a modified shortest-path algorithm to identify the strongest association paths between entities in a money laundering network. Based on the two-tree Dijkstra and Priority-First-Search (PFS) algorithms, a modified algorithm is presented. To apply the algorithm, a network representation transformation is made first.
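A minimal sketch of the underlying idea: if each link carries an association strength in (0, 1], running Dijkstra over edge weights of -log(strength) makes the "shortest" path the one with the largest product of strengths. This is plain Dijkstra, not the paper's two-tree/PFS variant, and the entity names and strengths are hypothetical.

```python
import heapq
import math

def strongest_association_path(links, source, target):
    """Dijkstra over -log(strength): minimizing summed -log(s) maximizes
    the product of link strengths along the path."""
    graph = {}
    for a, b, s in links:                       # undirected association links
        graph.setdefault(a, []).append((b, -math.log(s)))
        graph.setdefault(b, []).append((a, -math.log(s)))
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            break
        if d > dist.get(node, math.inf):        # stale heap entry
            continue
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, math.inf):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(heap, (nd, nxt))
    if target not in dist:
        return None, 0.0
    path, node = [target], target
    while node != source:                       # walk predecessors back
        node = prev[node]
        path.append(node)
    path.reverse()
    return path, math.exp(-dist[target])        # recover the strength product

# hypothetical network: (entity, entity, association strength)
links = [("A", "B", 0.9), ("B", "C", 0.9), ("A", "C", 0.5)]
print(strongest_association_path(links, "A", "C"))
```

Here the indirect path A-B-C (strength 0.9 x 0.9 = 0.81) beats the direct link A-C (0.5), which is precisely the kind of hidden intermediary that link analysis aims to surface.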

  6. Data Analysis Techniques for Resolving Nonlinear Processes in Plasmas : a Review

    OpenAIRE

    de Wit, T. Dudok

    1996-01-01

    The growing need for a better understanding of nonlinear processes in plasma physics has in recent decades stimulated the development of new and more advanced data analysis techniques. This review lists some of the basic properties one may wish to infer from a data set and then presents appropriate analysis techniques with some recent applications. The emphasis is put on the investigation of nonlinear wave phenomena and turbulence in space plasmas.

  7. Pseudo-progression after stereotactic radiotherapy of brain metastases: lesion analysis using MRI cine-loops.

    Science.gov (United States)

    Wiggenraad, Ruud; Bos, Petra; Verbeek-de Kanter, Antoinette; Lycklama À Nijeholt, Geert; van Santvoort, Jan; Taphoorn, Martin; Struikmans, Henk

    2014-09-01

    Stereotactic radiotherapy (SRT) of brain metastases can lead to lesion growth caused by radiation toxicity. The pathophysiology of this so-called pseudo-progression is poorly understood. The purpose of this study was to evaluate the use of MRI cine-loops for describing the consecutive events in this radiation-induced lesion growth. Ten patients were selected from our department's database who had received SRT of brain metastases, had lesion growth caused by pseudo-progression, and had at least five follow-up MRI scans. Pre- and post-SRT MRI scans were co-registered and cine-loops were made using post-gadolinium 3D T1 axial slices. The ten cine-loops were discussed in a joint meeting of the authors. The use of cine-loops was superior to evaluation of separate MRI scans for interpretation of events after SRT. There was a typical lesion evolution pattern in all patients, with varying time course: initially, regression of the metastasis was observed, followed by an enlarging area of new contrast enhancement in the surrounding brain tissue. Analysis of consecutive MRIs using cine-loops may improve understanding of pseudo-progression; it probably represents a radiation effect in the brain tissue surrounding the irradiated metastasis and not enlargement of the metastasis itself.

  8. Techniques and environments for big data analysis parallel, cloud, and grid computing

    CERN Document Server

    Dehuri, Satchidananda; Kim, Euiwhan; Wang, Gi-Name

    2016-01-01

    This volume is aimed at a wide range of readers and researchers in the area of Big Data, presenting recent advances in Big Data analysis as well as the techniques and tools used to perform it. The book includes 10 distinct chapters providing a concise introduction to Big Data analysis and to recent techniques and environments for it. It gives insight into how the expensive fitness evaluation of evolutionary learning can play a vital role in Big Data analysis by adopting parallel, grid, and cloud computing environments.

  9. Publishing nutrition research: a review of multivariate techniques--part 2: analysis of variance.

    Science.gov (United States)

    Harris, Jeffrey E; Sheean, Patricia M; Gleason, Philip M; Bruemmer, Barbara; Boushey, Carol

    2012-01-01

    This article is the eighth in a series exploring the importance of research design, statistical analysis, and epidemiology in nutrition and dietetics research, and the second in a series focused on multivariate statistical analytical techniques. The purpose of this review is to examine the statistical technique analysis of variance (ANOVA), from its simplest to its multivariate applications. Many dietetics practitioners are familiar with basic ANOVA, but less familiar with multivariate applications such as multiway ANOVA, repeated-measures ANOVA, analysis of covariance, multiple ANOVA, and multiple analysis of covariance. The article addresses all these applications and includes hypothetical and real examples from the field of dietetics.
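One-way ANOVA, the simplest case the review starts from, partitions total variability into between-group and within-group sums of squares and compares them via an F statistic. A minimal sketch with hypothetical dietary data (the group values are invented for illustration):

```python
from statistics import mean

def one_way_anova(*groups):
    """One-way ANOVA: return the F statistic and its degrees of freedom
    from the between-group and within-group sums of squares."""
    k = len(groups)                                   # number of groups
    n = sum(len(g) for g in groups)                   # total observations
    grand = mean(x for g in groups for x in g)        # grand mean
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

# hypothetical energy intakes (kcal) for three diet groups
f, df1, df2 = one_way_anova([1800, 1900, 1850],
                            [2100, 2000, 2050],
                            [1700, 1750, 1650])
print(round(f, 2), df1, df2)
```

The resulting F value would be compared against an F distribution with (df1, df2) degrees of freedom to test whether the group means differ; the multivariate extensions in the review generalize this same variance partitioning.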

  10. HTGR accident initiation and progression analysis status report. Volume V. AIPA fission product source terms

    Energy Technology Data Exchange (ETDEWEB)

    Alberstein, D.; Apperson, C.E. Jr.; Hanson, D.L.; Myers, B.F.; Pfeiffer, W.W.

    1976-02-01

    The primary objective of the Accident Initiation and Progression Analysis (AIPA) Program is to provide guidance for high-temperature gas-cooled reactor (HTGR) safety research and development. Among the parameters considered in estimating the uncertainties in site boundary doses are uncertainties in the fission product source terms generated under normal operating conditions, i.e., fuel body inventories, circulating coolant activity, total plateout activity in the primary circuit, and plateout distributions. The present volume documents the analyses of these source term uncertainties. The results are used for the detailed consequence evaluations, and they provide the basis for evaluating the fission products important for HTGR maintenance and shielding.

  11. Multivariate analysis of progressive thermal desorption coupled gas chromatography-mass spectrometry.

    Energy Technology Data Exchange (ETDEWEB)

    Van Benthem, Mark Hilary; Mowry, Curtis Dale; Kotula, Paul Gabriel; Borek, Theodore Thaddeus, III

    2010-09-01

    Thermal decomposition of the polydimethylsiloxane compounds Sylgard® 184 and 186 was examined using thermal desorption coupled gas chromatography-mass spectrometry (TD/GC-MS) and multivariate analysis. This work describes a method of producing multiway data using a stepped thermal desorption. The technique involves sequentially heating a sample of the material of interest with subsequent analysis in a commercial GC/MS system. The decomposition chromatograms were analyzed using multivariate analysis tools including principal component analysis (PCA), factor rotation employing the varimax criterion, and multivariate curve resolution. The results of the analysis show seven components related to offgassing of various fractions of siloxanes that vary as a function of temperature. TD/GC-MS is a powerful analytical technique for analyzing chemical mixtures, with great potential in numerous areas including materials analysis, sports medicine, the detection of designer drugs, and biological research for metabolomics. Data analysis, however, is complicated, far from automated, and can result in high false positive or false negative rates. We have demonstrated a stepwise TD/GC-MS technique that removes more volatile compounds from a sample before extracting the less volatile compounds. This creates an additional dimension of separation before the GC column, while simultaneously generating three-way data. Sandia's proven multivariate analysis methods, when applied to these data, have several advantages over current commercial options and have demonstrated potential for finding and enabling identification of trace compounds. Several challenges remain, however, including understanding the sources of noise in the data, outlier detection, improving the data pretreatment and analysis methods, developing a software tool for ease of use by the chemist, and demonstrating our belief
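PCA, the first multivariate tool listed above, can be sketched for the two-variable case via the closed-form eigendecomposition of the 2x2 covariance matrix. This is an illustrative sketch, not the authors' analysis pipeline, and the intensity values in the example are hypothetical.

```python
import math

def pca_2d(xs, ys):
    """Principal component analysis for two variables: eigenvalues of the
    2x2 covariance matrix and the variance fraction of the first component."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)         # var(x)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)         # var(y)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # eigenvalues of [[sxx, sxy], [sxy, syy]] from trace and determinant
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    root = math.sqrt(max(tr * tr / 4 - det, 0.0))
    l1, l2 = tr / 2 + root, tr / 2 - root
    explained = l1 / (l1 + l2)      # variance fraction of the first component
    return l1, l2, explained

# hypothetical peak intensities of two co-varying ions across four runs
l1, l2, frac = pca_2d([1.0, 2.0, 3.0, 4.0], [2.1, 3.9, 6.2, 7.8])
print(round(frac, 3))
```

When two measured channels carry essentially the same underlying signal, as here, the first component explains nearly all the variance; on real multiway TD/GC-MS data the same idea, applied in many dimensions, is what isolates the temperature-dependent siloxane components.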

  12. A novel no-insulation winding technique of high temperature-superconducting racetrack coil for rotating applications: A progress report in Korea university

    Science.gov (United States)

    Choi, Y. H.; Song, J. B.; Yang, D. G.; Kim, Y. G.; Hahn, S.; Lee, H. G.

    2016-10-01

    This paper presents our recent progress on core technology development for a megawatt-class superconducting wind turbine generator supported by the international collaborative R&D program of the Korea Institute of Energy Technology Evaluation and Planning. To outperform current high-temperature-superconducting (HTS) magnet technology in the wind turbine industry, a novel no-insulation (NI) winding technique was first proposed to develop a second-generation HTS racetrack coil for rotating applications. Here, we briefly report our recent studies on the NI winding technique for GdBCO coated-conductor racetrack coils in the following areas: (1) charging-discharging characteristics of NI GdBCO racetrack coils with respect to external pressures applied to the straight sections; (2) thermal and electrical stability of NI GdBCO racetrack coils encapsulated with various impregnating materials; (3) quench behavior of NI racetrack coils wound with GdBCO conductor possessing various lamination layers; and (4) electromagnetic characteristics of NI GdBCO racetrack coils under time-varying field conditions. Test results confirmed that the novel NI winding technique is highly promising: it could enable the development of a compact, mechanically dense, and self-protecting GdBCO magnet for use in real-world superconducting wind turbine generators.

  13. Improving skill development: an exploratory study comparing a philosophical and an applied ethical analysis technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-09-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one a philosophical and the other an applied ethical analysis technique. The two techniques are applied to an ethically challenging situation involving ICT raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of ICT students and professionals. In particular, the skills focused on include: recognising ethical challenges and formulating coherent responses; distancing oneself from subjective judgements; developing ethical literacy; identifying stakeholders; and communicating the ethical decisions made.

  14. Scanning angle Raman spectroscopy: Investigation of Raman scatter enhancement techniques for chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Matthew W. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    This thesis outlines advancements in Raman scatter enhancement techniques by applying evanescent fields, standing-waves (waveguides) and surface enhancements to increase the generated mean square electric field, which is directly related to the intensity of Raman scattering. These techniques are accomplished by employing scanning angle Raman spectroscopy and surface enhanced Raman spectroscopy. A 1064 nm multichannel Raman spectrometer is discussed for chemical analysis of lignin. Extending dispersive multichannel Raman spectroscopy to 1064 nm reduces the fluorescence interference that can mask the weaker Raman scattering. Overall, these techniques help address the major obstacles in Raman spectroscopy for chemical analysis, which include the inherently weak Raman cross section and susceptibility to fluorescence interference.

  15. Coronary bifurcation lesions treated with double kissing crush technique compared to classical crush technique: serial intravascular ultrasound analysis

    Institute of Scientific and Technical Information of China (English)

    SHAN Shou-jie; YE Fei; LIU Zhi-zhong; TIAN Nai-liang; ZHANG Jun-jie; CHEN Shao-liang

    2013-01-01

    Background The double kissing (DK) crush technique is a modified version of the crush technique. It is specifically designed to increase the success rate of the final kissing balloon post-dilatation, but its efficacy and safety remain unclear. Methods Data were obtained from the DKCRUSH-I trial, a prospective, randomized, multi-center study to evaluate safety and efficacy. Post-procedural and eight-month follow-up intravascular ultrasound (IVUS) analysis was available in 61 cases. Volumetric analysis using Simpson's method within the Taxus stent, and cross-sectional analysis at five sites of the main vessel (MV) and three sites of the side branch (SB), were performed. The impact of the bifurcation angle on stent expansion at the carina was also evaluated. Results Stent expansion in the SB ostium was significantly less in the classical crush group ((53.81±13.51)%) than in the DK crush group ((72.27±11.46)%) (P=0.04). For the MV, the incidence of incomplete crush was 41.9% in the DK group and 70.0% in the classical group (P=0.03). The percentage of neointimal area at the ostium tended to be smaller in the DK group than in the classical group ((16.4±19.2)% vs. (22.8±27.1)%, P=0.06). The optimal threshold of post-procedural minimum stent area (MSA) to predict follow-up minimum lumen area (MLA) <4.0 mm2 at the SB ostium was 4.55 mm2, yielding an area under the curve of 0.80 (95% confidence interval: 0.61 to 0.92). Conclusion Our data suggest that the DK crush technique is associated with improved quality of the final kissing balloon inflation (FKBI) and a smaller optimal cutoff value of post-procedural MSA at the SB ostium.
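    The optimal-cutoff computation used above (a threshold on post-procedural MSA that best predicts follow-up MLA < 4.0 mm2) is a standard ROC-style search, commonly done by maximizing Youden's J. A minimal sketch on synthetic values, not the DKCRUSH-I dataset:

```python
import numpy as np

# Synthetic post-procedural MSA values (mm^2) for two hypothetical outcome
# groups; the numbers are illustrative only.
rng = np.random.default_rng(1)
msa = np.concatenate([rng.normal(3.8, 0.6, 30),   # lesions with follow-up MLA < 4.0 mm^2
                      rng.normal(5.4, 0.8, 31)])  # lesions that stay patent
restenosis = np.concatenate([np.ones(30, bool), np.zeros(31, bool)])

def best_cutoff(values, positive):
    """Return (cutoff, J, sensitivity, specificity) maximizing Youden's J,
    where 'positive' cases are predicted by values at or below the cutoff."""
    best = (None, -1.0, 0.0, 0.0)
    for c in np.unique(values):
        pred = values <= c
        sens = np.mean(pred[positive])      # positives correctly flagged
        spec = np.mean(~pred[~positive])    # negatives correctly passed
        j = sens + spec - 1
        if j > best[1]:
            best = (c, j, sens, spec)
    return best

cutoff, j, sens, spec = best_cutoff(msa, restenosis)
print(f"optimal MSA cutoff ~ {cutoff:.2f} mm^2 (J={j:.2f})")
```

    The full AUC with confidence interval reported in the abstract would come from integrating the whole sensitivity/specificity curve rather than this single best point.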

  16. Progress in the techniques of sulfur recovery processes

    Institute of Scientific and Technical Information of China (English)

    张文革; 黄丽月; 李军

    2011-01-01

    With the issuance of the Integrated Emission Standard of Air Pollutants, China has implemented increasingly strict control of environmental pollution, and many sulfur recovery plants have been constructed in the industries concerned. Reviewing the process technology and capacity status of domestic sulfur recovery plants, this paper introduces and evaluates the sulfur recovery techniques used in different industries, with emphasis on the Claus process, providing a reference for the revamping and expansion of sulfur recovery plants.

  17. Development and application of analytical techniques to chemistry of donor solvent liquefaction. Quarterly progress report, April 1980-June 1980

    Energy Technology Data Exchange (ETDEWEB)

    Dorn, H. C.; Taylor, L. T.

    1980-10-01

    In this report we focus on our continuing efforts to develop new fluorine reagents useful for characterizing heteroatom-containing (e.g., O, N, and S) organic compounds present in coal and/or other fuel fractions. Although the utility of trifluoroacetyl chloride and trifluorodiazoethane as fluorine tagging reagents for studies of this type has been previously discussed, we describe further techniques utilizing these reagents which are under present study. The advantage of the fluorine tagging approach is the high sensitivity of the ¹⁹F nuclide to NMR detection and the high sensitivity of the ¹⁹F chemical shift parameter to subtle changes in molecular structure. Recently, workers at Exxon have reported the use of a silicon tagging reagent utilizing ²⁹Si NMR to characterize various heteroatom-containing organic moieties present in fuel samples. Silicon-containing reagents share some of the advantages of fluorine reagents: the low levels of either fluorine or silicon in most samples of interest avoid the obvious spectral background problems associated with hydrogen or carbon tagging reagents. Furthermore, like ¹⁹F, ²⁹Si chemical shifts would be expected to be sensitive to subtle changes in molecular structure; however, ²⁹Si suffers from low NMR sensitivity.

  18. Clinical Significance of Optic Disc Progression by Topographic Change Analysis Maps in Glaucoma: An 8-Year Follow-Up Study

    Directory of Open Access Journals (Sweden)

    D. Kourkoutas

    2014-01-01

    Full Text Available Aim. To investigate the ability of the Heidelberg Retina Tomograph (HRT3) Topographic Change Analysis (TCA) map to predict the subsequent development of clinical change in patients with glaucoma. Materials. 61 eyes of 61 patients, defined by a retrospective review as stable on optic nerve head (ONH) stereophotographs and visual field (VF), were enrolled in a prospective study. Eyes were classified as TCA-stable or TCA-progressed based on the TCA map. All patients underwent HRT3, VF, and ONH stereophotography at 9–12 month intervals. Clinical glaucoma progression was determined by masked assessment of ONH stereophotographs and VF Guided Progression Analysis. Results. The median (IQR) total HRT follow-up period was 8.1 (7.3, 9.1) years, which included a median retrospective and prospective follow-up time of 3.9 (3.1, 5.0) and 4.0 (3.5, 4.7) years, respectively. In the TCA-stable eyes, VF and/or photographic progression occurred in 5/13 (38.4%) eyes, compared to 11/48 (22.9%) of the TCA-progressed eyes. There was no statistically significant association between TCA progression and clinically relevant (photographic and/or VF) progression (hazard ratio, 1.18; P=0.762). The observed median time to clinical progression from enrollment was significantly shorter in the TCA-progressed group than in the TCA-stable group (P=0.04). Conclusion. Our results indicate that the commercially available TCA progression criteria do not adequately predict subsequent photographic and/or VF progression.

  19. Message Structures: a modelling technique for information systems analysis and design

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2011-01-01

    Despite the increasing maturity of model-driven software development (MDD), some research challenges remain open in the field of information systems (IS). For instance, there is a need to improve modelling techniques so that they cover several development stages in an integrated way and facilitate the transition from analysis to design. This paper presents Message Structures, a technique for the specification of communicative interactions between the IS and organisational actors. The technique can be used both in the analysis stage and in the design stage. During analysis, it allows abstracting from the technology that will support the IS, and complements business process diagramming techniques with the specification of the communicational needs of the organisation. During design, Message Structures serves two purposes: (i) it allows a specification of the IS memory (e.g. a UML class diagram) to be derived systematically, and (ii) it supports reasoning about the user interface design using abstract patterns. Thi...

  20. [Research progress and application prospect of near infrared spectroscopy in soil nutrition analysis].

    Science.gov (United States)

    Ding, Hai-quan; Lu, Qi-peng

    2012-01-01

    "Digital agriculture" or "precision agriculture" is an important direction of modern agriculture technique. It is the combination of the modern information technique and traditional agriculture and becomes a hotspot field in international agriculture research in recent years. As a nondestructive, real-time, effective and exact analysis technique, near infrared spectroscopy, by which precision agriculture could be carried out, has vast prospect in agrology and gradually gained the recognition. The present paper intends to review the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should based on portable NIR spectrograph in order to acquire qualitative or quantitative information from real-time measuring in field. In addition, NIRS could be combined with space remote sensing to macroscopically control the way crop is growing and the nutrition crops need, to change the current state of our country's agriculture radically.

  1. Technical progress and energy substitutions in the transport sector

    Energy Technology Data Exchange (ETDEWEB)

    Florane, Philippe

    2002-11-15

    Alternative motorization technologies have been proposed in order to achieve energy diversification and a reduction in pollutant emissions. Fuel cell vehicles are, among others, at the centre of research carried out by car manufacturers and oil companies. The use of fuel cell vehicles could contribute, first, to a less stringent long-term energy dependence of oil-importing countries and, second, to pollutant reduction in the transport sector. First of all, we propose a definition of 'innovation' and its treatment in the frame of mainstream economic theories. Then we proceed to a retrospective analysis of the diesel motorization of the car market. In the second part of our work, we conduct a survey among French households aiming to obtain up-to-date information about their degree of acceptance of fuel cell technology. We aim to highlight the determining factors of fuel cell vehicle adoption by consumers. For this, we set up a discrete choice model linking the individual decision to the whole group of technical or socio-economic factors and characteristics. Finally, we develop scenarios of fuel cell equipment of passenger cars which differ according to type of vehicle and possible purchase assistance. These scenarios lead us to the analysis of long-term fuel cell vehicle development on the French car market. (author)

  2. Progress in dehydration techniques of bischofite

    Institute of Scientific and Technical Information of China (English)

    王芹; 郭亚飞; 王士强; 邓天龙

    2011-01-01

    Magnesium compounds, metallic magnesium, and its alloys have great industrial application value. There is abundant magnesium chloride in the ocean and in salt lake brines in China. In particular, a large amount of bischofite (MgCl2 · 6H2O) is produced when the mother brine is continuously evaporated after potassium is separated from salt lake brine. Although the anhydrous magnesium chloride obtained by dehydrating bischofite is the best raw material for magnesium electrolysis, many technical barriers remain in the dehydration of bischofite, which has restricted the extensive use of bischofite resources. The present development status of bischofite dehydration techniques, including the protective-gas, double-salt, and ammonia-complexation methods, is reviewed, and the considerable utilization prospects of magnesium resources in China's salt lake brines are pointed out.

  3. [Chronic progressive external ophthalmoplegia with mitochondrial anomalies. Clinical, histological, biochemical and genetic analysis (9 cases)].

    Science.gov (United States)

    Drouet, A

    1996-01-01

    We report the clinical signs and histological findings in nine patients (four males, five females, aged 47 to 82 years) with mitochondrial ocular myopathies. Ptosis, most often asymmetrical, was present in all cases of chronic progressive external ophthalmoplegia (CPEO), but limb muscle weakness was unusual. The prognosis in this group was good, but ubidecarenone (150 mg/d), used in two cases, did not improve the ophthalmoplegia. Serum creatine kinase was normal in eight of nine cases, and electromyography showed myopathic changes in three cases. Histoenzymatic analysis of the muscle biopsy and biochemical studies of mitochondria isolated from the muscle sample demonstrated mitochondrial myopathy associated with partial deficiency of complexes I and/or IV of the electron transfer chain. Of seven patients studied by Southern blot analysis, one had a single deletion in a heteroplasmic state and another an A-to-G transition at position 3243 within the mitochondrial tRNA-Leu (UUR) gene. Chronic progressive external ophthalmoplegia without a large deletion may involve abnormalities in other coding regions of mtDNA such as tRNA, rRNA, or protein genes.

  4. Comparison of various procedures for progressive collapse analysis of cable-stayed bridges

    Institute of Scientific and Technical Information of China (English)

    Jian-guo CAI; Yi-xiang XU; Li-ping ZHUANG; Jian FENG; Jin ZHANG

    2012-01-01

    The alternate path (AP) method is the most widely used method for progressive collapse analysis, and its application to frame structures is well established. However, its application to other structures, especially cable-stayed structures, needs further development. Four analytical procedures, i.e., linear static, nonlinear static, linear dynamic, and nonlinear dynamic, were first improved by taking the initial state into account. Then a cable-stayed structure was studied using the four improved methods, and the losses of one cable and of two cables were discussed. The results show that for static and dynamic analyses of cable-stayed bridges, there is a large difference between results obtained from simulations starting with a deformed versus a non-deformed configuration at the time of cable loss. The static results are conservative in the vicinity of the ruptured cable, but they cannot capture the dynamic effect of the cable loss in areas farther away from the lost cable. Moreover, a dynamic amplification factor of 2.0 is found to be a good estimate for static analysis procedures, since linear static and linear dynamic procedures yield approximately the same maximum vertical deflection. The comprehensive evaluation of cable failure shows that the tendency toward progressive failure of cable-stayed bridges decreases as the location of the failed cables moves closer to the pylon.

  5. Developments in sanitary techniques 2011-2012. Important progress through studies in 2011

    Energy Technology Data Exchange (ETDEWEB)

    Scheffer, W.

    2011-12-15

    In 2011, new laws and regulations were the main theme in sanitary techniques (ST). Reference libraries have been updated in the areas of tap water installations and building sewer systems. Important progress was made in the framework of several ST preliminary studies conducted by TVVL and Uneto-VNI. Still, the start-up of new ST studies and projects in 2012 is lagging behind previous years.

  6. Progress and prospects of fiber grating thermal tuning techniques

    Institute of Scientific and Technical Information of China (English)

    张颖; 吴艳微; 李刚; 李伟; 李红杰; 陈佳妹

    2012-01-01

    The basic principle of fiber grating thermal tuning is presented. Recent developments in fiber grating thermal tuning methods based on semiconductor coolers/heaters, resistance-wire (coil) heaters, double-shoulder beam structures, and thin-film-coated fiber gratings are analyzed and summarized, and the future development of fiber grating thermal tuning techniques is discussed.

  7. Research Progress in Germplasm and Cultivation Techniques of Canna edulis

    Institute of Scientific and Technical Information of China (English)

    欧珍贵; 周正邦; 周明强

    2012-01-01

    In order to promote the development of the Canna edulis industry, systematic studies on its cultivation have been carried out both in China and abroad. Research progress in the germplasm resources, nutritional components and uses, biological characteristics, and cultivation techniques of C. edulis is reviewed, and prospects for future research are put forward.

  8. Progressive failure analysis of composite structure based on micro- and macro-mechanics models

    Institute of Scientific and Technical Information of China (English)

    孙志刚; 阮绍明; 陈磊; 宋迎东

    2015-01-01

    Based on a parameter design language, a program for progressive failure analysis of composite structures is proposed. In this program, the relationship between macro- and micro-mechanics is established, and the macro stress distribution of the composite structure is calculated by commercial finite element software. According to the macro stress, the damaged point is found, and the micro stress distribution of the representative volume element is calculated by finite-volume direct averaging micromechanics (FVDAM). Compared with results calculated by failure criteria based on the macro stress field (the maximum stress criterion and the Hashin criteria) and the micro stress field (Huang's model), the failure analysis based on combined macro- and micro-mechanics models proves feasible and efficient.
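    To make the failure-criterion step concrete, here is a minimal sketch of a maximum-stress ply-failure check driving a progressive load ramp. It is illustrative only, not the paper's FVDAM implementation; the strength values are generic carbon/epoxy-like numbers in MPa.

```python
# Illustrative ply strengths (MPa): tensile/compressive allowables in the
# fiber (X) and transverse (Y) directions, plus in-plane shear (S).
STRENGTHS = {"Xt": 1500.0, "Xc": 1200.0, "Yt": 50.0, "Yc": 200.0, "S": 70.0}

def max_stress_failed(s11, s22, s12, st=STRENGTHS):
    """True if any stress component exceeds its allowable (max-stress criterion)."""
    if s11 >= 0:
        if s11 > st["Xt"]:
            return True
    elif -s11 > st["Xc"]:
        return True
    if s22 >= 0:
        if s22 > st["Yt"]:
            return True
    elif -s22 > st["Yc"]:
        return True
    return abs(s12) > st["S"]

# Progressive loading: scale a unit stress state until first ply failure.
# In a full progressive-failure analysis the stiffness of the failed point
# would then be degraded and the stress field recomputed.
unit = (900.0, 30.0, 20.0)   # s11, s22, s12 at unit load factor
lam = 1.0
while not max_stress_failed(*(lam * c for c in unit)):
    lam += 0.01
print(f"first failure at load factor ~ {lam:.2f}")
```

    With these numbers the transverse (matrix) direction governs, which mirrors the usual finding that matrix cracking precedes fiber failure in such analyses.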

  9. Hull girder progressive collapse analysis using IACS prescribed and NLFEM derived load - end shortening curves

    Directory of Open Access Journals (Sweden)

    Stanislav Kitarović

    2016-06-01

    This paper considers the hull girder ultimate strength of a bulk carrier at its midship section, as determined by the incremental-iterative progressive collapse analysis method prescribed by the International Association of Classification Societies Common Structural Rules for Bulk Carriers. In addition to the originally prescribed load - end shortening curves, curves determined by nonlinear finite element method analysis (considering the influence of idealized initial geometrical imperfections) are also considered. Results obtained by both sets of curves are compared and discussed at both the local (structural component load - end shortening curve) and global (hull girder ultimate bending capacity and collapse sequence) level, for both sagging and hogging cases.

  10. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
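    The experimental pattern described above (reduce dimensionality, then track a simple classifier's accuracy as the number of retained dimensions grows) can be sketched as follows. This is an illustration on synthetic data using PCA and a nearest-centroid classifier, not the paper's code or data.

```python
import numpy as np

# Synthetic two-class data: a few informative latent directions embedded in 30-D.
rng = np.random.default_rng(0)
n_per, n_feat = 200, 30
W = rng.standard_normal((5, n_feat))
a = rng.standard_normal((n_per, 5)) @ W + rng.standard_normal((n_per, n_feat))
b = (rng.standard_normal((n_per, 5)) + 1.5) @ W + rng.standard_normal((n_per, n_feat))
X = np.vstack([a, b])
y = np.repeat([0, 1], n_per)

# Train/test split, then PCA directions from the training set only
idx = rng.permutation(2 * n_per)
tr, te = idx[:300], idx[300:]
mu = X[tr].mean(axis=0)
_, _, Vt = np.linalg.svd(X[tr] - mu, full_matrices=False)

def accuracy(k):
    """Nearest-centroid accuracy after projecting onto the top-k PCA directions."""
    P = Vt[:k].T
    Ztr, Zte = (X[tr] - mu) @ P, (X[te] - mu) @ P
    cents = np.stack([Ztr[y[tr] == c].mean(axis=0) for c in (0, 1)])
    pred = np.argmin(((Zte[:, None, :] - cents) ** 2).sum(-1), axis=1)
    return np.mean(pred == y[te])

for k in (1, 2, 5, 10, 20):
    print(f"{k:2d} dims -> accuracy {accuracy(k):.3f}")
```

    Swapping the projection for sorted covariance or a correlation-based feature subset, while keeping the same classifier, reproduces the kind of comparison the paper describes.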

  11. Auditing Information Structures in Organizations: A Review of Data Collection Techniques for Network Analysis

    NARCIS (Netherlands)

    Zwijze-Koning, Karen H.; Jong, de Menno D.T.

    2005-01-01

    Network analysis is one of the current techniques for investigating organizational communication. Despite the amount of how-to literature about using network analysis to assess information flows and relationships in organizations, little is known about the methodological strengths and weaknesses of

  12. Qualitative and quantitative analysis of lignocellulosic biomass using infrared techniques: A mini-review

    Science.gov (United States)

    Current wet chemical methods for biomass composition analysis using two-step sulfuric acid hydrolysis are time-consuming, labor-intensive, and unable to provide structural information about biomass. Infrared techniques provide fast, low-cost analysis, are non-destructive, and have shown promising results.

  13. Use of fuzzy techniques for analysis of dynamic loads in power systems

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper presents the use of fuzzy techniques for the analysis of dynamic load characteristics of power systems, to identify the voltage stability (collapse) of a weak bus, and concludes from the consistent results obtained that this is a useful tool for the analysis of load characteristics of sophisticated power systems and their components.

  14. Automation of the verneuil technique on the basis of a stability analysis

    Science.gov (United States)

    Borodin, V. A.; Brener, E. A.; Tatarchenko, V. A.; Gusev, V. I.; Tsigler, I. N.

    1981-04-01

    This paper presents a stability analysis for the crystallization of large-size crystals grown by Verneuil techniques with variable powder feed rate. The laws of powder feed regulation are found, ensuring automatic maintenance of constant cross section of the growing crystal. Experimental verification of the results of the theoretical analysis is obtained.

  15. Meta-analysis of gene expression signatures defining the epithelial to mesenchymal transition during cancer progression.

    Directory of Open Access Journals (Sweden)

    Christian J Gröger

    The epithelial to mesenchymal transition (EMT) represents a crucial event during cancer progression and dissemination. EMT is the conversion of carcinoma cells from an epithelial to a mesenchymal phenotype, which is associated with higher cell motility as well as enhanced chemoresistance and cancer stemness. Notably, EMT has been increasingly recognized as an early event of metastasis. Numerous gene expression studies (GES) have been conducted to obtain transcriptome signatures and marker genes to understand the regulatory mechanisms underlying EMT. Yet, no meta-analysis considering the multitude of GES of EMT had been performed to comprehensively elaborate the core genes in this process. Here we report a meta-analysis of 18 independent and published GES of EMT which focused on different cell types and treatment modalities. Computational analysis revealed clustering of GES according to the type of treatment rather than to cell type. GES of EMT induced via transforming growth factor-β and tumor necrosis factor-α treatment yielded uniformly defined clusters, while GES of models with alternative EMT induction clustered in a more complex fashion. In addition, we identified the up- and downregulated genes that were shared between the multitude of GES. This core gene list includes well-known EMT markers as well as novel genes so far not described in this process. Furthermore, several genes of the EMT core gene list correlated significantly with impaired pathological complete response in breast cancer patients. In conclusion, this meta-analysis provides a comprehensive survey of available EMT expression signatures and fundamental insights into the mechanisms governing carcinoma progression.
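    The core-gene step described above (keeping genes shared across most signatures) reduces to a simple set-counting operation. A minimal sketch with made-up per-study gene sets, not the study's actual signatures:

```python
from collections import Counter

# Hypothetical upregulated-gene sets from four EMT expression studies;
# the gene symbols are common EMT markers used purely for illustration.
up_signatures = [
    {"VIM", "FN1", "SNAI1", "ZEB1", "CDH2"},
    {"VIM", "FN1", "ZEB1", "MMP2"},
    {"VIM", "SNAI1", "ZEB1", "CDH2", "SPARC"},
    {"VIM", "FN1", "TWIST1", "ZEB1"},
]

def core_genes(signatures, min_fraction=0.75):
    """Genes present in at least min_fraction of the input signatures."""
    counts = Counter(g for sig in signatures for g in sig)
    need = min_fraction * len(signatures)
    return {g for g, n in counts.items() if n >= need}

core = core_genes(up_signatures)
print(sorted(core))   # genes recurring across most signatures
```

    The same counting is run separately for downregulated genes; the recurrence threshold trades off core-list stringency against sensitivity to study-specific noise.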

  16. Research Progress on Rapid Detection Techniques for Microorganisms in Raw Milk

    Institute of Scientific and Technical Information of China (English)

    孔丽娜; 李祖明; 吴聪明; 许文涛

    2013-01-01

    With the rapid development of the dairy industry, studying and establishing rapid detection techniques for microorganisms in raw milk, to strengthen the monitoring of milk hygiene and safety, has received increasing attention in many countries. This paper summarizes the principles, characteristics, and research progress of rapid detection techniques for microorganisms in raw milk, including conventional PCR, real-time fluorescent quantitative PCR, PCR-DGGE, gene chips, ELISA, electrochemical impedance, ATP bioluminescence, flow cytometry, reduction tests, and automated microbial detection systems. Finally, the broad prospects of rapid detection techniques for microorganisms in raw milk are forecast.

  17. Acoustic Emission Analysis of Damage Progression in Thermal Barrier Coatings Under Thermal Cyclic Conditions

    Science.gov (United States)

    Appleby, Matthew; Zhu, Dongming; Morscher, Gregory

    2015-01-01

    Damage evolution of electron beam-physical vapor deposited (EB-PVD) ZrO2-7 wt.% Y2O3 thermal barrier coatings (TBCs) under thermal cyclic conditions was monitored using an acoustic emission (AE) technique. The coatings were heated using a laser heat flux technique that yields high reproducibility in thermal loading. Along with AE, real-time thermal conductivity measurements were taken using infrared thermography. Tests were performed on samples with induced stress concentrations, as well as with calcium-magnesium-alumino-silicate (CMAS) exposure, for comparison of damage mechanisms and AE response against the baseline (as-produced) coating. Analysis of acoustic waveforms was used to investigate damage development by comparing when events occurred and their frequency, energy content, and location. The test results show that AE accumulation correlates well with thermal conductivity changes and that AE waveform analysis can be a valuable tool for monitoring coating degradation and providing insight into specific damage mechanisms.

  18. Salient Feature Identification and Analysis using Kernel-Based Classification Techniques for Synthetic Aperture Radar Automatic Target Recognition

    Science.gov (United States)

    2014-03-27

    Salient Feature Identification and Analysis Using Kernel-Based Classification Techniques for Synthetic Aperture Radar Automatic Target Recognition (thesis)

  19. A new chromosome fluorescence banding technique combining DAPI staining with image analysis in plants.

    Science.gov (United States)

    Liu, Jing Yu; She, Chao Wen; Hu, Zhong Li; Xiong, Zhi Yong; Liu, Li Hua; Song, Yun Chun

    2004-08-01

    In this study, a new chromosome fluorescence banding technique was developed for plants. The technique combines 4',6-diamidino-2-phenylindole (DAPI) staining with software analysis, including three-dimensional imaging after deconvolution. Clear multiple and adjacent DAPI bands, similar to G-bands, were obtained by this technique in the tested species Hordeum vulgare L., Oryza officinalis Wall. & Watt, Triticum aestivum L., Lilium brownii Brown, and Vicia faba L. At mitotic metaphase, the numbers of bands for the haploid genomes of these species were about 185, 141, 309, 456, and 194, respectively. Reproducibility analysis demonstrated that banding patterns within a species were stable at the same mitotic stage and could be used for identifying specific chromosomes and chromosome regions. The band number fluctuated: the earlier the mitotic stage, the greater the number of bands. The technique enables genes to be mapped onto specific band regions of the chromosomes with only one fluorescence in situ hybridisation (FISH) step and no chemical banding treatments. In this study, the 45S and 5S rDNAs of some tested species were located on specific band regions of specific chromosomes, and all were positioned at interbands. Because no chemical banding treatment is used, the banding patterns displayed by the technique should reflect the natural conformational features of chromatin. It can therefore be expected that this technique is suitable for all eukaryotes and will have widespread utility in chromosome structure analysis and the physical mapping of genes.

  20. Application of a sensitivity analysis technique to high-order digital flight control systems

    Science.gov (United States)

    Paduano, James D.; Downing, David R.

    1987-01-01

    A sensitivity analysis technique for multiloop flight control systems is studied. This technique uses the scaled singular values of the return difference matrix as a measure of the relative stability of a control system. It then uses the gradients of these singular values with respect to system and controller parameters to judge sensitivity. The sensitivity analysis technique is first reviewed; then it is extended to include digital systems, through the derivation of singular-value gradient equations. Gradients with respect to parameters which do not appear explicitly as control-system matrix elements are also derived, so that high-order systems can be studied. A complete review of the integrated technique is given by way of a simple example: the inverted pendulum problem. The technique is then demonstrated on the X-29 control laws. Results show that linear models of real systems can be analyzed by this sensitivity technique, if it is applied with care. A computer program called SVA was written to implement the singular-value sensitivity analysis technique; thus computational methods and considerations form an integral part of many of the discussions. A user's guide to the program is included. SVA is a fully public-domain program, running on the NASA/Dryden Elxsi computer.
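    The core computation described here, the minimum singular value of the return difference matrix and its gradient with respect to a controller parameter, can be sketched numerically. This is a minimal illustration, not the SVA program: the function names, the scalar-parameter loop-gain model `L(p) = p * A`, and the matrix `A` are all hypothetical, and the gradient is approximated by central differences rather than the report's analytic gradient equations.

```python
import numpy as np

def return_difference_svals(L):
    """Singular values of the return difference matrix I + L."""
    return np.linalg.svd(np.eye(L.shape[0]) + L, compute_uv=False)

def sval_sensitivity(loop_gain, p0, eps=1e-6):
    """Central-difference gradient of the minimum singular value of
    I + L(p) with respect to a scalar controller parameter p."""
    s_hi = return_difference_svals(loop_gain(p0 + eps)).min()
    s_lo = return_difference_svals(loop_gain(p0 - eps)).min()
    return (s_hi - s_lo) / (2.0 * eps)

# Toy loop-gain model L(p) = p * A for a fixed 2x2 matrix (hypothetical).
A = np.array([[0.5, 0.1],
              [0.0, 0.8]])
grad = sval_sensitivity(lambda p: p * A, p0=1.0)
```

A positive gradient indicates that increasing the parameter raises the stability margin measured by the minimum singular value; the abstract's technique tracks exactly this kind of quantity across system and controller parameters.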

  1. Research progress and application of EOR techniques in SINOPEC

    Institute of Scientific and Technical Information of China (English)

    计秉玉; 王友启; 聂俊; 张莉; 于洪敏; 何应付

    2016-01-01

    This paper introduces the research progress and application of EOR techniques in SINOPEC, including chemical flooding, heavy oil thermal recovery, gas flooding and microbial flooding; summarizes the applicable conditions, application results and existing problems of each technique; and discusses the direction of large-scale application of EOR techniques in SINOPEC. The results show that chemical flooding has become the predominant EOR technology in SINOPEC. Polymer flooding and SP flooding are mature, with well-developed supporting technologies, and have entered the phase of industrial application in China. A post-polymer-flooding heterogeneous composite flooding test has been carried out successfully in the 1st Block of central Gudao, Shengli oilfield, with an expected incremental oil recovery of 7.3%. Among heavy oil thermal recovery techniques, thermochemical huff and puff, well pattern infilling and thermal recovery of conventional heavy oil have seen large-scale field application, while steam injection and thermochemical steam injection techniques are still under research. SINOPEC is currently carrying out pilots of gas flooding and microbial flooding, which are expected to further enhance oil recovery. The future focuses of SINOPEC's industrial application of EOR techniques are displacing agents and displacing systems adapted to harsh reservoir conditions, mobility control techniques, and the combined application of mature technologies.

  2. The “Iron Screen”: Progressive Development of an Interpretative Analysis of Body Iron Status

    OpenAIRE

    Beck, J. Robert; Turpin, Edward H.; Rawnsley, Howard M

    1982-01-01

    The Iron Screen, a sequential Bayesian decision-making model for the assessment of body iron stores in man, is a special example of the blending of computer support, decision analysis, and multivariate statistical techniques in medicine. This model, originally developed and tested in patients with chronic diseases and a question of iron deficiency anemia, has recently been broadened to include all patients for whom body iron stores must be determined. In this report we describe operational ex...

  3. Exploratory analysis of osteoarthritis progression among medication users: data from the Osteoarthritis Initiative

    Science.gov (United States)

    Driban, Jeffrey B.; Lo, Grace H.; Eaton, Charles B.; Lapane, Kate L.; Nevitt, Michael; Harvey, William F.; McCulloch, Charles E.; McAlindon, Timothy E.

    2016-01-01

    Background: We conducted an exploratory analysis of osteoarthritis progression among medication users in the Osteoarthritis Initiative to identify interventions or pathways that may be associated with disease modification and therefore of interest for future clinical trials. Methods: We used participants from the Osteoarthritis Initiative with annual medication inventory data between the baseline and 36-month follow-up visit (n = 2938). Consistent medication users were defined for each medication classification as a participant reporting at all four annual visits that they were regularly using an oral prescription medication at the time of the visit. The exploratory analysis focused on medication classes with 40 or more users. The primary outcome measures were medial tibiofemoral joint space width change and the Western Ontario and McMaster Universities Arthritis Index (WOMAC) knee pain score change (12–36-month visits). Within each knee, we explored eight comparisons between users and matched or unmatched nonusers (defined two ways). An effect size of each comparison was calculated. Medication classes had potential signals if (a) both knees had less progression among users compared with nonusers, or (b) there was less progression based on structure and symptoms in one knee. Results: We screened 28 medication classes. Six medication classes had signals for fewer structural changes and better knee pain changes: alpha-adrenergic blockers, antilipemic (excluding statins and fibric acid), anticoagulants, selective serotonin reuptake inhibitors, antihistamines, and antineoplastic agents. Four medication classes had signals for structural changes alone: anti-estrogen (median effect size = 0.28; range = −0.41–0.64), angiotensin-converting enzyme inhibitors (median effect size = 0.13; range = −0.08–0.28), beta-adrenergic blockers (median effect size = 0.09; range = 0.01–0.30), and thyroid agents (median effect size = 0.04; range = −0.05–0.14). Thiazide
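    The abstract reports a single effect size per user/nonuser comparison without naming the formula. A standardized mean difference with a pooled standard deviation (Cohen's d) is one plausible choice and can be sketched as follows; the function name and the joint-space-width numbers are hypothetical, chosen only to illustrate the direction convention (positive d meaning less structural progression among users).

```python
import numpy as np

def cohens_d(users, nonusers):
    """Standardized mean difference with a pooled standard deviation,
    one plausible effect-size measure for user/nonuser comparisons."""
    users = np.asarray(users, dtype=float)
    nonusers = np.asarray(nonusers, dtype=float)
    n1, n2 = len(users), len(nonusers)
    pooled_var = ((n1 - 1) * users.var(ddof=1) +
                  (n2 - 1) * nonusers.var(ddof=1)) / (n1 + n2 - 2)
    return (users.mean() - nonusers.mean()) / np.sqrt(pooled_var)

# Hypothetical 12-36 month joint space width changes in mm
# (values closer to zero indicate less structural progression).
users_jsw = [-0.10, -0.05, -0.20, -0.08, -0.12]
nonusers_jsw = [-0.30, -0.25, -0.40, -0.20, -0.35]
d = cohens_d(users_jsw, nonusers_jsw)  # positive: users progressed less
```

With eight such comparisons per medication class, the study then summarizes the median and range of the effect sizes, as quoted for the anti-estrogen and beta-blocker classes above.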

  4. Development of advanced in situ techniques for chemistry monitoring and corrosion mitigation in SCWO environments. 1997 annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Meng, Z.; Zhou, X.Y.; Lvov, S.N.; Macdonald, D.D.

    1997-10-01

    This report evaluates the first year's results of the research on the development of advanced electrochemical sensors for use in high subcritical and supercritical aqueous environments. The work has emphasized the design of an advanced reference electrode, and the development of high-temperature pH and redox sensors for characterizing the fundamental properties of supercritical aqueous solutions. Also, electrochemical noise sensors have been designed for characterizing metal/water interactions, including corrosion processes. A test loop has been designed and constructed to meet the expected operating conditions. The authors have also developed an approach to define a practical pH scale for use with supercritical aqueous systems, and an operational electrochemical thermocell was tested for pH measurements in HCl + NaCl aqueous solutions. The potentials of the thermocell for several HCl(aq) solutions of different concentrations have been measured over wide ranges of temperature from 25 to 400 C and for flow rates from 0.1 to 1.5 cm min{sup -1}. The corresponding pH differences ({Delta}pH) for two HCl(aq) concentrations in 0.1 NaCl(aq) solution have been experimentally derived and thermodynamically analyzed. These first experimental measurements, and subsequent theoretical analysis, clearly demonstrate the viability of pH measurements in high subcritical and supercritical aqueous solutions with a high accuracy of ±0.02 to 0.05 units.

  5. Progress report on neutron activation analysis at Dalat Nuclear Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Tuan, Nguyen Ngoc [Nuclear Research Institute, Dalat (Viet Nam)

    2003-03-01

    Neutron Activation Analysis (NAA) is one of the most powerful techniques for simultaneous multi-element analysis. This technique has been studied and applied to analyze major, minor and trace elements in geological, biological and environmental samples at the Dalat Nuclear Research Reactor. At the sixth Workshop, February 8-11, 1999, Yogyakarta, Indonesia, we presented a report on the current status of neutron activation analysis using the Dalat Nuclear Research Reactor. Another report on neutron activation analysis at the Dalat Nuclear Research Reactor was also presented at the seventh Workshop in Taejon, Korea, November 20-24, 2000. In this report, we present the results obtained from the application of NAA at NRI over one year, as follows: (1) Determination of the concentrations of noble, rare earth, uranium, thorium and other elements in geological samples according to the requirements of clients, particularly geologists who want to locate mineral resources. (2) Analysis of the concentrations of radionuclides and nutrient elements in foodstuffs for the program on the Asian Reference Man. (3) Evaluation of the contents of trace elements in crude oil and basement rock samples to determine the original source of the oil. (4) Determination of the elemental composition of airborne particles in Ho Chi Minh City for studying air pollution. The analytical data for standard reference materials, toxic elements and natural radionuclides in seawater are also presented. (author)

  6. Technique for continuous high-resolution analysis of trace substances in firn and ice cores

    Energy Technology Data Exchange (ETDEWEB)

    Roethlisberger, R.; Bigler, M.; Hutterli, M.; Sommer, S.; Stauffer, B.; Junghans, H.G.; Wagenbach, D.

    2000-01-15

    The very successful application of a CFA (continuous flow analysis) system in the GRIP project (Greenland Ice Core Project) for high-resolution ammonium, calcium, hydrogen peroxide, and formaldehyde measurements along a deep ice core led to further development of this analysis technique. The authors added methods for the continuous analysis of sodium, nitrate, sulfate, and electrolytic conductivity, while the existing methods have been improved. The melting device has been optimized to allow the simultaneous analysis of eight components. Furthermore, a new melter was developed for analyzing firn cores. The system has been used within the framework of the European Project for Ice Coring in Antarctica (EPICA) for in-situ analysis of several firn cores from Dronning Maud Land, Antarctica, and for the new ice core drilled at Dome C, Antarctica.

  7. Artificial intelligence techniques used in respiratory sound analysis--a systematic review.

    Science.gov (United States)

    Palaniappan, Rajkumar; Sundaraj, Kenneth; Sundaraj, Sebastian

    2014-02-01

    Artificial intelligence (AI) has recently been established as an alternative to many conventional methods. The implementation of AI techniques for respiratory sound analysis can assist medical professionals in the diagnosis of lung pathologies. This article highlights the importance of AI techniques in the implementation of computer-based respiratory sound analysis. Articles on computer-based respiratory sound analysis using AI techniques were identified through searches of various electronic resources, such as the IEEE, Springer, Elsevier, PubMed, and ACM digital library databases. Brief descriptions of the types of respiratory sounds and their respective characteristics are provided. We then analyzed each of the previous studies to determine the specific respiratory sounds/pathology analyzed, the number of subjects, the signal processing method used, the AI techniques used, and the performance of the AI technique in the analysis of respiratory sounds. A detailed description of each of these studies is provided. In conclusion, this article provides recommendations for further advancements in respiratory sound analysis.

  8. The Delphi Technique in nursing research - Part 3: Data Analysis and Reporting

    Directory of Open Access Journals (Sweden)

    Dimitrios Kosmidis

    2013-07-01

    Full Text Available The Delphi technique is a research method with a multitude of literature regarding its application, yet there is limited guidance on methods of analysis and the presentation of results. Aim: To describe and critically analyze the main methods of qualitative and quantitative data analysis in studies using the Delphi technique. Materials and methods: The literature search included research and review articles of nursing interest within the following databases: IATROTEK, Medline, Cinahl and Scopus, from 2001 to 2011. The key words used were Delphi technique, nursing and research methodology, in English and Greek. Results: The literature search revealed 285 articles of nursing interest (266 research articles and 19 reviews). Data analysis in surveys using the Delphi technique initially involves a qualitative analysis of the experts' views gathered during the first round. Subsequently, various statistical analysis methods are employed to estimate the final level of consensus in the subsequent rounds (iterations). Prescribing the desired degree of consensus is usually based on subjective assumptions, while the final assessment is made mainly via descriptive and inductive statistical measures. In the presentation of results, simple tables based on descriptive data and statistical criteria, or scatter charts, are mainly used to illustrate the experts' opinions. Conclusions: The Delphi technique has infiltrated nursing research with great variability in data analysis methodology and presentation of results, depending on each study's aims and characteristics.
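    The descriptive consensus measures mentioned here (agreement percentages, medians, interquartile ranges) can be sketched for a single Delphi item. The agreement cutoff of 4 on a 5-point Likert scale and the 75% consensus threshold are illustrative assumptions, not values prescribed by the technique or by this review.

```python
import statistics

def consensus_stats(ratings, agree_at=4, threshold=0.75):
    """Descriptive consensus measures often reported in Delphi rounds:
    percent of experts rating an item at or above `agree_at` on a Likert
    scale, the median, and the interquartile range. The cutoff and
    threshold values are illustrative assumptions."""
    pct_agree = sum(r >= agree_at for r in ratings) / len(ratings)
    q1, median, q3 = statistics.quantiles(ratings, n=4)
    return {"pct_agree": pct_agree, "median": median, "iqr": q3 - q1,
            "consensus": pct_agree >= threshold}

panel = [5, 4, 4, 5, 3, 4, 5, 4]  # one item rated by eight experts
item_stats = consensus_stats(panel)
```

Items failing the threshold would typically be fed back to the panel, with the group statistics attached, for re-rating in the next round.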

  9. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of a NoC. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas subsequent design stages require more accurate techniques.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators, and an FPGA-based emulator. By conducting NoC experiments with NoC sizes from 9 to 36 functional units and various traffic patterns, characteristics of these techniques concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  10. Investigation of microscopic radiation damage in waste forms using ODNMR and AEM techniques. 1997 annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    Liu, G.

    1997-09-01

    This project seeks to understand the microscopic effects of radiation damage in nuclear waste forms. The authors' approach to this challenge encompasses studies in electron microscopy, laser spectroscopy, and computational modeling and simulation. During this first year of the project, efforts have focused on alpha-decay-induced microscopic damage in crystalline orthophosphates (YPO{sub 4} and LuPO{sub 4}) that contain the short-lived alpha-emitting isotope {sup 244}Cm (t{sub 1/2} = 18.1 y). The samples studied were synthesized in 1980 and the initial {sup 244}Cm concentration was {approximately}1%. Studying these materials is of importance to nuclear waste management because of the opportunity to gain insight into accumulated radiation damage and the influence of aging on such damage. These factors are critical to the long-term performance of actual waste forms [1]. Lanthanide orthophosphates, including LuPO{sub 4} and YPO{sub 4}, have been suggested as waste forms for high-level nuclear waste [2] and potential hosts for excess weapons plutonium [3,4]. The work is providing insight into the characteristics of these previously known radiation-resistant materials. The authors have observed loss of crystallinity (partial amorphization) as a direct consequence of prolonged exposure to intense alpha radiolysis in these materials. More importantly, the observation of microscopic cavities in these aged materials provides evidence of significant chemical decomposition that may be difficult to detect in the earlier stages of radiation damage. The preliminary results show that, in characterizing crystalline compounds as high-level nuclear waste forms, chemical decomposition effects may be more important than lattice amorphization, which has been the focus of many previous studies. More extensive studies, including in-situ analysis of the dynamics of thermal annealing of self-radiation-induced amorphization and cavity formation, will be conducted on these aged {sup 244}Cm

  11. Research Progress on Propagation Techniques of Bioenergy Grasses

    Institute of Scientific and Technical Information of China (English)

    魏娟; 王学华; 肖亮

    2015-01-01

    The research progress in recent years, at home and abroad, on the sexual and asexual propagation techniques of typical bioenergy grasses, such as Miscanthus, Panicum virgatum, Arundo donax, Pennisetum sinense and Pennisetum purpureum, is reviewed; the problems in the propagation techniques of bioenergy grasses are summarized; and solutions and development directions are discussed.

  12. Research Progress of Asexual Propagation Techniques of Carya illinoensis

    Institute of Scientific and Technical Information of China (English)

    李俊南; 李莲芳; 熊新武; 陈宏伟; 习学良

    2012-01-01

    The research progress of asexual propagation techniques of Carya illinoensis (pecan), a high-grade nut tree species introduced from abroad that forms a distinctive new industry, is summarized. The current research covers rootstock cultivation, grafting techniques and the factors influencing grafting survival, cutting propagation, and division propagation. The main existing problems of asexual propagation of Carya illinoensis are identified, with the aim of providing a reference for the breeding of improved varieties of Carya illinoensis.

  13. Application and Research Progress of Cervical Cancer Screening Techniques

    Institute of Scientific and Technical Information of China (English)

    张文娟; 东燕; 孙绪兰; 王萍; 曾燕

    2012-01-01

    Cervical cancer is a common gynecologic malignancy and the second leading cause of cancer death among women worldwide. Establishing a sustainable, rational and effective screening method for the early detection of precancerous lesions is critical to the prevention and treatment of cervical cancer. Current cervical cancer screening techniques include: (a) cervical cytological screening (Papanicolaou smear, liquid-based cytology (TCT) and DNA quantification); (b) visual inspection with acetic acid (VIA) and visual inspection with Lugol's iodine (VILI); (c) colposcopy; (d) HPV detection. This article summarizes the application and research progress of these cervical cancer screening techniques.

  14. THE 'HYBRID' TECHNIQUE FOR RISK ANALYSIS OF SOME DISEASES

    Institute of Scientific and Technical Information of China (English)

    SHANGHANJI; LUYUCHU; XUXUEMEI; CHENQIAN

    2001-01-01

    Based on the data obtained from a survey recently made in Shanghai, this paper presents a hybrid technique for the risk analysis and evaluation of some diseases. After determining the main risk factors of these diseases by analysis of variance, the authors introduce a new concept, the 'Illness Fuzzy Set', and use fuzzy comprehensive evaluation to evaluate the risk of suffering from a disease for residents. An optimization technique is used to determine the weights wi in the fuzzy comprehensive evaluation, and a new method, 'Improved Information Distribution', is also introduced for the treatment of small-sample problems. It is shown that the results obtained using the hybrid technique are better than those obtained using a single fuzzy technique or a single statistical method.
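    The fuzzy comprehensive evaluation step mentioned in the abstract can be sketched in its textbook form: a weight vector over risk factors is combined with a membership matrix over risk grades. This is a minimal illustration with hypothetical numbers; the paper's actual weights come from an optimization step and its membership functions from the 'Illness Fuzzy Set', neither of which is reproduced here.

```python
import numpy as np

def fuzzy_comprehensive_evaluation(weights, memberships):
    """Weighted-average fuzzy comprehensive evaluation: combine the factor
    weight vector with the membership matrix (factors x risk grades) and
    normalize to obtain a membership vector over the risk grades."""
    b = np.asarray(weights, dtype=float) @ np.asarray(memberships, dtype=float)
    return b / b.sum()

# Hypothetical numbers: three risk factors evaluated over three grades
# (low / medium / high); real weights would come from an optimization step.
w = [0.5, 0.3, 0.2]
R = [[0.6, 0.3, 0.1],
     [0.2, 0.5, 0.3],
     [0.1, 0.3, 0.6]]
b = fuzzy_comprehensive_evaluation(w, R)
risk_grade = int(np.argmax(b))  # maximum-membership principle
```

The final grade is read off by the maximum-membership principle; other composition operators (e.g. max-min) are also common in fuzzy comprehensive evaluation, and the weighted average used here is only one choice.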

  16. A Comparative Analysis of Techniques for PAPR Reduction of OFDM Signals

    Directory of Open Access Journals (Sweden)

    M. Janjić

    2014-06-01

    Full Text Available In this paper the problem of high Peak-to-Average Power Ratio (PAPR) in Orthogonal Frequency-Division Multiplexing (OFDM) signals is studied. Besides describing three techniques for PAPR reduction, SeLective Mapping (SLM), Partial Transmit Sequence (PTS) and Interleaving, a detailed analysis of the performance of these techniques for various values of the relevant parameters (number of phase sequences, number of interleavers, number of phase factors, or number of subblocks, depending on the applied technique) is carried out. Simulation of these techniques is run in Matlab software. Results are presented in the form of Complementary Cumulative Distribution Function (CCDF) curves for the PAPR of 30000 randomly generated OFDM symbols. Simulations are performed for OFDM signals with 32 and 256 subcarriers, oversampled by a factor of 4. A detailed comparison of these techniques is made based on the Matlab simulation results.
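    The baseline measurement underlying all three reduction techniques, the PAPR of oversampled OFDM symbols and its CCDF, can be sketched as follows. This is a hedged illustration rather than the paper's Matlab setup: QPSK mapping, a symbol count of 3000 (the paper uses 30000), and the 8 dB CCDF threshold are assumptions, while N = 32 subcarriers and 4x oversampling follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
N, num_symbols, oversample = 32, 3000, 4

def papr_db(x):
    """Peak-to-average power ratio of a complex baseband signal, in dB."""
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

papr = np.empty(num_symbols)
for i in range(num_symbols):
    # Random QPSK symbols on N subcarriers.
    sym = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)
    # Oversample by zero-padding the middle of the spectrum before the IFFT.
    padded = np.concatenate([sym[:N // 2], np.zeros((oversample - 1) * N), sym[N // 2:]])
    papr[i] = papr_db(np.fft.ifft(padded))

# One CCDF point: probability that the PAPR exceeds an 8 dB threshold.
ccdf_8db = np.mean(papr > 8.0)
```

SLM, PTS and Interleaving would each generate several candidate representations of the same symbol and transmit the one minimizing `papr_db`, shifting the resulting CCDF curve to the left.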

  17. Elemental Analysis of Lapis Lazuli sample, using complementary techniques of IBIL and MicroPIXE

    Directory of Open Access Journals (Sweden)

    T Nikbakht

    2015-07-01

    Full Text Available Ion Beam Induced Luminescence (IBIL) is a useful IBA technique which can be utilized to obtain information about the nature of chemical bonds in materials. Given the area it probes, this non-destructive and fast technique is a suitable complement to MicroPIXE. Since most minerals are luminescent, IBIL is an applicable analytical technique in mineralogy. In this research work, a 2.7 MeV proton beam was utilized to characterize a lapis lazuli sample. After data collection and analysis of the results obtained from both techniques, IBIL and MicroPIXE, elemental maps of the sample were developed. Comparison of the results with others available in the literature indicates the capability and accuracy of the combination of the two complementary techniques for the characterization of minerals as well as precious historical objects.

  18. Analysis of Far-Field Radiation from Apertures Using Monte Carlo Integration Technique

    Directory of Open Access Journals (Sweden)

    Mohammad Mehdi Fakharian

    2014-12-01

    Full Text Available An integration technique based on the use of Monte Carlo Integration (MCI) is proposed for the analysis of electromagnetic radiation from apertures. The technique applies the equivalence principle followed by physical optics to compute far-field antenna radiation patterns. However, this approach is often mathematically complex because it requires integration over a closed surface. This paper presents an extremely simple formulation to calculate the far fields of some types of aperture radiators using the MCI technique. The accuracy and effectiveness of this technique are demonstrated in three cases of radiation from apertures, and the results are compared with solutions obtained using FE simulation and Gaussian quadrature rules.
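    The idea of replacing the radiation integral with a Monte Carlo estimate can be shown on the simplest case: a 1-D uniform aperture, whose far-field integral has a closed-form sinc solution to check against. The aperture width, wavelength, and sample count below are assumptions for illustration; the paper's actual formulation covers 2-D apertures via the equivalence principle and physical optics.

```python
import numpy as np

rng = np.random.default_rng(1)

def far_field_mci(theta, a=2.0, wavelength=1.0, n_samples=200_000):
    """Monte Carlo estimate of the 1-D radiation integral of a uniform
    aperture of width a: F(theta) = integral over the aperture of
    exp(j*k*x*sin(theta)) dx, with k = 2*pi/wavelength."""
    k = 2.0 * np.pi / wavelength
    x = rng.uniform(-a / 2.0, a / 2.0, n_samples)  # random points on the aperture
    # MCI: (region size) * (mean of the integrand at the sample points).
    return a * np.mean(np.exp(1j * k * x * np.sin(theta)))

theta = np.deg2rad(20.0)
estimate = far_field_mci(theta)
# Closed form for comparison: a * sinc(a*sin(theta)/wavelength), normalized sinc.
exact = 2.0 * np.sinc(2.0 * np.sin(theta))
```

The estimator converges as O(1/sqrt(n_samples)) regardless of the aperture shape, which is what makes MCI attractive compared with setting up quadrature rules over an irregular closed surface.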

  19. Fault detection in digital and analog circuits using an i(DD) temporal analysis technique

    Science.gov (United States)

    Beasley, J.; Magallanes, D.; Vridhagiri, A.; Ramamurthy, Hema; Deyong, Mark

    1993-01-01

    An i(sub DD) temporal analysis technique is presented that detects defects (faults) and fabrication variations in both digital and analog ICs by pulsing the power supply rails and analyzing the temporal data obtained from the resulting transient rail currents. A simple bias voltage is required on all inputs to excite the defects. Data from hardware tests supporting this technique are presented.

  20. Identifying Indicators of Progress in Thermal Spray Research Using Bibliometrics Analysis

    Science.gov (United States)

    Li, R.-T.; Khor, K. A.; Yu, L.-G.

    2016-08-01

    We investigated the research publications on thermal spray in the period of 1985-2015 using the data from Web of Science, Scopus and SciVal®. Bibliometrics analysis was employed to elucidate the country and institution distribution in various thermal spray research areas and to characterize the trends of topic change and technology progress. Results show that China, USA, Japan, Germany, India and France were the top countries in thermal spray research, and Xi'an Jiaotong University, Universite de Technologie Belfort-Montbeliard, Shanghai Institute of Ceramics, ETH Zurich, National Research Council of Canada, University of Limoges were among the top institutions that had high scholarly research output during 2005-2015. The terms of the titles, keywords and abstracts of the publications were analyzed by the Latent Dirichlet Allocation model and visually mapped using the VOSviewer software to reveal the progress of thermal spray technology. It is found that thermal barrier coating was consistently the main research area in thermal spray, and high-velocity oxy-fuel spray and cold spray developed rapidly in the last 10 years.

  1. Proteomic analysis reveals novel proteins associated with progression and differentiation of colorectal carcinoma

    Directory of Open Access Journals (Sweden)

    Yi Gan

    2014-01-01

    Full Text Available Aim: The objective of this study is to characterize differential proteomic expression among well-differentiated and poorly differentiated colorectal carcinoma tissues and normal mucous epithelium. Materials and Methods: The study is based on quantitative 2-dimensional gel electrophoresis analyzed with PDQuest. Results: Excluding redundancies due to proteolysis and posttranslationally modified isoforms among over 600 protein spots, 11 proteins were revealed as regulated, with statistical variance within the 95th confidence level, and were identified by peptide mass fingerprinting using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. Progression-associated proteins belong to the functional complexes of tumorigenesis, proliferation, differentiation, metabolism, the regulation of major histocompatibility complex processing, and other functions. Partial but significant overlap was revealed with previous proteomics and transcriptomics studies in CRC. Among the various differentiation stages of CRC tissues, we identified calreticulin precursor, MHC class I antigen (human leukocyte antigen A), glutathione S-transferase pi 1, keratin 8, heat shock protein 27, tubulin beta chain, triosephosphate, fatty acid-binding protein, hemoglobin deoxy mutant with Val beta-1 replaced by Met (HBB), and zinc finger protein 312 (FEZF2). Conclusions: Their functional networks were analyzed by Ingenuity Systems Ingenuity Pathways Analysis, revealing their potential roles as novel biomarkers for progression at various differentiation stages of CRC.

  3. Rate Dependent Multicontinuum Progressive Failure Analysis of Woven Fabric Composite Structures under Dynamic Impact

    Directory of Open Access Journals (Sweden)

    James Lua

    2004-01-01

    Full Text Available Marine composite materials typically exhibit significant rate dependent response characteristics when subjected to extreme dynamic loading conditions. In this work, a strain-rate dependent continuum damage model is incorporated with multicontinuum technology (MCT to predict damage and failure progression for composite material structures. MCT treats the constituents of a woven fabric composite as separate but linked continua, thereby allowing a designer to extract constituent stress/strain information in a structural analysis. The MCT algorithm and material damage model are numerically implemented with the explicit finite element code LS-DYNA3D via a user-defined material model (umat. The effects of the strain-rate hardening model are demonstrated through both simple single element analyses for woven fabric composites and also structural level impact simulations of a composite panel subjected to various impact conditions. Progressive damage at the constituent level is monitored throughout the loading. The results qualitatively illustrate the value of rate dependent material models for marine composite materials under extreme dynamic loading conditions.

  4. Elemental Analysis of Lapis Lazuli sample, using complementary techniques of IBIL and MicroPIXE

    OpenAIRE

    T Nikbakht; Kakuee, O. R.; M Lamehi Rachti; M Sedaghati Boorkhani

    2015-01-01

    Ion Beam Induced Luminescence (IBIL) is a useful IBA technique that can be utilized to obtain information about the nature of chemical bonds in materials. Given the area it probes, this fast, non-destructive technique is a suitable complement to MicroPIXE. Since most minerals are luminescent, IBIL is an applicable analytical technique in mineralogy. In this research work, a 2.7 MeV proton beam was utilized to characterize a Lapis lazuli sample. After data collection and analysi...

  5. Flow analysis techniques as effective tools for the improved environmental analysis of organic compounds expressed as total indices.

    Science.gov (United States)

    Maya, Fernando; Estela, José Manuel; Cerdà, Víctor

    2010-04-15

    The scope of this work is an overview of the current state of the art in flow analysis techniques applied to the environmental determination of organic compounds expressed as total indices. Flow analysis techniques are proposed as effective tools for quickly obtaining preliminary chemical information about the occurrence of organic compounds in the environment prior to the use of more complex, time-consuming and expensive instrumental techniques. Recently improved flow-based methodologies for the determination of chemical oxygen demand, halogenated organic compounds and phenols are presented and discussed in detail. The present work aims to demonstrate the value of flow-based techniques as vanguard tools for the determination of organic compounds in environmental water samples.

  6. Relationships between eigen and complex network techniques for the statistical analysis of climate data

    CERN Document Server

    Donges, Jonathan F; Loew, Alexander; Marwan, Norbert; Kurths, Jürgen

    2013-01-01

    Eigen techniques such as empirical orthogonal function (EOF) or coupled pattern (CP) analysis have frequently been used for detecting patterns in multivariate climatological data sets. Recently, statistical methods originating from the theory of complex networks have been employed for the very same purpose of spatio-temporal analysis. This climate network analysis is usually based on the same set of similarity matrices as is used in classical EOF or CP analysis, e.g., the correlation matrix of a single climatological field or the cross-correlation matrix between two distinct climatological fields. In this study, formal relationships between the eigen and network approaches are derived and illustrated using exemplary data sets. These results make it possible to show that climate network analysis can complement classical eigen techniques and provides substantial additional information on the higher-order structure of statistical interrelationships in climatological data sets. Hence, climate networks are a valuable su...
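
As a minimal illustration of the eigen side of this relationship, the sketch below computes EOFs as eigenvectors of the grid-point correlation matrix of a synthetic space-time field. The field, grid size and noise level are invented for the example; real climate network analysis would start from the same correlation matrix.

```python
# EOF analysis via eigendecomposition of the grid-point correlation matrix.
# Synthetic data: one planted spatial mode plus noise (values illustrative).
import numpy as np

rng = np.random.default_rng(0)
nt, nx = 200, 10                                    # time steps, grid points
pattern = np.sin(np.linspace(0, np.pi, nx))         # planted spatial mode
field = np.outer(rng.standard_normal(nt), pattern) \
        + 0.1 * rng.standard_normal((nt, nx))

anom = field - field.mean(axis=0)                   # anomalies about time mean
corr = np.corrcoef(anom, rowvar=False)              # nx-by-nx correlation matrix
evals, evecs = np.linalg.eigh(corr)                 # eigenvalues ascending
eofs = evecs[:, ::-1]                               # leading EOFs first
explained = evals[::-1] / evals.sum()               # variance fractions
print(f"leading EOF explains {explained[0]:.0%} of the variance")
```

The same `corr` matrix, thresholded, would serve as the adjacency matrix of a climate network, which is exactly the formal link the abstract refers to.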

  7. Organic Tanks Safety Program: Advanced organic analysis FY 1996 progress report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-09-01

    The major focus during the first part of FY96 was evaluating the use of organic functional group concentrations to screen for energetics. Fourier transform infrared and Raman spectroscopy would be useful screening tools for determining C-H and COO- organic content in tank wastes analyzed in a hot cell. These techniques would be used for identifying tanks of potential safety concern that may require further analysis. Samples from Tanks 241-C-106 and -C-204 were analyzed; the major organic in C-106 was B2EHPA and in C-204 was TBP. Analyses of simulated wastes were also performed for the Waste Aging Studies Task; organics formed as a result of degradation were identified, and the original starting components were monitored quantitatively. Sample analysis is not routine and required considerable methods adaptation and optimization. Several techniques have been evaluated for directly analyzing chelators and chelator fragments in tank wastes: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry and liquid chromatography with ultraviolet detection using Cu complexation. Although not directly funded by the Tanks Safety Program, the success of these techniques has implications for both the Flammable Gas and Organic Tanks Safety Programs.

  8. Diffusion MRI of the neonate brain: acquisition, processing and analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Pannek, Kerstin [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, School of Medicine, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); Guzzetta, Andrea [IRCCS Stella Maris, Department of Developmental Neuroscience, Calambrone Pisa (Italy); Colditz, Paul B. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Perinatal Research Centre, Brisbane (Australia); Rose, Stephen E. [University of Queensland, Centre for Clinical Research, Brisbane (Australia); University of Queensland, Centre for Advanced Imaging, Brisbane (Australia); University of Queensland Centre for Clinical Research, Royal Brisbane and Women' s Hospital, Brisbane (Australia)

    2012-10-15

    Diffusion MRI (dMRI) is a popular noninvasive imaging modality for the investigation of the neonate brain. It enables the assessment of white matter integrity, and is particularly suited for studying white matter maturation in the preterm and term neonate brain. Diffusion tractography allows the delineation of white matter pathways and assessment of connectivity in vivo. In this review, we address the challenges of performing and analysing neonate dMRI. Of particular importance in dMRI analysis is adequate data preprocessing to reduce image distortions inherent to the acquisition technique, as well as artefacts caused by head movement. We present a summary of techniques that should be used in the preprocessing of neonate dMRI data, and demonstrate the effect of these important correction steps. Furthermore, we give an overview of available analysis techniques, ranging from voxel-based analysis of anisotropy metrics including tract-based spatial statistics (TBSS) to recently developed methods of statistical analysis addressing issues of resolving complex white matter architecture. We highlight the importance of resolving crossing fibres for tractography and outline several tractography-based techniques, including connectivity-based segmentation, the connectome and tractography mapping. These techniques provide powerful tools for the investigation of brain development and maturation. (orig.)

  9. Damage analysis and fundamental studies. Quarterly progress report, April--June 1978

    Energy Technology Data Exchange (ETDEWEB)

    Zwilsky, Klaus M.

    1979-05-01

    This report is the second in a series of Quarterly Technical Progress Reports on Damage Analysis and Fundamental Studies (DAFS) which is one element of the Fusion Reactor Materials Program, conducted in support of the Magnetic Fusion Energy Program. This report is organized along topical lines in parallel to a Program Plan of the same title (to be published) so that activities and accomplishments may be followed readily relative to that Program Plan. Thus, the work of a given laboratory may appear throughout the report. Chapters 1 and 2 report topics which are generic to all of the DAFS Program: DAFS Task Group Activities and Irradiation Test Facilities, respectively. Chapters 3, 4, and 5 report the work that is specific to each of the subtasks around which the program is structured: A) Environmental Characterization, B) Damage Production, and C) Damage Microstructure Evolution and Mechanical Behavior.

  10. [Geothermal system temperature-depth database and model for data analysis]. 5. quarterly technical progress report

    Energy Technology Data Exchange (ETDEWEB)

    Blackwell, D.D.

    1998-04-25

    During this first quarter of the second year of the contract, activity has involved several different tasks. The author worked most intensively on three tasks this quarter: implementing the data base for geothermal system temperature-depth data, maintaining the WWW site with the heat flow and gradient data base, and developing a modeling capability for analysis of geothermal system exploration data. The author has completed a data base template for geothermal system temperature-depth data that can be used in conjunction with the regional data base he had already developed, and is now implementing it. Progress is described.

  11. Progressive Fracture Analysis of Planar Lattices and Shape-Morphing Kagome Structure

    Science.gov (United States)

    Tserpes, Konstantinos I.

    The fracture behaviors of three defected planar lattices loaded in axial tension and of the 3D shape-morphing Kagome structure loaded as a cantilever beam are explored using finite element-based progressive fracture analysis. The assumed defects are in the form of symmetrical notches introduced in the lattices by removing the struts in single rows. Numerical results reveal that the presence of the notches significantly reduces the tensile strength of the lattices. On the other hand, as the load on the Kagome structure increases, yielding and buckling of the struts in the core and yielding of the face-sheet appear consecutively, degrading the structure’s bending stiffness and producing large deflections of the loaded end.

  12. Progressive damage analysis of carbon/epoxy laminates under coupled laser and mechanical loading

    Science.gov (United States)

    Liu, Wanlei; Chang, Xinlong; Zhang, Xiaojun; Zhang, Youhong

    A multiscale model based on bridging theory is proposed for the progressive damage analysis of carbon/epoxy laminates under coupled laser and mechanical loading. An ablation model is adopted to calculate ablation temperature changes and ablation surface degradation. A polynomial strengthening model of the matrix is used to improve the bridging model and reduce the required parameter input. The stiffness degradation methods of the bridging model are also improved in order to analyze stress redistribution more accurately when damage occurs. Thermal-mechanical analyses of the composite plate are performed using the ABAQUS/Explicit program with the developed model implemented in a VUMAT. The simulation results show that this model can reveal the mesoscale damage mechanism of composite laminates under coupled loading.

  13. [Application progress of laser-induced breakdown spectroscopy for surface analysis in materials science field].

    Science.gov (United States)

    Zhang, Yong; Jia, Yun-Hai; Chen, Ji-Wen; Liu, Ying; Shen, Xue-Jing; Zhao, Lei; Wang, Shu-Ming; Yu, Hong; Han, Peng-Cheng; Qu, Hua-Yang; Liu, Shao-Zun

    2012-06-01

    Laser-induced breakdown spectroscopy (LIBS) has developed into a true surface analytical tool over the past ten years, and in this paper its fundamental theory, instrumentation and applications in materials science are reviewed in detail. Progress in elemental distribution and depth profile analysis is discussed mainly for metallurgical, semiconductor and electronic materials, at home and abroad. It is pointed out that the pulse energy, the ambient gas and its pressure, and the energy distribution of the laser beam strongly influence spatial and depth resolution, and an approach to improving resolution while maintaining analytical sensitivity is provided. Compared with traditional surface analytical methods, the advantages of LIBS are its very large scanning area, its high analytical speed, and its ability to analyze both conducting and non-conducting materials. It has become a powerful complement to traditional surface analytical tools.

  14. Micromechanics-Based Progressive Failure Analysis of Composite Laminates Using Different Constituent Failure Theories

    Science.gov (United States)

    Moncada, Albert M.; Chattopadhyay, Aditi; Bednarcyk, Brett A.; Arnold, Steven M.

    2008-01-01

    Predicting failure in a composite can be done with ply-level mechanisms and/or micro-level mechanisms. This paper uses the Generalized Method of Cells and High-Fidelity Generalized Method of Cells micromechanics theories, coupled with classical lamination theory, as implemented within NASA's Micromechanics Analysis Code with Generalized Method of Cells. The code is able to implement different failure theories on the level of both the fiber and the matrix constituents within a laminate. A comparison is made among the maximum stress, maximum strain, Tsai-Hill, and Tsai-Wu failure theories. To verify the failure theories, the Worldwide Failure Exercise (WWFE) experiments have been used. The WWFE is a comprehensive study that covers a wide range of polymer matrix composite laminates. The numerical results indicate good correlation with the experimental results for most of the composite layups, but also point to the need for more accurate resin damage progression models.
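
For reference, the two simplest criteria in the comparison can be written down directly for a plane-stress lamina. The sketch below is a generic textbook formulation (using the common default interaction term F12 = -0.5*sqrt(F11*F22)), not the exact implementation in NASA's code, and the strength values are illustrative carbon/epoxy numbers, not WWFE data.

```python
import math

def tsai_wu(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Plane-stress Tsai-Wu failure index (>= 1.0 indicates failure).
    Xc, Yc are compressive strengths given as positive magnitudes."""
    F1, F2 = 1/Xt - 1/Xc, 1/Yt - 1/Yc
    F11, F22, F66 = 1/(Xt*Xc), 1/(Yt*Yc), 1/S**2
    F12 = -0.5 * math.sqrt(F11 * F22)      # common default interaction term
    return (F1*s1 + F2*s2 + F11*s1**2 + F22*s2**2
            + F66*t12**2 + 2*F12*s1*s2)

def max_stress(s1, s2, t12, Xt, Xc, Yt, Yc, S):
    """Maximum-stress criterion: largest stress-to-strength ratio."""
    return max(s1/Xt if s1 >= 0 else -s1/Xc,
               s2/Yt if s2 >= 0 else -s2/Yc,
               abs(t12)/S)

# Typical carbon/epoxy lamina strengths in MPa -- illustrative values only.
props = dict(Xt=1500, Xc=1200, Yt=50, Yc=250, S=70)
print(tsai_wu(800, 30, 40, **props), max_stress(800, 30, 40, **props))
```

Even this toy case shows why the comparison in the paper matters: for the same stress state the two criteria return noticeably different margins to failure.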

  15. Nonlinear Progressive Failure Analysis of Surrounding Rock System Based on Similarity Theory

    Directory of Open Access Journals (Sweden)

    Zhao Y.

    2016-01-01

    Full Text Available Nonlinear progressive failure study of surrounding rock is important for the stability analysis of underground engineering projects. Taking a deep-buried tunnel in Chongqing as an example, a three-dimensional (3-D) physical model was established based on similarity theory. To satisfy the similarity requirements for physical-mechanical properties such as elastic modulus, compressive strength and Poisson ratio, physical model materials were developed. Using full inner-spy photograph technology, the deformation and failure process of the rock were studied under the independent and combined action of anchors, shotcrete and reinforcing mesh. Based on the experimental results, the interaction mechanism between rock and support structure under high stress was investigated.

  16. Damage analysis and fundamental studies. Quarterly progress report, January--March 1979

    Energy Technology Data Exchange (ETDEWEB)

    Zwilsky, Klaus M.

    1979-05-01

    This report is the fifth in a series of Quarterly Technical Progress Reports on Damage Analysis and Fundamental Studies (DAFS) which is one element of the Fusion Reactor Materials Program, conducted in support of the Magnetic Fusion Energy Program. This report is organized along topical lines in parallel to a Program Plan of the same title (to be published) so that activities and accomplishments may be followed readily relative to that Program Plan. Thus, the work of a given laboratory may appear throughout the report. Chapters 1 and 2 report topics which are generic to all of the DAFS Program: DAFS Task Group Activities and Irradiation Test Facilities, respectively. Chapters 3, 4, and 5 report the work that is specific to each of the subtasks around which the program is structured: A) Environmental Characterization, B) Damage Production, and C) Damage Microstructure Evolution and Mechanical Behavior.

  17. A new technique of ECG analysis and its application to evaluation of disorders during ventricular tachycardia

    Energy Technology Data Exchange (ETDEWEB)

    Moskalenko, A.V. [Institute of Theoretical and Experimental Biophysics RAS, Institutskaya Street, 3, Pushchino 142290 (Russian Federation)], E-mail: info@avmoskalenko.ru; Rusakov, A.V. [Institute of Theoretical and Experimental Biophysics RAS, Institutskaya Street, 3, Pushchino 142290 (Russian Federation); Elkin, Yu.E. [Institute of Mathematical Problems of Biology RAS, Institutskaya Street, 4, Pushchino 142290 (Russian Federation)

    2008-04-15

    We propose a new technique of ECG analysis to characterize the properties of polymorphic ventricular arrhythmias, potentially life-threatening disorders of cardiac activation. The technique is based on extracting two indices from the ECG fragment. The result is a new detailed quantitative description of polymorphic ECGs. Our observations suggest that the proposed ECG processing algorithm provides information that supplements the traditional visual ECG analysis. The estimates of ECG variation in this study reveal some unexpected details of ventricular activation dynamics, which are possibly useful for diagnosing cardiac rhythm disturbances.

  18. Analysis of a Reflectarray by Using an Iterative Domain Decomposition Technique

    Directory of Open Access Journals (Sweden)

    Carlos Delgado

    2012-01-01

    Full Text Available We present an efficient method for the analysis of different objects that may contain a complex feeding system and a reflector structure. The approach is based on a domain decomposition technique that divides the geometry into several parts to minimize the vast computational resources required when applying a full wave method. This technique is also parallelized by using the Message Passing Interface to minimize the memory and time requirements of the simulation. A reflectarray analysis serves as an example of the proposed approach.

  19. Research progress in rehabilitation treatment of stroke patients: A bibliometric analysis

    Institute of Scientific and Technical Information of China (English)

    Xiaodong Feng; Chengmei Liu; Qingchuan Guo; Yanjie Bai; Yafeng Ren; Binbin Ren; Junmin Bai; Lidian Chen

    2013-01-01

    BACKGROUND: Stroke presents as a transient or chronic brain dysfunction and is associated with high morbidity and high mortality. Clinicians and scientists continue to debate how to enhance the validity of rehabilitation treatment and how to further improve the level of stroke treatment. OBJECTIVE: The aim of this study was to quantitatively analyze worldwide progress in research on stroke rehabilitation treatment over the past 10 years, based on the Web of Science database and ClinicalTrials.gov. METHODS: We conducted a quantitative analysis of clinical trial articles regarding stroke rehabilitation published in English from 2003 to 2013 and indexed in the National Institutes of Health Clinical Trials registry and Web of Science databases. Data were downloaded on March 15, 2013. RESULTS: (1) From 2003 to 2013, 2,654 clinical trials investigating stroke were indexed in ClinicalTrials.gov. Only 58 clinical trials were registered in 2003, with a marked increase from 2005 onward. A total of 605 clinical trials on the rehabilitation of stroke were conducted in the past 10 years. (2) The analysis showed that most of the trials in the field were registered by North American institutions. Among the Asian countries, China, including the Taiwan area, also contributed a reasonable proportion of the trials, but the number remains comparatively small. Most of the interventions were drugs, followed by devices, with behavioral interventions ranked third. (3) In the past 10 years, 4,052 studies on stroke were indexed in the Web of Science database. CONCLUSION: The number of clinical trials and papers on stroke rehabilitation has increased significantly in the past 10 years, with a strong positive correlation between the two.
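
The positive correlation reported in the conclusion is an ordinary Pearson correlation between yearly counts. The sketch below computes it from scratch; the per-year figures are invented for illustration, since the abstract does not give the study's actual yearly breakdown.

```python
# Pearson correlation between yearly counts of registered trials and
# indexed papers (numbers invented, not the study's data).
years = list(range(2003, 2013))
trials = [58, 70, 120, 160, 210, 260, 300, 350, 420, 480]
papers = [150, 180, 250, 310, 380, 450, 520, 600, 690, 780]

n = len(years)
mt, mp = sum(trials) / n, sum(papers) / n
cov = sum((t - mt) * (p - mp) for t, p in zip(trials, papers))
var_t = sum((t - mt) ** 2 for t in trials)
var_p = sum((p - mp) ** 2 for p in papers)
r = cov / (var_t * var_p) ** 0.5
print(f"Pearson r = {r:.3f}")
```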

  20. The relationship between disease activity and radiologic progression in patients with rheumatoid arthritis: a longitudinal analysis.

    NARCIS (Netherlands)

    Welsing, P.M.J.; Landewe, R.B.; Riel, P.L.C.M. van; Boers, M.; Gestel, A.M. van; Linden, S.G. van der; Swinkels, H.L.; Heijde, D.M.F.M. van der

    2004-01-01

    OBJECTIVE: Radiologic progression in rheumatoid arthritis (RA) is considered the consequence of persistent inflammatory activity. To determine whether a change in disease activity is related to a change in radiologic progression in individual patients, we investigated the longitudinal relationship b

  1. Analysis on Thematic Structure and Thematic Progression in Obama's weekly radio speech

    Institute of Scientific and Technical Information of China (English)

    李成烨

    2016-01-01

    Thematic structure and thematic progression are important concepts of the textual function. Using Obama's weekly radio speeches as examples, we can examine the practicality of thematic structure and thematic progression and provide guidance for the preparation of political speeches.

  2. A phenomenological analysis of melt progression in the lower head of a pressurized water reactor

    Energy Technology Data Exchange (ETDEWEB)

    Seiler, J.M., E-mail: jean-marie.seiler@cea.fr [CEA, DEN, DTN, F-38054 Grenoble (France); Tourniaire, B. [EDF/Septen, Lyon (France)

    2014-03-15

    Highlights: • We propose a phenomenological description of melt progression into the lower head. • We examine changes in heat loads on the vessel. • Heat loads are more severe than emphasized by the bounding situation assumption. • Both primary circuit and ex-vessel reflooding are necessary for in-vessel retention. • Vessel failure conditions are examined. - Abstract: The analysis of in-vessel corium cooling (IVC) and retention (IVR) involves the description of very complex and transient physical phenomena. To get round this difficulty, “bounding” situations are often emphasized for the demonstration of corium coolability, by vessel flooding and/or by reactor pit flooding. This approach however comes up against its own limitations. More realistic melt progression scenarios are required to provide plausible corium configurations and vessel failure conditions. Work to develop more realistic melt progression scenarios has been done at CEA, in collaboration with EDF. Development has concentrated on the French 1300 MWe PWR, considering both dry scenarios and the possibility of flooding of the RPC (reactor primary circuit) and/or the reactor pit. The models used for this approach have been derived from the analysis of the TMI2 accident and benefit from the lessons derived from several programs related to pool thermal hydraulics (BALI, COPO, ACOPO, etc.), material interactions (RASPLAV, MASCA), critical heat flux (CHF) on the external surface of the vessel (KAIST, SULTAN, ULPU), etc. Important conclusions of this work are as follows: (a) After the start of corium melting and onset of melt formation in the core at low pressure (∼1 to 5 bars), it seems questionable that RPV (reactor pressure vessel) reflooding alone would be sufficient to achieve corium retention in the vessel; (b) If the vessel is not cooled externally, it may fail due to local heat-up before the whole core fuel inventory is relocated in the lower head; (c) Even if the vessel is

  3. Mapping a decade of linked data progress through co-word analysis

    Directory of Open Access Journals (Sweden)

    Massoomeh Niknia

    2015-12-01

    Full Text Available Linked data describes a method of publishing structured data that can be interlinked and made more useful through semantic queries. It enables data from different sources to be connected and queried, and builds upon standard web technologies such as HTTP, RDF and URIs. This method helps information shared by human readers to also be read automatically by computers. Given the importance of linked data, the main aim of this article is to visualize the scientific mapping of linked data and show its progress over one decade. This scientometric study employs hierarchical cluster analysis, strategic diagrams and network analysis to map and visualize the linked data landscape of Scopus publications through the use of co-word analysis. The study quantifies and describes the thematic evolution of the field based on a total of 717 Scopus articles and their associated 19,977 keywords published between the 1970s and 2014. According to the results, the thematic visualization and the clusters show most concepts concentrated around computer-related terms, such as big data, cloud computing, semantic data, semantic technologies, the semantic web, artificial intelligence, computer programming and semantic search. In addition, we found that in recent years, as librarians and information scientists have taken up linked data research alongside computer scientists, “user” studies have become important.
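
The co-word analysis used here boils down to building a keyword co-occurrence matrix and clustering it hierarchically. A minimal sketch, with four toy keyword sets standing in for the 19,977 Scopus keywords:

```python
# Co-word analysis sketch: keyword co-occurrence counts, converted to
# distances and fed to hierarchical clustering (toy records, not the corpus).
from itertools import combinations
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

records = [
    {"linked data", "rdf", "semantic web"},
    {"rdf", "sparql", "semantic web"},
    {"big data", "cloud computing"},
    {"linked data", "big data", "cloud computing"},
]

terms = sorted(set().union(*records))
idx = {t: i for i, t in enumerate(terms)}
co = np.zeros((len(terms), len(terms)))
for rec in records:                        # count keyword co-occurrences
    for a, b in combinations(sorted(rec), 2):
        co[idx[a], idx[b]] += 1
        co[idx[b], idx[a]] += 1

dist = 1.0 / (1.0 + co)                    # frequent co-occurrence = close
np.fill_diagonal(dist, 0.0)
condensed = dist[np.triu_indices(len(terms), 1)]
Z = linkage(condensed, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
for t, c in zip(terms, labels):
    print(t, c)
```

The resulting clusters are what strategic diagrams and VOSviewer-style maps then position by density and centrality.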

  4. Activated sludge characterization through microscopy: A review on quantitative image analysis and chemometric techniques

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Daniela P. [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Amaral, A. Luís [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal); Instituto Politécnico de Coimbra, ISEC, DEQB, Rua Pedro Nunes, Quinta da Nora, 3030-199 Coimbra (Portugal); Ferreira, Eugénio C., E-mail: ecferreira@deb.uminho.pt [IBB-Institute for Biotechnology and Bioengineering, Centre of Biological Engineering, Universidade do Minho, Campus de Gualtar, 4710-057 Braga (Portugal)

    2013-11-13

    Highlights: • Quantitative image analysis shows potential to monitor activated sludge systems. • Staining techniques increase the potential for detection of operational problems. • Chemometrics combined with quantitative image analysis is valuable for process monitoring. -- Abstract: In wastewater treatment processes, and particularly in activated sludge systems, efficiency is quite dependent on the operating conditions, and a number of problems may arise due to sludge structure and proliferation of specific microorganisms. In fact, bacterial community and protozoa identification by microscopy inspection is already routinely employed in a considerable number of cases. Furthermore, quantitative image analysis techniques have been increasingly used over the years for the assessment of aggregate and filamentous bacteria properties. These procedures are able to provide an ever-growing amount of data for wastewater treatment processes, for which chemometric techniques can be a valuable tool. However, the determination of microbial communities’ properties remains a challenge in spite of the great diversity of microscopy techniques applied. In this review, activated sludge characterization is discussed, highlighting aggregate structure and filamentous bacteria determination by image analysis on bright-field, phase-contrast, and fluorescence microscopy. An in-depth analysis is performed to summarize the many new findings that have been obtained, and future developments for these biological processes are further discussed.

  5. Research Progress in the Speciation of Arsenic with Hyphenated Techniques

    Institute of Scientific and Technical Information of China (English)

    李林林; 朱英存

    2013-01-01

    Currently, the main analysis techniques for arsenic speciation are couplings of chromatography with atomic spectrometry and mass spectrometry, especially the coupling of HPLC with ICP-OES or ICP-MS. These methods are accurate, reliable and highly sensitive, but they cannot provide structural information on arsenic compounds. For that purpose, the coupling of chromatography with electrospray ionization mass spectrometry (ESI-MS) is used as the main method for qualitative analysis; the insufficient sensitivity and vulnerability to matrix interference of ESI-MS can be overcome by complex pretreatment techniques. In this article, hyphenated techniques are evaluated with particular emphasis on interfacing separation with element-selective detection and on identification of the arsenic compounds detected.

  6. New Techniques in Time-Frequency Analysis: Adaptive Band, Ultra-Wide Band and Multi-Rate Signal Processing

    Science.gov (United States)

    2016-03-02

    There are numerous motivations for extending signal processing, and in particular sampling theory, to non-Euclidean spaces. The project led to the development of new techniques and theories in the analysis of signals. These techniques and theories were extensions of known techniques -- sampling, Fourier, Gabor and wavelet analysis, and

  7. Automated Techniques for the Qualitative Analysis of Ecological Models: Continuous Models

    Directory of Open Access Journals (Sweden)

    Lynn van Coller

    1997-06-01

    Full Text Available The mathematics required for a detailed analysis of the behavior of a model can be formidable. In this paper, I demonstrate how various computer packages can aid qualitative analyses by implementing techniques from dynamical systems theory. Because computer software is used to obtain the results, the techniques can be used by nonmathematicians as well as mathematicians. In-depth analyses of complicated models that were previously very difficult to study can now be done. Because the paper is intended as an introduction to applying the techniques to ecological models, I have included an appendix describing some of the ideas and terminology. A second appendix shows how the techniques can be applied to a fairly simple predator-prey model and establishes the reliability of the computer software. The main body of the paper discusses a ratio-dependent model. The new techniques highlight some limitations of isocline analyses in this three-dimensional setting and show that the model is structurally unstable. Another appendix describes a larger model of a sheep-pasture-hyrax-lynx system. Dynamical systems techniques are compared with a traditional sensitivity analysis and are found to give more information. As a result, an incomplete relationship in the model is highlighted. I also discuss the resilience of these models to both parameter and population perturbations.

  8. Advanced spatio-temporal filtering techniques for photogrammetric image sequence analysis in civil engineering material testing

    Science.gov (United States)

    Liebold, F.; Maas, H.-G.

    2016-01-01

    The paper shows advanced spatial, temporal and spatio-temporal filtering techniques which may be used to reduce noise effects in photogrammetric image sequence analysis tasks and tools. As a practical example, the techniques are validated in a photogrammetric spatio-temporal crack detection and analysis tool applied in load tests in civil engineering material testing. The load test technique is based on monocular image sequences of a test object under varying load conditions. The first image of a sequence is defined as a reference image under zero load, wherein interest points are determined and connected in a triangular irregular network structure. For each epoch, these triangles are compared to the reference image triangles to search for deformations. The result of the feature point tracking and triangle comparison process is a spatio-temporally resolved strain value field, wherein cracks can be detected, located and measured via local discrepancies. The strains can be visualized as a color-coded map. In order to improve the measuring system and to reduce noise, the strain values of each triangle must be treated in a filtering process. The paper shows the results of various filter techniques in the spatial and in the temporal domain as well as spatio-temporal filtering techniques applied to these data. The best results were obtained by a bilateral filter in the spatial domain and by a spatio-temporal EOF (empirical orthogonal function) filtering technique.
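
The bilateral filter that performed best in the spatial domain weights each neighbour by both spatial distance and strain-value difference, so measurement noise is smoothed while crack-induced strain jumps survive. A naive sketch on a synthetic strain map follows; the parameters and data are illustrative, not the paper's.

```python
# Naive bilateral filter for a 2-D strain map: spatial Gaussian weight
# times a range weight on strain difference, so edges (cracks) are kept.
import numpy as np

def bilateral_filter(strain, sigma_s=1.5, sigma_r=0.05, radius=2):
    h, w = strain.shape
    out = np.empty_like(strain)
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - radius), min(h, i + radius + 1)
            j0, j1 = max(0, j - radius), min(w, j + radius + 1)
            win = strain[i0:i1, j0:j1]
            yy, xx = np.mgrid[i0:i1, j0:j1]
            w_spatial = np.exp(-((yy - i)**2 + (xx - j)**2) / (2 * sigma_s**2))
            w_range = np.exp(-((win - strain[i, j])**2) / (2 * sigma_r**2))
            weights = w_spatial * w_range
            out[i, j] = (weights * win).sum() / weights.sum()
    return out

# Synthetic strain field: a sharp "crack" step plus measurement noise.
rng = np.random.default_rng(1)
clean = np.zeros((20, 20))
clean[:, 10:] = 0.2                        # strain jump across a crack
noisy = clean + 0.01 * rng.standard_normal(clean.shape)
smoothed = bilateral_filter(noisy)
```

A plain Gaussian blur would smear the step across several columns; the range weight here suppresses averaging across the discontinuity.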

  9. Research progress of high-concentration mash ethanol fermentation techniques

    Institute of Scientific and Technical Information of China (English)

    张强; 韩德明; 李明堂

    2014-01-01

    High-concentration mash fermentation is characterized by high cell density, high product concentration, and high production rate, making it a development goal of the ethanol industry. High-concentration mash ethanol fermentation techniques save process water, improve equipment utilization, and reduce energy consumption, and are therefore an important way to improve the efficiency of ethanol production; research on them has considerable practical significance. This paper reviews research progress in high-concentration mash ethanol fermentation techniques, introducing the definition and advantages of high-concentration mash fermentation and the factors that influence it. Lowering the viscosity of the fermentation mash, screening high-tolerance ethanol yeasts, changing the fermentation process model, and adding appropriate enzymes and nutrients are the main ways to achieve high-concentration mash ethanol fermentation; among them, screening high-tolerance ethanol yeasts is the key.

  10. Real-time flight test analysis and display techniques for the X-29A aircraft

    Science.gov (United States)

    Hicks, John W.; Petersen, Kevin L.

    1989-01-01

The X-29A advanced technology demonstrator flight envelope expansion program and the subsequent flight research phase gave impetus to the development of several innovative real-time analysis and display techniques. These new techniques produced significant improvements in flight test productivity, flight research capabilities, and flight safety. These techniques include real-time measurement and display of in-flight structural loads, dynamic structural mode frequency and damping, flight control system dynamic stability and control response, aeroperformance drag polars, and aircraft specific excess power. Several of these analysis techniques also provided for direct comparisons of flight-measured results with analytical predictions. The aeroperformance technique was made possible by the concurrent development of a new simplified in-flight net thrust computation method. To achieve these levels of on-line flight test analysis, integration of ground and airborne systems was required. The capability of NASA Ames Research Center, Dryden Flight Research Facility's Western Aeronautical Test Range was a key factor in enabling the implementation of these methods.

  11. An Analysis Technique/Automated Tool for Comparing and Tracking Analysis Modes of Different Finite Element Models

    Science.gov (United States)

    Towner, Robert L.; Band, Jonathan L.

    2012-01-01

    An analysis technique was developed to compare and track mode shapes for different Finite Element Models. The technique may be applied to a variety of structural dynamics analyses, including model reduction validation (comparing unreduced and reduced models), mode tracking for various parametric analyses (e.g., launch vehicle model dispersion analysis to identify sensitivities to modal gain for Guidance, Navigation, and Control), comparing models of different mesh fidelity (e.g., a coarse model for a preliminary analysis compared to a higher-fidelity model for a detailed analysis) and mode tracking for a structure with properties that change over time (e.g., a launch vehicle from liftoff through end-of-burn, with propellant being expended during the flight). Mode shapes for different models are compared and tracked using several numerical indicators, including traditional Cross-Orthogonality and Modal Assurance Criteria approaches, as well as numerical indicators obtained by comparing modal strain energy and kinetic energy distributions. This analysis technique has been used to reliably identify correlated mode shapes for complex Finite Element Models that would otherwise be difficult to compare using traditional techniques. This improved approach also utilizes an adaptive mode tracking algorithm that allows for automated tracking when working with complex models and/or comparing a large group of models.
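
The Modal Assurance Criterion mentioned above can be sketched in a few lines of NumPy. The tracking step here simply pairs each reference mode with the new mode giving the highest MAC value; this is a simplified stand-in for the adaptive tracking algorithm, which the abstract does not describe in enough detail to reproduce.

```python
import numpy as np

def mac_matrix(phi_a, phi_b):
    """Modal Assurance Criterion between two mode-shape sets.

    phi_a: (n_dof, n_modes_a), phi_b: (n_dof, n_modes_b).
    MAC[i, j] near 1 means mode i of set A correlates strongly with
    mode j of set B; near 0 means the shapes are unrelated."""
    num = np.abs(phi_a.conj().T @ phi_b) ** 2
    den = np.outer(np.sum(np.abs(phi_a)**2, axis=0),
                   np.sum(np.abs(phi_b)**2, axis=0))
    return num / den

def track_modes(phi_ref, phi_new):
    """Pair each reference mode with the best-matching new mode by max MAC."""
    mac = mac_matrix(phi_ref, phi_new)
    return np.argmax(mac, axis=1), mac
```

For orthonormal reference shapes, `mac_matrix(phi, phi)` is the identity, and permuting the columns of the second set recovers the permutation, which is the essence of tracking a mode whose ordering changes between models.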

  12. Comparative Analysis of Various Image Fusion Techniques For Biomedical Images: A Review

    Directory of Open Access Journals (Sweden)

    Nayera Nahvi,

    2014-05-01

Full Text Available Image fusion is the process of combining the relevant information from a set of images into a single image, wherein the resultant fused image is more informative and complete than any of the input images. This paper discusses the implementation of the DWT technique on different images to produce a fused image with higher information content. Because DWT is a more recent technique than simple image fusion and pyramid-based image fusion, it is adopted as the fusion technique in this paper. Other methods, such as Principal Component Analysis (PCA)-based fusion, Intensity-Hue-Saturation (IHS) transform-based fusion, and high-pass filtering, are also discussed. A new algorithm is proposed using the Discrete Wavelet Transform and different fusion rules, including pixel averaging, min-max, and max-min methods, for medical image fusion.
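
The fusion rules named above can be sketched concretely. The code below is not the authors' algorithm: a single-level Haar transform stands in for the DWT, and the fusion rules shown are simple pixel averaging, maximum selection, and a wavelet-domain rule that averages the approximation band while keeping the larger-magnitude detail coefficients.

```python
import numpy as np

def fuse_average(a, b):
    """Pixel-averaging fusion: simple, but tends to lower contrast."""
    return (a.astype(float) + b.astype(float)) / 2.0

def fuse_max(a, b):
    """Maximum-selection fusion: keep the brighter pixel from either image."""
    return np.maximum(a, b)

def fuse_haar(a, b):
    """One-level Haar wavelet fusion: average the approximation band,
    take the larger-magnitude coefficient in each detail band."""
    def haar2d(x):
        ll = (x[0::2, 0::2] + x[0::2, 1::2] + x[1::2, 0::2] + x[1::2, 1::2]) / 4
        lh = (x[0::2, 0::2] + x[0::2, 1::2] - x[1::2, 0::2] - x[1::2, 1::2]) / 4
        hl = (x[0::2, 0::2] - x[0::2, 1::2] + x[1::2, 0::2] - x[1::2, 1::2]) / 4
        hh = (x[0::2, 0::2] - x[0::2, 1::2] - x[1::2, 0::2] + x[1::2, 1::2]) / 4
        return ll, lh, hl, hh

    def ihaar2d(ll, lh, hl, hh):
        h, w = ll.shape
        x = np.empty((2 * h, 2 * w))
        x[0::2, 0::2] = ll + lh + hl + hh
        x[0::2, 1::2] = ll + lh - hl - hh
        x[1::2, 0::2] = ll - lh + hl - hh
        x[1::2, 1::2] = ll - lh - hl + hh
        return x

    A, B = haar2d(a.astype(float)), haar2d(b.astype(float))
    fused = [(A[0] + B[0]) / 2] + [np.where(np.abs(ca) >= np.abs(cb), ca, cb)
                                   for ca, cb in zip(A[1:], B[1:])]
    return ihaar2d(*fused)
```

Fusing an image with itself reconstructs it exactly, which is a quick sanity check that the forward and inverse Haar transforms are consistent.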

  13. A hybrid fringe analysis technique for the elimination of random noise in interferometric wrapped phase maps

    Science.gov (United States)

    Bhat, Gopalakrishna K.

    1994-10-01

A fringe analysis technique is presented that uses the spatial filtering property of the Fourier transform method to eliminate random impulsive noise in wrapped phase maps obtained with the phase stepping technique. Phase noise is converted into intensity noise by transforming the wrapped phase map into a continuous fringe pattern inside the digital image processor. The Fourier transform method is then employed to filter out the intensity noise and recover a clean wrapped phase map. Computer-generated carrier fringes are used to preserve the sign information. The technique makes the two-dimensional phase unwrapping process less involved because it eliminates local phase fluctuations, which act as pseudo 2π discontinuities. The technique is applied to the elimination of noise in a phase map obtained using electro-optic holography.
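
The core idea, converting phase noise into intensity noise, filtering in the Fourier domain, and recovering the phase, can be sketched as follows. This is a simplified version without the carrier fringes described in the paper, and the low-pass cutoff fraction is an illustrative assumption.

```python
import numpy as np

def filter_wrapped_phase(phase, keep_frac=0.15):
    """Remove impulsive noise from a wrapped phase map by low-pass
    filtering the equivalent complex fringe pattern in the Fourier
    domain, then taking the argument again.

    keep_frac: fraction of the spectrum (per axis) retained."""
    fringe = np.exp(1j * phase)                       # phase noise -> intensity noise
    spec = np.fft.fftshift(np.fft.fft2(fringe))
    h, w = phase.shape
    yy, xx = np.mgrid[:h, :w]
    cy, cx = h // 2, w // 2
    mask = ((yy - cy) ** 2 / (keep_frac * h) ** 2 +
            (xx - cx) ** 2 / (keep_frac * w) ** 2) <= 1.0   # elliptical low-pass
    clean = np.fft.ifft2(np.fft.ifftshift(spec * mask))
    return np.angle(clean)                            # back to a wrapped phase map
```

Because the filtering happens on the complex fringe rather than the raw phase values, the 2π wrap lines are not smeared, which is what makes the subsequent unwrapping step easier.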

  14. Rhinoplasty - analysis of the techniques used in a service in the south of Brazil

    Directory of Open Access Journals (Sweden)

    Pasinato, Rogério C

    2008-09-01

Full Text Available Introduction: In rhinoplasty, as in other surgeries, adequate exposure of the manipulated structures is essential for a good surgical result. Various techniques are used, and these may vary mainly because of the anatomical alterations found. Objective: To evaluate the most common surgical techniques and maneuvers used in our service. Method: Retrospective analysis of the surgical descriptions of patients submitted to rhinoplasty in the Otorhinolaryngology Department of the Clinical Hospital - UFPR in 2007. Results: 79 patients were evaluated; basic-technique rhinoplasty was performed in 86% of them, while the delivery and external rhinoplasty approaches were used in 6.4% and 7.6% of cases, respectively. Conclusion: In our service, basic-technique rhinoplasty was performed in the great majority of patients.

  15. Static progressive versus three-point elbow extension splinting: a mathematical analysis.

    Science.gov (United States)

    Chinchalkar, Shrikant J; Pearce, Joshua; Athwal, George S

    2009-01-01

Elbow joint contractures are often treated with static progressive, dynamic, turnbuckle, or serial static splinting. These splint designs are effective in regaining functional elbow range of motion because of the high forces applied to the contracted tissues; however, regaining terminal elbow extension remains a challenge. Static progressive splints are commonly used to initiate treatment but are considered less effective in regaining terminal extension. Recently, the concept of converting a static progressive splint into a three-point static progressive splint (TPSPS) to regain terminal extension has been introduced. This paper mathematically analyzes the compressive and rotational forces in static progressive splints and TPSPSs. Our hypothesis was that three-point static progressive splinting is superior to the standard static progressive elbow extension splint in applying rotational forces to the elbow at terminal extension.
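
The force decomposition at the heart of such an analysis can be illustrated with elementary trigonometry. The sketch below is not the paper's derivation, and all force magnitudes and angles are hypothetical; it only shows why a force applied at a shallow angle to the forearm is mostly compressive rather than rotational.

```python
import math

def force_components(force_n, angle_deg):
    """Decompose a splint force into its rotational component
    (perpendicular to the forearm, which extends the elbow) and its
    compressive component (along the forearm, loading the joint).

    angle_deg: angle between the force line and the forearm axis."""
    theta = math.radians(angle_deg)
    rotational = force_n * math.sin(theta)
    compressive = force_n * math.cos(theta)
    return rotational, compressive

# Near terminal extension, a conventional static progressive splint tends
# to pull at a shallow angle, so most of the force is compressive:
rot_10, comp_10 = force_components(20.0, 10.0)   # ~3.5 N rotational, ~19.7 N compressive
# A three-point design can keep the applied force closer to perpendicular:
rot_80, comp_80 = force_components(20.0, 80.0)   # ~19.7 N rotational, ~3.5 N compressive
```

The comparison makes the hypothesis concrete: for the same applied force, the more perpendicular configuration delivers far more of it as an extending moment.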

  16. Exploring the potential of data mining techniques for the analysis of accident patterns

    DEFF Research Database (Denmark)

    Prato, Carlo Giacomo; Bekhor, Shlomo; Galtzur, Ayelet

    2010-01-01

Research in road safety faces major challenges: identification of the most significant determinants of traffic accidents, recognition of the most recurrent accident patterns, and allocation of the resources necessary to address the most relevant issues. This paper intends to comprehend which data mining...... and association rules) data mining techniques are implemented for the analysis of traffic accidents that occurred in Israel between 2001 and 2004. Results show that descriptive techniques are useful for classifying the large number of analyzed accidents, even though they introduce problems with respect to the clear...... importance of input and intermediate neurons, and the relative importance of hundreds of association rules. Further research should investigate whether limiting the analysis to fatal accidents would simplify the task of data mining techniques in recognizing accident patterns without the “noise” probably...

  17. THE RESEARCH TECHNIQUES FOR ANALYSIS OF MECHANICAL AND TRIBOLOGICAL PROPERTIES OF COATING-SUBSTRATE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kinga CHRONOWSKA-PRZYWARA

    2014-06-01

Full Text Available The article presents research techniques for the analysis of both the mechanical and the tribological properties of thin coatings applied to highly loaded machine elements. At the Institute of Machine Design and Exploitation, AGH University of Science and Technology, second-level Mechanical Engineering students study tribology in a laboratory class, learning techniques for the mechanical and tribological testing of thin, hard coatings deposited by PVD and CVD technologies. The laboratory program includes micro- and nanohardness and Young's modulus measurements by instrumented indentation, and analysis of coating-to-substrate adhesion by scratch testing. The tribological properties of the coating-substrate systems are studied using various techniques, mainly under point-contact load conditions with ball-on-disc and block-on-ring tribometers, as well as by the ball-cratering method in strongly abrasive suspensions.
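
Instrumented indentation extracts hardness and elastic modulus from the measured load-depth curve. The sketch below shows the textbook Oliver-Pharr relations in simplified form (the geometry correction factor β is taken as 1, and the contact area and unloading stiffness are assumed to be already known); it is an illustration, not the laboratory's procedure.

```python
import math

def hardness(p_max_mN, area_um2):
    """Indentation hardness H = P_max / A_c.

    With P_max in mN and the contact area A_c in um^2,
    the result is directly in GPa."""
    return p_max_mN / area_um2

def reduced_modulus(stiffness_mN_per_um, area_um2):
    """Oliver-Pharr reduced modulus E_r = (sqrt(pi) / 2) * S / sqrt(A_c),
    with the unloading contact stiffness S in mN/um and A_c in um^2;
    the result is in GPa."""
    return (math.sqrt(math.pi) / 2) * stiffness_mN_per_um / math.sqrt(area_um2)
```

For example, a 10 mN peak load over a 1 µm² contact area gives a hardness of 10 GPa, a plausible value for a hard PVD coating.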

  18. Advanced analysis technique for the evaluation of linear alternators and linear motors

    Science.gov (United States)

    Holliday, Jeffrey C.

    1995-01-01

    A method for the mathematical analysis of linear alternator and linear motor devices and designs is described, and an example of its use is included. The technique seeks to surpass other methods of analysis by including more rigorous treatment of phenomena normally omitted or coarsely approximated such as eddy braking, non-linear material properties, and power losses generated within structures surrounding the device. The technique is broadly applicable to linear alternators and linear motors involving iron yoke structures and moving permanent magnets. The technique involves the application of Amperian current equivalents to the modeling of the moving permanent magnet components within a finite element formulation. The resulting steady state and transient mode field solutions can simultaneously account for the moving and static field sources within and around the device.

  19. Breath Analysis Using Laser Spectroscopic Techniques: Breath Biomarkers, Spectral Fingerprints, and Detection Limits

    Directory of Open Access Journals (Sweden)

    Peeyush Sahay

    2009-10-01

Full Text Available Breath analysis, a promising new field of medicine and medical instrumentation, potentially offers noninvasive, real-time, point-of-care (POC) disease diagnostics and metabolic status monitoring. Numerous breath biomarkers have been detected and quantified so far using the GC-MS technique. Recent advances in laser spectroscopic techniques and laser sources have driven breath analysis to new heights, moving it from laboratory research to commercial reality. Laser spectroscopic detection techniques not only have the high sensitivity and high selectivity offered by MS-based techniques, but also the advantageous features of near real-time response, low instrument cost, and POC operation. Of the approximately 35 established breath biomarkers, such as acetone, ammonia, carbon dioxide, ethane, methane, and nitric oxide, 14 species in exhaled human breath have been analyzed by high-sensitivity laser spectroscopic techniques, namely tunable diode laser absorption spectroscopy (TDLAS), cavity ringdown spectroscopy (CRDS), integrated cavity output spectroscopy (ICOS), cavity enhanced absorption spectroscopy (CEAS), cavity leak-out spectroscopy (CALOS), photoacoustic spectroscopy (PAS), quartz-enhanced photoacoustic spectroscopy (QEPAS), and optical frequency comb cavity-enhanced absorption spectroscopy (OFC-CEAS). Spectral fingerprints of the measured biomarkers span from the UV to the mid-IR spectral regions, and the detection limits achieved by the laser techniques range from parts-per-million to parts-per-billion levels. Sensors using laser spectroscopic techniques for a few breath biomarkers, e.g., carbon dioxide and nitric oxide, are commercially available. This review presents an update on the latest developments in laser-based breath analysis.
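
Concentration retrieval in laser absorption spectroscopy rests on the Beer-Lambert law. The toy calculation below shows how a measured intensity ratio maps to a number density and then to a mixing ratio; the cross-section, effective path length, and densities are illustrative assumptions, not values from the review.

```python
import math

def absorbance_to_concentration(i0, i, sigma_cm2, path_cm):
    """Beer-Lambert law: I = I0 * exp(-sigma * N * L).

    Given the incident and transmitted intensities, the absorption
    cross-section sigma (cm^2) and path length L (cm), return the
    absorber number density N in molecules/cm^3."""
    return math.log(i0 / i) / (sigma_cm2 * path_cm)

def ppb(n_molecules_cm3, n_total_cm3=2.5e19):
    """Convert a number density to a mixing ratio in parts per billion;
    2.5e19 cm^-3 is roughly air at room temperature and 1 atm."""
    return n_molecules_cm3 / n_total_cm3 * 1e9

# Hypothetical example: a strong line (sigma = 1e-18 cm^2) and a 1 km
# effective path, as a cavity-enhanced method might provide.
sigma, path = 1e-18, 1e5
i0 = 1.0
i = i0 * math.exp(-sigma * 2.5e10 * path)   # simulate a 1 ppb absorber
n_recovered = absorbance_to_concentration(i0, i, sigma, path)
```

The long effective path is the key: at a 1 ppb mixing ratio the fractional absorption here is only about 0.25%, which is why cavity-enhanced techniques with kilometer-scale effective paths reach parts-per-billion detection limits.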

  20. Analysis and assessment of bridge health monitoring mass data—progress in research/development of "Structural Health Monitoring"

    Institute of Scientific and Technical Information of China (English)

    LI AiQun; DING YouLiang; WANG Hao; GUO Tong

    2012-01-01

"Structural Health Monitoring" is a project supported by the National Natural Science Foundation for Distinguished Young Scholars of China (Grant No. 50725828). To meet the urgent requirements of analyzing and assessing mass monitoring data on bridge environmental actions and structural responses, the monitoring of environmental actions and action-effect modeling methods, dynamic performance monitoring and early warning methods, and condition assessment and operation maintenance methods for key members are systematically studied in close combination with the structural characteristics of long-span cable-stayed and suspension bridges. The paper reports the progress of the project as follows. (1) Environmental action modeling methods for long-span bridges are established based on monitoring data of temperature, sustained wind, and typhoons. The action-effect modeling methods are further developed in combination with the multi-scale baseline finite element modeling method for long-span bridges. (2) Identification methods for the global dynamic characteristics and the internal forces of cables and hangers of long-span cable-stayed and suspension bridges are proposed using vibration monitoring data, on the basis of which condition monitoring and early warning methods are developed using the environmental-condition-normalization technique. (3) Analysis methods for the fatigue loading effect on welded details of steel box girders and the temperature and traffic loading effects on expansion joints are presented based on long-term monitoring data of strain and beam-end displacement, on the basis of which service performance assessment and remaining-life prediction methods are developed.
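
A minimal sketch of the environmental-condition-normalization idea: fit a baseline model of a natural frequency against temperature from healthy-condition data, then monitor the residual after the temperature effect is removed. The linear baseline and all numbers below are illustrative assumptions; the project's actual models are far more elaborate.

```python
import numpy as np

def fit_baseline(temps, freqs):
    """Least-squares linear model of natural frequency vs. temperature,
    fitted on monitoring data from the healthy (baseline) condition.
    Returns (slope, intercept)."""
    A = np.vstack([temps, np.ones_like(temps)]).T
    coef, *_ = np.linalg.lstsq(A, freqs, rcond=None)
    return coef

def normalized_residual(coef, temp, freq):
    """Residual after removing the modeled temperature effect; a shift
    persistently beyond the baseline scatter can trigger an early warning."""
    slope, intercept = coef
    return freq - (slope * temp + intercept)
```

Because daily and seasonal temperature swings can move a frequency more than early damage does, comparing normalized residuals rather than raw frequencies is what makes the warning threshold meaningful.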