WorldWideScience

Sample records for analysis techniques progress

  1. Research Progress on Pesticide Residue Analysis Techniques in Agro-products

    Directory of Open Access Journals (Sweden)

    HE Ze-ying

    2016-07-01

    Full Text Available There are constant occurrences of acute pesticide poisoning among consumers and of pesticide residue violations in agro-product import/export trading. Pesticide residue analysis is an important way to protect food safety and the interests of import/export enterprises, and the techniques involved have developed rapidly in recent years. In this review, research progress over the past five years is discussed with respect to sample preparation and instrumental determination. The application, modification and development of the QuEChERS method in sample preparation and the application of tandem mass spectrometry and high-resolution mass spectrometry are reviewed, and implications for the future of the field are discussed.

  2. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1977--July 31, 1978

    International Nuclear Information System (INIS)

    Finston, H.L.; Williams, E.T.

    1978-07-01

    The study of neutron-absorption cross sections of reactor-produced radionuclides has been completed, and results are reported for ²²Na, ¹²⁶I, ¹³⁹Ce, ⁸⁸Y, ¹⁸⁴Re, ¹⁸²Ta, ⁵⁴Mn, and ⁹⁴Zr. The results for ²²Na indicate the existence of a resonance in the thermal region which could explain the discrepancies in the published values for the thermal cross section. The results of air-sampling experiments are described, as is the proton-induced x-ray emission (PIXE) system developed at Brooklyn College. Work on sample preparation and applications of the PIXE technique is given. Progress on a nuclear method to determine fluorine-containing gaseous compounds is reported. Work on solvent extraction with propylene carbonate and experiments involving an acid-base hypothesis are described.

  3. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1978-July 31, 1979

    International Nuclear Information System (INIS)

    Finston, H.L.; Williams, E.T.

    1979-07-01

    Studies of homogeneous liquid-liquid extraction have been extended to include (1) a detailed determination of the phase diagram of the propylene carbonate-water system, (2) the extraction of a large variety of both monodentate and bidentate iron complexes, (3) the solvent extraction characteristics of analogues of propylene carbonate, (4) the behavior under pressure of the propylene carbonate-water system, and (5) the extraction behavior of alkaline earth-TTA chelates. One consequence of these studies was the observation that the addition of ethanol to propylene carbonate-water or to isobutylene carbonate-water yields a single homogeneous phase; subsequent evaporation of the ethanol restores the two immiscible phases. Fast neutron activation analysis has been attempted for the heavy elements Pb, Bi and Tl at the Brookhaven HFBR (in- or near-core position) and at the Brookhaven CLIF facility. The latter appears more promising and we have initiated a collaborative program to use the CLIF facility. A milking system which can provide ca. 16 μCi of carrier-free ²¹²Pb was developed for use in an isotope dilution technique for lead. Collaboration will be proposed with laboratories already determining trace lead by flameless atomic absorption or by concentration by electrodeposition into a hanging drop followed by anodic stripping. The proton X-ray emission (PIXE) system has undergone marked improvement with the acquisition of a new high-resolution Si(Li) detector and a new multichannel analyzer system. Various techniques have been explored to dissolve and prepare samples for PIXE analysis and for verification by atomic absorption analysis.

  4. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1978-July 31, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Finston, H. L.; Williams, E. T.

    1979-07-01

    Studies of homogeneous liquid-liquid extraction have been extended to include (1) a detailed determination of the phase diagram of the propylene carbonate-water system, (2) the extraction of a large variety of both monodentate and bidentate iron complexes, (3) the solvent extraction characteristics of analogues of propylene carbonate, (4) the behavior under pressure of the propylene carbonate-water system, and (5) the extraction behavior of alkaline earth-TTA chelates. One consequence of these studies was the observation that the addition of ethanol to propylene carbonate-water or to isobutylene carbonate-water yields a single homogeneous phase; subsequent evaporation of the ethanol restores the two immiscible phases. Fast neutron activation analysis has been attempted for the heavy elements Pb, Bi and Tl at the Brookhaven HFBR (in- or near-core position) and at the Brookhaven CLIF facility. The latter appears more promising and we have initiated a collaborative program to use the CLIF facility. A milking system which can provide ca. 16 μCi of carrier-free ²¹²Pb was developed for use in an isotope dilution technique for lead. Collaboration will be proposed with laboratories already determining trace lead by flameless atomic absorption or by concentration by electrodeposition into a hanging drop followed by anodic stripping. The proton X-ray emission (PIXE) system has undergone marked improvement with the acquisition of a new high-resolution Si(Li) detector and a new multichannel analyzer system. Various techniques have been explored to dissolve and prepare samples for PIXE analysis and for verification by atomic absorption analysis.

  5. Progress in thin film techniques

    International Nuclear Information System (INIS)

    Weingarten, W.

    1996-01-01

    Progress since the last Workshop is reported on superconducting accelerating RF cavities coated with thin films. The materials investigated are Nb, Nb₃Sn, NbN and NbTiN; the techniques applied are diffusion from the vapour phase (Nb₃Sn, NbN), the bronze process (Nb₃Sn), and sputter deposition on a copper substrate (Nb, NbTiN). Specially designed cavities for sample evaluation by RF methods have been developed (triaxial cavity). New experimental techniques to assess the RF amplitude dependence of the surface resistance are presented (with emphasis on niobium films sputter deposited on copper). Evidence is increasing that these losses are caused by magnetic flux penetration into the surface layer. (R.P.)

  6. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1976--July 31, 1977

    International Nuclear Information System (INIS)

    Finston, H.L.; Williams, E.T.

    1977-07-01

    The neutron temperatures of the BNL Medical Reactor and of the V-11 facility of the HFBR were determined to be 48 °C and 39 °C, respectively. The ²²Na resonance energy is less than 0.01 eV. The development of the PIXE (proton-induced X-ray emission) technique and the operation of the Dynamitron have reached the point where large numbers of samples can be routinely analyzed with high sensitivity. Aerosol samples are being collected in downtown Manhattan and analyzed for their various constituents as a function of particle size, time and elevation. Samples of marine sediment and sludge are also being analyzed by PIXE for their metal content. The PIXE technique has also been applied to analyses of a variety of archaeological artifacts, human tissue, paint, and geological specimens. The metastable isomers ²⁰⁴ᵐPb, ¹⁹⁹ᵐHg, ¹¹¹ᵐCd, ¹¹⁵ᵐIn and ⁸⁷ᵐSr have been produced by inelastic scattering of 3 MeV neutrons, and the excitation functions were determined. A study of the reaction ¹⁹F + p → α + ¹⁶O + γ for the determination of freons in the atmosphere has been initiated. The study of hydrogen overvoltage at single-crystal electrodes will be continued. The effect of varying dichromate concentration on the pH of perchloric acid solutions is the same as previously observed for hydrochloric acid solutions, proving that it was not an effect of the medium; the effect on pH has also been verified over a wider pH range, up to 4.4. Electrophoresis studies of chromium species show three different species: one at pH 1, one at pH 2 to 10.5, and one at pH greater than 4.60. Homogeneous solvent extraction of Fe(III), Cu(II), Ni(II), Zn(II), Pb(II), Cd(II), Ca(II), Ba(II) and Mg(II) with solutions of TTA in propylene carbonate has been investigated.

  7. Progress in automation, robotics and measuring techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2015-01-01

    This book presents recent progress in control, automation, robotics, and measuring techniques. It includes contributions from top experts in these fields, focused on both theory and industrial practice. The individual chapters present a deep analysis of a specific technical problem, in general followed by a numerical analysis, simulation, and the results of an implementation for the solution of a real-world problem. The theoretical results, practical solutions and guidelines presented will be useful both for researchers working in the area of engineering sciences and for practitioners solving industrial problems.

  8. Progress in diagnostic techniques for sc cavities

    International Nuclear Information System (INIS)

    Reece, C.E.

    1988-01-01

    While the routinely achieved performance characteristics of superconducting cavities have now reached a level which makes them useful in large-scale applications, achieving this level has come only through the knowledge gained by systematic studies of performance-limiting phenomena. Despite the very real progress that has been made, the routine performance of superconducting cavities still falls far short of both the theoretical expectations and the performance of a few exceptional examples. It is the task of systematically applied diagnostic techniques to reveal additional information concerning the response of superconducting surfaces to applied RF fields. Here, recent developments in diagnostic techniques are discussed. 18 references, 12 figures

  9. Granulation techniques and technologies: recent progresses.

    Science.gov (United States)

    Shanmugam, Srinivasan

    2015-01-01

    Granulation, the process of particle enlargement by agglomeration, is one of the most significant unit operations in the production of pharmaceutical dosage forms, mostly tablets and capsules. Granulation transforms fine powders into free-flowing, dust-free granules that are easy to compress. Nevertheless, granulation poses numerous challenges due to the high quality requirements for the formed granules in terms of content uniformity and physicochemical properties such as granule size, bulk density, porosity, hardness, moisture and compressibility, together with the physical and chemical stability of the drug. Granulation processes can be divided into two types: wet granulation, which utilizes a liquid in the process, and dry granulation, which requires no liquid. Selecting the type of process requires thorough knowledge of the physicochemical properties of the drug and excipients and of the required flow and release properties, to name a few. Among currently available technologies, spray drying, roller compaction, high-shear mixing, and fluid-bed granulation are worthy of note. Like any other scientific field, pharmaceutical granulation technology also continues to change, and the arrival of novel and innovative technologies is inevitable. This review focuses on recent progress in granulation techniques and technologies such as pneumatic dry granulation, reverse wet granulation, steam granulation, moisture-activated dry granulation, thermal adhesion granulation, freeze granulation, and foamed binder (foam) granulation, giving an overview with a short description of each development along with its significance and limitations.

  10. Progress involving new techniques for liposome preparation

    Directory of Open Access Journals (Sweden)

    Zhenjun Huang

    2014-08-01

    Full Text Available The article presents a review of new techniques being used for the preparation of liposomes; a total of 28 publications were examined. In addition to the theories, characteristics and problems associated with traditional methods, the advantages and drawbacks of the latest techniques are reviewed. In the light of developments in many relevant areas, a variety of new techniques are being used for liposome preparation, and each of these new techniques has particular advantages over conventional preparation methods. However, there are still some problems associated with these new techniques that could hinder their application, and further improvements are needed. Generally speaking, thanks to the introduction of these latest techniques, liposome preparation is now an improved procedure. These applications promote not only advances in liposome research but also methods for their production on an industrial scale.

  11. Granulation techniques and technologies: recent progresses

    OpenAIRE

    Shanmugam, Srinivasan

    2015-01-01

    Granulation, the process of particle enlargement by agglomeration, is one of the most significant unit operations in the production of pharmaceutical dosage forms, mostly tablets and capsules. Granulation transforms fine powders into free-flowing, dust-free granules that are easy to compress. Nevertheless, granulation poses numerous challenges due to the high quality requirements for the formed granules in terms of content uniformity and physicochemical proper...

  12. Automatic ultrasound technique to measure angle of progression during labor.

    Science.gov (United States)

    Conversano, F; Peccarisi, M; Pisani, P; Di Paola, M; De Marco, T; Franchini, R; Greco, A; D'Ambrogio, G; Casciaro, S

    2017-12-01

    To evaluate the accuracy and reliability of an automatic ultrasound technique for assessment of the angle of progression (AoP) during labor, thirty-nine pregnant women in the second stage of labor, with the fetus in cephalic presentation, underwent conventional labor management with an additional translabial sonographic examination. AoP was measured in a total of 95 acquisition sessions, both automatically by an innovative algorithm and manually by an experienced sonographer who was blinded to the algorithm outcome. The results of the manual measurement were used as the reference against which the performance of the algorithm was assessed. In order to overcome the difficulties commonly encountered when visualizing the pubic symphysis by sonography, the AoP was measured by taking as the symphysis landmark its centroid rather than its distal point, thereby assuring high measurement reliability and reproducibility while maintaining objectivity and accuracy in the evaluation of the progression of labor. There was a strong and statistically significant correlation between AoP values measured by the algorithm and the reference values (r = 0.99, P < 0.001). The high accuracy of the automatic method was also highlighted by the high coefficient of determination (r² = 0.98) and the low residual errors (root mean square error = 2°27' (2.1%)). The global agreement between the two methods, assessed through Bland-Altman analysis, showed a negligible mean difference of 1°1' (limits of agreement, 4°29'). The proposed automatic algorithm is a reliable technique for measurement of the AoP. Its (relative) operator-independence has the potential to reduce human errors and speed up ultrasound acquisition time, which should facilitate the management of women during labor. Copyright © 2017 ISUOG. Published by John Wiley & Sons Ltd.
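
    The agreement statistics reported above (correlation plus a Bland-Altman bias and limits of agreement) can be sketched in a few lines. The paired measurements below are synthetic stand-ins, not the study's data; only the statistical procedure is illustrated.

    ```python
    import numpy as np

    # Hypothetical paired AoP measurements (degrees): manual reference vs.
    # automatic algorithm output, for 95 acquisition sessions (values invented).
    rng = np.random.default_rng(0)
    manual = rng.uniform(90, 160, size=95)
    auto = manual + rng.normal(1.0, 2.2, size=95)  # small bias + measurement noise

    # Pearson correlation between the two methods
    r = np.corrcoef(manual, auto)[0, 1]

    # Bland-Altman statistics: mean difference (bias) and 95% limits of agreement
    diff = auto - manual
    bias = diff.mean()
    sd = diff.std(ddof=1)
    loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

    print(f"r = {r:.3f}, bias = {bias:.2f} deg, LoA = [{loa_low:.2f}, {loa_high:.2f}]")
    ```

    With real data, a narrow interval between `loa_low` and `loa_high` is what justifies calling the two methods interchangeable.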

  13. Using novel computer-assisted linguistic analysis techniques to assess the timeliness and impact of FP7 Health’s research – a work in progress report

    Energy Technology Data Exchange (ETDEWEB)

    Stanciauskas, V.; Brozaitis, H.; Manola, N.; Metaxas, O.; Galsworthy, M.

    2016-07-01

    This paper presents the ongoing developments of the ex-post evaluation of the Health theme in FP7, which will be finalised in early 2017. The evaluation was launched by DG Research and Innovation, European Commission. Among other questions, the evaluation asked for an assessment of the structuring effect of FP7 Health on the European Research Area and of the timeliness of the research performed. To this end, the evaluation team applied two innovative computer-assisted linguistic analysis techniques to address these questions: dynamic topic modelling and network analysis of co-publications. The topic model built for this evaluation contributed to a comprehensive mapping of FP7 Health's research activities and to the building of a dynamic topic model that had not been attempted in previous evaluations of the Framework Programmes. Our applied network analysis of co-publications proved to be a powerful tool in determining the structuring effect of FP7 Health to a level of detail which, again, had not been reached in previous evaluations of EU-funded research programmes. (Author)
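
    A minimal sketch of the co-publication network idea: organisations appearing on the same publication are linked, and the resulting graph can be ranked by number of distinct collaboration partners. The institution names and records below are invented for illustration; the paper's actual pipeline is not specified here.

    ```python
    from itertools import combinations
    from collections import defaultdict

    # Toy co-publication records: each publication lists its participating
    # institutions (all names hypothetical).
    publications = [
        ["Inst A", "Inst B"],
        ["Inst A", "Inst C", "Inst D"],
        ["Inst B", "Inst C"],
        ["Inst E", "Inst F"],
    ]

    # Build a weighted co-publication graph: edge weight = number of joint papers
    edges = defaultdict(int)
    for authors in publications:
        for u, v in combinations(sorted(authors), 2):
            edges[(u, v)] += 1

    # Degree = number of distinct collaboration partners per institution
    degree = defaultdict(set)
    for u, v in edges:
        degree[u].add(v)
        degree[v].add(u)

    most_connected = max(degree, key=lambda k: len(degree[k]))
    print(most_connected, len(degree[most_connected]))
    ```

    On real FP7 data, metrics such as degree, components, or centrality over this graph are what quantify a "structuring effect".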

  14. Prediction of brain tumor progression using a machine learning technique

    Science.gov (United States)

    Shen, Yuzhong; Banerjee, Debrup; Li, Jiang; Chandler, Adam; Shen, Yufei; McKenzie, Frederic D.; Wang, Jihong

    2010-03-01

    A machine learning technique is presented for assessing brain tumor progression by exploring six patients' complete MRI records, scanned during their visits over the past two years. There are ten MRI series, including the diffusion tensor image (DTI), for each visit. After registering all series to the corresponding DTI scan of the first visit, annotated normal and tumor regions were overlaid. The intensity value of each pixel inside the annotated regions was then extracted across all ten MRI series to compose a 10-dimensional vector. Each feature vector falls into one of three categories: normal, tumor, and normal but progressed to tumor at a later time. In this preliminary study, we focused on the trend of brain tumor progression during three consecutive visits, i.e., visits A, B, and C. A machine learning algorithm was trained using the data containing information from visit A to visit B, and the trained model was used to predict tumor progression from visit A to visit C. Preliminary results showed that prediction of brain tumor progression is feasible: an average pixel-wise accuracy of 80.9% was achieved for tumor progression prediction at visit C.
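
    The pipeline described (10-dimensional per-pixel feature vectors, train on the A→B interval, evaluate on A→C) can be sketched with synthetic data. The abstract does not name the classifier, so a simple nearest-centroid classifier stands in here; the feature distributions are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic stand-in for 10-dimensional per-pixel MRI feature vectors
    # (one intensity per series); class means are invented for illustration.
    def sample(mean, n):
        return rng.normal(mean, 1.0, size=(n, 10))

    X_train = np.vstack([sample(0.0, 200), sample(3.0, 200)])  # visit A->B data
    y_train = np.array([0] * 200 + [1] * 200)                  # 0=normal, 1=tumor
    X_test = np.vstack([sample(0.0, 100), sample(3.0, 100)])   # visit A->C data
    y_test = np.array([0] * 100 + [1] * 100)

    # Nearest-centroid classifier: a minimal stand-in for the (unspecified)
    # machine learning algorithm used in the study.
    centroids = np.stack([X_train[y_train == c].mean(axis=0) for c in (0, 1)])
    dists = np.linalg.norm(X_test[:, None, :] - centroids[None, :, :], axis=2)
    y_pred = dists.argmin(axis=1)

    accuracy = (y_pred == y_test).mean()
    print(f"pixel-wise accuracy: {accuracy:.1%}")
    ```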

  15. Progressive methods in multiple criteria decision analysis

    OpenAIRE

    Meyer, Patrick

    2007-01-01

    Our work mainly focusses on the study and the development of progressive methods in the field of Multiple Criteria Decision Analysis, i.e., iterative procedures presenting partial conclusions to the Decision Maker that can be refined at further steps of the analysis. The thesis is divided into three parts. The first one is intended to be a general analysis of the concept of progressiveness. The last two parts develop progressive methods related first to Multiattribute Value Theory and sec...

  16. Progress in diagnostic techniques for SC [superconducting] cavities

    International Nuclear Information System (INIS)

    Reece, C.E.

    1988-01-01

    Despite the very real progress that has been made, the routine performance of superconducting cavities still falls far short of both the theoretical expectations and the performance of a few exceptional examples. It is the task of systematically applied diagnostic techniques to reveal additional information concerning the response of superconducting surfaces to applied RF fields. In this paper we direct our attention to recent developments in diagnostic techniques, such as thermometry in superfluid helium and scanning laser acoustic microscopy. 18 refs., 12 figs

  17. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
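
    A toy illustration of why higher signal efficiency and purity improve sensitivity: scan a one-dimensional cut on a discriminating variable and keep the value maximizing the common s/√b figure of merit. All distributions, yields and scale factors below are invented; this is not any experiment's actual selection.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy discriminant: a rare signal over a copious background
    signal = rng.normal(1.0, 0.5, 500)          # simulated signal events
    background = rng.normal(0.0, 0.5, 50_000)   # simulated background events
    s_scale, b_scale = 0.01, 0.001              # simulation-to-expected-yield weights

    # Scan the cut and keep the value maximizing s / sqrt(b), a standard
    # figure of merit for expected discovery significance.
    cuts = np.linspace(-1.0, 3.0, 81)
    fom = []
    for c in cuts:
        s = (signal > c).sum() * s_scale
        b = ((background > c).sum() + 1) * b_scale  # +1 regularizes empty tails
        fom.append(s / np.sqrt(b))
    best_cut = cuts[int(np.argmax(fom))]
    print(f"optimal cut: discriminant > {best_cut:.2f}")
    ```

    Because the signal sits to the right of the background, the optimal cut lands between the two means, trading signal efficiency against background rejection exactly as the abstract describes.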

  18. IAEA progress report I - Study of archeological objects using PIXE analytical technique

    International Nuclear Information System (INIS)

    Roumie, M.

    2005-01-01

    This is the first IAEA progress report for the period 2005-2006 (CRP number F23023). Since 1999 a particle accelerator, located at the Lebanese Atomic Energy Commission in Beirut, has been devoted to elemental analysis in different domains of application: Archeology, Environment and Material Science. It constitutes the first and unique, till now, ion beam analysis facility in Lebanon. Several nuclear analytical techniques are performed, such as PIXE, RBS, PIGE, ERDA and others. (author)

  19. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  20. Analysis of breast cancer progression using principal component ...

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    We use this set of genes and a new consensus ensemble k-clustering technique, which averages over several clustering methods and many data perturbations, to identify strong, stable clusters. We also define a simple criterion to find the optimum number of ...
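
    A consensus ensemble clustering of the kind described can be sketched as follows. The data, the single base method (a minimal k-means), and the 0.9 stability threshold are all illustrative assumptions rather than the paper's actual procedure; the real method averages over several clustering algorithms as well as perturbations.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Two well-separated synthetic "expression profile" groups (30 samples each)
    X = np.vstack([rng.normal(0.0, 0.5, (30, 5)), rng.normal(4.0, 0.5, (30, 5))])
    n = len(X)

    def kmeans(data, k=2, iters=10):
        # Minimal Lloyd's algorithm with a deterministic spread-out initialization
        order = data.sum(axis=1).argsort()
        centers = data[[order[0], order[-1]]]
        for _ in range(iters):
            labels = np.linalg.norm(data[:, None] - centers[None], axis=2).argmin(1)
            centers = np.stack([data[labels == c].mean(0) for c in range(k)])
        return labels

    # Co-association (consensus) matrix: fraction of perturbed runs in which
    # each pair of samples lands in the same cluster.
    runs = 25
    consensus = np.zeros((n, n))
    for _ in range(runs):
        labels = kmeans(X + rng.normal(0, 0.1, X.shape))  # data perturbation
        consensus += labels[:, None] == labels[None, :]
    consensus /= runs

    # "Strong, stable clusters" = pairs co-clustered in nearly every run
    stable = consensus > 0.9
    print(stable[:30, :30].all(), stable[30:, 30:].all(), stable[0, 59])
    ```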

  1. Progress on acoustic techniques for LMFBR structural surveillance

    International Nuclear Information System (INIS)

    Burton, E.J.; Bentley, P.G.; McKnight, J.A.

    1980-01-01

    Acoustic techniques are being developed to monitor remotely the incipient events of various modes of failure. Topics have been selected from the development programme which are either of special importance or in which significant advances have been made recently. Ultrasonic inspection of stainless steel welds is difficult, and one alternative approach being explored is to identify manufacturing defects during fabrication by monitoring the welding processes. Preliminary measurements are described of the acoustic events recorded during deliberately defective welding tests in the laboratory, and some initial analysis using pattern recognition techniques is described. The assessment of structural failures using probability analysis has emphasised the potential value of continuous monitoring during operation, and this has led to the investigation of vibrational analysis and acoustic emission as monitoring techniques. Mechanical failure from fatigue may be anticipated from measurement of vibrational modes, and experience from PFR and from models has indicated the depth of detailed understanding required to achieve this. In the laboratory, a vessel with an artificial defect has been pressurised to failure; detection of the weak stress-wave emissions was possible but difficult, and the prospects for on-line monitoring are discussed. Ultrasonic technology for providing images of components immersed in the opaque sodium of LMFBRs is being developed. Images are formed by the physical scanning of a target using transducers in a pulse-echo mode. Lead zirconate transducers have been developed which can be deployed during reactor shut-down. The first application will be to examine a limited area of the core of PFR. Handling the data from such an experiment involves developing methods for reading and storing the information from each ultrasonic echo. Such techniques have been tested in real time by simulation in a water model. Methods of enhancing the images to be ...

  2. Mass Spectrometric C-14 Detection Techniques: Progress Report

    Science.gov (United States)

    Synal, H.

    2013-12-01

    Accelerator Mass Spectrometry (AMS) has been established as the best-suited radiocarbon detection technique. In the past years, significant progress in AMS instrumentation has been made, resulting in a boom of new AMS facilities around the world. Today, carbon-only AMS systems predominantly utilize the 1+ charge state and molecule destruction by multiple ion-gas collisions in a gas stripper cell. This has made possible a significant simplification of the instruments, a reduction of ion energies and, related to this, a smaller footprint for the installations. However, state-of-the-art AMS instruments have still not reached a development stage where they can be regarded as table-top systems. In this respect, more development is needed to further advance the applicability of radiocarbon not only in the traditional fields of dating but also in biomedical research and new fields in the Earth and environmental sciences. In a proof-of-principle experiment, the feasibility of radiocarbon detection over the entire range of dating applications was demonstrated using a pure mass spectrometer and ion energies below 50 keV. An experimental platform has now been completed to test performance and to explore operation and measurement conditions of pure mass spectrometric radiocarbon detection. This contribution will give an overview of the physical principles which make this development possible and discuss key parameters of the instrumental design and the performance of such an instrument.

  3. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and of manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content of shredded sugar cane. (U.K.)

  4. Probabilistic Accident Progression Analysis with application to a LMFBR design

    International Nuclear Information System (INIS)

    Jamali, K.M.

    1982-01-01

    A method for probabilistic analysis of accident sequences in nuclear power plant systems, referred to as ''Probabilistic Accident Progression Analysis'' (PAPA), is described. Distinctive features of PAPA include: (1) definition and analysis of initiator-dependent accident sequences at the component level; (2) a new fault-tree simplification technique; (3) a new technique for assessing the effect of uncertainties in the failure probabilities on the probabilistic ranking of accident sequences; (4) techniques for quantification of dependent failures of similar components, including an iterative technique for high-population components. The methodology is applied to the Shutdown Heat Removal System (SHRS) of the Clinch River Breeder Reactor Plant during its short-term operation. Major contributors to the SHRS failure probability are the initiators loss of main feedwater system, loss of offsite power, and normal shutdown.
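
    The gate arithmetic behind such fault-tree quantification can be sketched directly. The tree structure and every probability below are hypothetical illustrations, not values from the CRBRP analysis; the dependent-failure term is modeled crudely as a single common-cause event.

    ```python
    # AND gates multiply independent failure probabilities; OR gates combine
    # them via the complement product.

    def p_and(*ps):
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):
        out = 1.0
        for p in ps:
            out *= (1.0 - p)
        return 1.0 - out

    # Initiator-dependent sequence: initiator frequency times the conditional
    # failure of two redundant heat-removal trains (AND gate), OR-ed with a
    # common-cause failure of both similar trains.
    p_initiator = 0.1        # e.g. loss of offsite power (hypothetical)
    p_train = 1e-2           # single-train failure probability (hypothetical)
    p_common_cause = 1e-3    # dependent failure of both trains (hypothetical)

    p_sequence = p_initiator * p_or(p_and(p_train, p_train), p_common_cause)
    print(f"sequence probability ~ {p_sequence:.2e}")
    ```

    Note how the common-cause term dominates the independent double failure (10⁻³ vs 10⁻⁴), which is why dependent-failure quantification gets its own technique in the method above.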

  5. Progress in spatial analysis methods and applications

    CERN Document Server

    Páez, Antonio; Buliung, Ron N; Dall'erba, Sandy

    2010-01-01

    This book brings together developments in spatial analysis techniques, including spatial statistics, econometrics, and spatial visualization, and applications to fields such as regional studies, transportation and land use, population and health.

  6. Glaucoma Monitoring in a Clinical Setting Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods:

  7. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check the stress analysis technique based on 3D models, making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods will be made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS by working directly from documentation, without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Each of the three databases will then be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  8. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
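
    A minimal example of the kind of reduction described: fitting observed runtimes to a simple power-law scaling model by ordinary least squares. The variables (node count, OpenMP pool size), the exponents, and the data are all invented; a real model at scale would include many more of the factors listed above.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical benchmark observations: runtime vs. node count and thread
    # pool size, generated from a known power law plus multiplicative noise.
    nodes = rng.integers(1, 65, size=40)
    threads = rng.integers(1, 17, size=40)
    runtime = 100.0 * nodes**-0.8 * threads**-0.5 * rng.lognormal(0, 0.05, 40)

    # Fit log(runtime) = b0 + b1*log(nodes) + b2*log(threads) by least squares;
    # b1 and b2 are the recovered scaling exponents.
    A = np.column_stack([np.ones(40), np.log(nodes), np.log(threads)])
    coef, *_ = np.linalg.lstsq(A, np.log(runtime), rcond=None)

    print(f"scaling exponents: nodes {coef[1]:.2f}, threads {coef[2]:.2f}")
    ```

    The fitted exponents can then be extrapolated to predict runtimes at configurations that were never benchmarked, which is the predictive-model use case the abstract targets.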

  9. The latest progress of fission track analysis

    International Nuclear Information System (INIS)

    Wang Shicheng

    1996-01-01

    Fission track analysis, a new nuclear track technique based on fission track annealing in minerals, has been used successfully for oil and gas exploration. The western part of China is the main exploration area for oil and gas. The oil and gas basins there experienced more complicated thermal histories and higher paleotemperatures. In order to apply fission track analysis to these basins, the following work was carried out: 1. decomposition of the grain age distribution of zircon fission tracks; 2. a study of the thermal history of the Ordos basin using zircon fission track analysis; 3. a fission track study of the Qiangtang basin in Tibet.

  10. Investigation progress of imaging techniques monitoring stem cell therapy

    International Nuclear Information System (INIS)

    Wu Jun; An Rui

    2006-01-01

    Recently, stem cell therapy has shown potential for clinical application in diabetes mellitus, cardiovascular diseases, malignant tumors and trauma. Efficient techniques for non-invasively monitoring stem cell transplants will accelerate the development of stem cell therapies. This paper briefly reviews the clinical practice of stem cell therapy and surveys the monitoring methods, including magnetic resonance and radionuclide imaging, that have been used in stem cell therapy. (authors)

  11. Integrated proteomic and metabolic analysis of breast cancer progression.

    Directory of Open Access Journals (Sweden)

    Patrick G Shaw

    Full Text Available One of the most persistent hallmarks of cancer biology is the preference of tumor cells to derive energy through glycolysis as opposed to the more efficient process of oxidative phosphorylation (OXPHOS). However, little is known about the molecular cascades by which oncogenic pathways bring about this metabolic switch. We carried out a quantitative proteomic and metabolic analysis of the MCF10A derived cell line model of breast cancer progression that includes parental cells and derivatives representing three different tumor grades of Ras-driven cancer with a common genetic background. A SILAC (Stable Isotope Labeling by Amino acids in Cell culture) labeling strategy was used to quantify protein expression in conjunction with subcellular fractionation to measure dynamic subcellular localization in the nucleus, cytosol and mitochondria. Protein expression and localization across cell lines were compared to cellular metabolic rates as a measure of oxidative phosphorylation (OXPHOS), glycolysis and cellular ATP. Investigation of the metabolic capacity of the four cell lines revealed that cellular OXPHOS decreased with breast cancer progression independently of mitochondrial copy number or electron transport chain protein expression. Furthermore, glycolytic lactate secretion did not increase in accordance with cancer progression and decreasing OXPHOS capacity. However, the relative expression and subcellular enrichment of enzymes critical to lactate and pyruvate metabolism supported the observed extracellular acidification profiles. This analysis of metabolic dysfunction in cancer progression integrated with global protein expression and subcellular localization is a novel and useful technique for determining organelle-specific roles of proteins in disease.
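The SILAC strategy described above quantifies relative protein expression from heavy/light peptide intensity ratios. A toy sketch of that arithmetic (hypothetical intensities, not data from the study) is:

```python
import math

# Hypothetical SILAC-style quantification sketch: peptide heavy/light
# intensity ratios are aggregated per protein as a median and reported
# as a log2 fold change (positive = up in the "heavy" condition).
def log2_fold_change(heavy: list, light: list) -> float:
    ratios = sorted(h / l for h, l in zip(heavy, light))
    mid = len(ratios) // 2
    median = ratios[mid] if len(ratios) % 2 else 0.5 * (ratios[mid - 1] + ratios[mid])
    return math.log2(median)

# Three peptides for one protein, heavy vs. light intensities.
print(log2_fold_change([4.0, 8.0, 6.0], [1.0, 2.0, 3.0]))  # 2.0
```

The median is used rather than the mean so that a single mis-quantified peptide does not dominate the protein-level ratio.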

  12. Progress in vegetable proteins isolation techniques: A review

    Directory of Open Access Journals (Sweden)

    Hadnađev Miroslav S.

    2017-01-01

    Full Text Available Novel vegetable proteins, like those extracted from abundant raw materials (grass) or agri-food by-products and waste streams (oilseed meals), are expected to be used as replacements for animal-derived proteins, due to higher production efficiency, reduced life-cycle environmental impact and the possibility of meeting consumers' dietary or cultural preferences. Although they have versatile functionality (as emulsifying, foaming, gelling and texturizing agents), the application of proteins is limited since their properties depend strongly on their structure and composition, on environmental factors (pH, ionic strength, presence of other micro- and macro-molecules in food matrices) and on the isolation method and conditions. The objective of this article is to review the current techniques used to isolate proteins from vegetable raw materials and to comment on the influence of extraction method and conditions (pH, ionic strength, extraction medium temperature, extraction time, etc.) on protein properties (yield, purity, appearance, solubility, denaturation degree, emulsification efficiency, etc.). The utilization of novel technologies such as ultrasound-assisted extraction and electro-activation, and of approaches such as enzyme-assisted extraction, to improve protein extraction yield or functionality is also discussed.

  13. Algorithms Design Techniques and Analysis

    CERN Document Server

    Alsuwaiyel, M H

    1999-01-01

    Problem solving is an essential part of every scientific discipline. It has two components: (1) problem identification and formulation, and (2) solution of the formulated problem. One can solve a problem on its own using ad hoc techniques or follow those techniques that have produced efficient solutions to similar problems. This requires the understanding of various algorithm design techniques, how and when to use them to formulate solutions and the context appropriate for each of them. This book advocates the study of algorithm design techniques by presenting most of the useful algorithm desi

  14. Environmental Contaminants in Hospital Settings and Progress in Disinfecting Techniques

    Directory of Open Access Journals (Sweden)

    Gabriele Messina

    2013-01-01

    Full Text Available Medical devices, such as stethoscopes, and other objects found in hospitals, such as computer keyboards and telephone handsets, may be reservoirs of bacteria for healthcare-associated infections. In this cross-over study involving an Italian teaching hospital we evaluated the microbial contamination (total bacterial count (TBC) at 36°C/22°C, Staphylococcus spp., moulds, Enterococcus spp., Pseudomonas spp., E. coli, total coliform bacteria, Acinetobacter spp., and Clostridium difficile) of these devices before and after cleaning, as well as differences in contamination between hospital units and between stethoscopes and keyboards plus handsets. We analysed 37 telephone handsets, 27 computer keyboards, and 35 stethoscopes, comparing their contamination in four hospital units. Wilcoxon signed-rank and Mann-Whitney tests were used. Before cleaning, many samples were positive for Staphylococcus spp. and coliforms. After cleaning, CFUs decreased to zero in most comparisons. The first aid unit had the highest and intensive care the lowest contamination (P<0.01). Keyboards and handsets had higher TBC at 22°C (P=0.046) and mould contamination (P=0.002) than stethoscopes. Healthcare professionals should disinfect stethoscopes and other possible sources of bacterial healthcare-associated infections. The cleaning technique used was effective in reducing bacterial contamination. Units with high patient turnover, such as first aid, should practise stricter hygiene.

  15. Resonance ionization of sputtered atoms: Progress toward a quantitative technique

    International Nuclear Information System (INIS)

    Calaway, W.F.; Pellin, M.J.; Young, C.E.; Whitten, J.E.; Gruen, D.M.; Coon, S.R.; Texas Univ., Austin, TX; Wiens, R.C.; Burnett, D.S.; Stingeder, G.; Grasserbauer, M.

    1992-01-01

    The combination of RIMS and ion sputtering has been heralded as the ideal means of quantitatively probing the surface of a solid. While several laboratories have demonstrated the extreme sensitivity of combining RIMS with sputtering, less effort has been devoted to the question of accuracy. Using the SARISA instrument developed at Argonne National Laboratory, a number of well-characterized metallic samples have been analyzed. Results from these determinations have been compared with data obtained by several other analytical methods. One significant finding is that impurity measurements down to ppb levels in metal matrices can be made quantitative by employing polycrystalline metal foils as calibration standards. This discovery substantially reduces the effort required for quantitative analysis since a single standard can be used for determining concentrations spanning nine orders of magnitude

  16. [Research Progress of Vitreous Humor Detection Technique on Estimation of Postmortem Interval].

    Science.gov (United States)

    Duan, W C; Lan, L M; Guo, Y D; Zha, L; Yan, J; Ding, Y J; Cai, J F

    2018-02-01

    Estimation of postmortem interval (PMI) plays a crucial role in forensic research and identification work. Because of its unique anatomical location, vitreous humor is considered useful for estimating PMI, which has aroused interest among scholars, and some research has been carried out. Detection techniques for vitreous humor are constantly being developed and improved and have gradually been applied in forensic science; meanwhile, the study of PMI estimation using vitreous humor is advancing rapidly. This paper reviews the techniques and instruments applied to vitreous humor detection, such as ion-selective electrodes, capillary ion analysis, spectroscopy, chromatography, nano-sensing technology, automatic biochemical analysers and flow cytometers, as well as related research progress on PMI estimation in recent years. Some open problems are also analysed, in order to provide research directions for scholars and to promote more accurate and efficient PMI estimation by vitreous humor analysis. Copyright© by the Editorial Department of Journal of Forensic Medicine.
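One concrete example of the regression-based estimates these detection techniques feed is the classic vitreous potassium formula attributed to Sturner; the coefficients below are the commonly quoted historical values, shown only as an illustration of the approach (modern studies refit them for population and temperature):

```python
# Illustrative only: Sturner-style linear regression relating vitreous
# potassium concentration (mmol/L) to postmortem interval (hours).
# Coefficients are the widely cited historical values, not a forensic standard.
def sturner_pmi_hours(k_mmol_per_l: float) -> float:
    return 7.14 * k_mmol_per_l - 39.1

print(round(sturner_pmi_hours(10.0), 1))  # 32.3
```

Potassium rises roughly linearly after death as cell membranes lose integrity, which is why a single-analyte linear model was historically attractive despite its wide confidence intervals.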

  17. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work, nuclear techniques such as neutron activation analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series dating and Rutherford backscattering, for use in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  18. Development of communications analysis techniques

    Science.gov (United States)

    Shelton, R. D.

    1972-01-01

    Major results from the frequency analysis of system program (FASP) are reported. The FASP procedure was designed to analyze or design linear dynamic systems, but can be used to solve any problem that can be described by a system of linear time invariant differential equations. The program also shows plots of performance changes as design parameters are adjusted. Experimental results on narrowband FM distortion are also reported.

  19. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  20. Nonlinear Progressive Collapse Analysis Including Distributed Plasticity

    Directory of Open Access Journals (Sweden)

    Mohamed Osama Ahmed

    2016-01-01

    Full Text Available This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel framed regular building structures. Emphasis in this paper is on the deformation response under the notionally removed column in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria - Design of Buildings to Resist Progressive Collapse, developed and updated by the U.S. Department of Defense [1]. The AP method is often used to assess the potential for progressive collapse of building structures that fall under Occupancy Category III or IV. A case study steel building is used to examine the effect of incorporating distributed plasticity, where moment frames were used on the perimeter as well as in the interior of the three-dimensional structural system. It is concluded that the use of moment resisting frames within the structural system will enhance resistance to progressive collapse through ductile deformation response and that it is conservative to ignore the effects of distributed plasticity in determining peak displacement response under the notionally removed column.

  1. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Data analysis techniques for gravitational wave observations. S V Dhurandhar ... The performance of some of these techniques on real data obtained will be discussed. Finally, some results on ... S V Dhurandhar1. Inter-University Centre for Astronomy and Astrophysics, Post Bag 4, Ganeshkhind, Pune 411 007, India ...

  2. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs
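A minimal object-oriented sketch of the event tree idea (an illustration of the programming style the abstract describes, not the SQUIMP code itself) enumerates branch sequences and multiplies branch probabilities:

```python
# Toy event tree: each node is a branch point that succeeds with
# probability p; end-state probabilities are products along each path.
class Node:
    def __init__(self, name, p_success, success=None, failure=None):
        self.name, self.p = name, p_success
        self.success, self.failure = success, failure

def sequences(node, prob=1.0, path=()):
    """Enumerate end states with their probabilities."""
    if node is None:
        yield path, prob
        return
    yield from sequences(node.success, prob * node.p, path + ((node.name, "ok"),))
    yield from sequences(node.failure, prob * (1 - node.p), path + ((node.name, "fail"),))

# Hypothetical two-stage tree: a valve only matters if the pump works.
tree = Node("pump", 0.99, success=Node("valve", 0.95))
for path, p in sequences(tree):
    print(path, round(p, 4))
```

The probabilities of all enumerated sequences sum to 1, which makes this representation a convenient base for importance calculations over sequences.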

  3. Systems analysis department annual progress report 1986

    International Nuclear Information System (INIS)

    Grohnheit, P.E.; Larsen, H.; Vestergaard, N.K.

    1987-02-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1986. The activities may be classified as energy systems analysis and risk and reliability analysis. The report includes a list of staff members. (author)

  4. Visual Progression Analysis of Student Records Data

    OpenAIRE

    Raji, Mohammad; Duggan, John; DeCotes, Blaise; Huang, Jian; Zanden, Bradley Vander

    2017-01-01

    University curricula, both at the campus level and at the per-major level, are affected in complex ways by many decisions of many administrators and faculty over time. As universities across the United States share an urgency to significantly improve student success and retention, there is a pressing need to better understand how the student population is progressing through the curriculum, and how to provide better supporting infrastructure and refine the curriculum for the purpose of ...

  5. Progressive Failure Analysis of Advanced Composites

    Science.gov (United States)

    2008-07-25

    Continuum damage mechanics study by P.P. Camanho (DEMEGI, Faculdade de Engenharia, Universidade do Porto, Rua Dr. Roberto Frias, 4200-465 Porto, Portugal), P. Maimí (Universitat de Girona, Campus Montilivi s/n, Girona, Spain) and C.G. Dávila, together with work on predicting progressive damage in bolted composite joints by Hannes Koerber and Pedro P. Camanho; only author and affiliation fragments of the abstract survive in this record.

  6. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  7. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    Science.gov (United States)

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.

  8. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre
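The decomposition at the heart of constrained PCA can be sketched in a few lines: project each observed variable onto an external (design) variable, split the data into an explained part and a residual, and then apply PCA separately to each part. The toy numbers below are hypothetical and the code covers only the projection step, not the book's full method:

```python
# Sketch of the external-analysis step of CPCA: split a data column into
# the component along an external variable g and an orthogonal residual.
def project_on(g, y):
    """Least-squares projection of y onto the line spanned by g."""
    b = sum(gi * yi for gi, yi in zip(g, y)) / sum(gi * gi for gi in g)
    return [b * gi for gi in g]

g = [1.0, 2.0, 3.0, 4.0]   # external (design) variable
y = [1.1, 1.9, 3.2, 3.8]   # one observed variable (toy data)
explained = project_on(g, y)
residual = [yi - ei for yi, ei in zip(y, explained)]

var = lambda v: sum(x * x for x in v)  # uncentred sum of squares
print(round(var(explained) / var(y), 3))  # 0.997
```

Because the residual is orthogonal to g, the explained and residual sums of squares add up to the total, so each part can be analyzed by PCA without double-counting variability.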

  9. PROGRESSIVE COLLAPSE ANALYSIS OF 2-D RC FRAMES USING AEM

    Directory of Open Access Journals (Sweden)

    Osama El-Mahdy

    2017-12-01

    Full Text Available Numerical simulation of the progressive collapse of structures is of great current interest to structural engineers concerned with assessing structural integrity. Such simulation helps engineers develop methods for increasing or decreasing progressive failure. The Finite Element Method (FEM) is the computer simulation approach most commonly used to perform a structural vulnerability assessment. Unfortunately, FEM is not able to automatically analyze a structure after element separation and collision, which have a great effect on a structure's performance during collapse. For instance, a bombing load can damage a main supporting column in a structure, producing debris that flies at very high velocity from the damaged column. This debris can cause another local failure in another column upon impact and lead to the progressive collapse of the whole structure. A newer simulation technique, developed in 1995 as part of Tagel-Din's doctoral research and called the Applied Element Method (AEM), can simulate a structure's behaviour from zero loading until collapse, through the elastic phase, opening and propagation of cracks, yielding of reinforcement bars, and separation and collision of elements. This method is used in the Extreme Loading for Structures (ELS) software by Applied Science International (ASI). In the current paper, a brief description of the AEM is given. Numerical models based on two experimental studies available in the literature, conducted by Ahmadi et al. and Yi et al., are generated using ELS. These models are used to confirm the capability of AEM in simulating the progressive collapse behaviour of structures, and to examine and measure the structural resisting mechanisms of reinforced concrete structures against progressive collapse. The obtained numerical results indicated that ELS can accurately model all structural behaviour stages up to collapse. A

  10. Single-molecule techniques in biophysics: a review of the progress in methods and applications.

    Science.gov (United States)

    Miller, Helen; Zhou, Zhaokun; Shepherd, Jack; Wollman, Adam J M; Leake, Mark C

    2018-02-01

    Single-molecule biophysics has transformed our understanding of biology, but also of the physics of life. More exotic than simple soft matter, biomatter lives far from thermal equilibrium, covering multiple lengths from the nanoscale of single molecules up to several orders of magnitude higher in cells, tissues and organisms. Biomolecules are often characterized by underlying instability: multiple metastable free energy states exist, separated by levels of just a few multiples of the thermal energy scale k_BT, where k_B is the Boltzmann constant and T absolute temperature, implying complex inter-conversion kinetics in the relatively hot, wet environment of active biological matter. A key benefit of single-molecule biophysics techniques is their ability to probe heterogeneity of free energy states across a molecular population, too challenging in general for conventional ensemble average approaches. Parallel developments in experimental and computational techniques have catalysed the birth of multiplexed, correlative techniques to tackle previously intractable biological questions. Experimentally, progress has been driven by improvements in sensitivity and speed of detectors, and the stability and efficiency of light sources, probes and microfluidics. We discuss the motivation and requirements for these recent experiments, including the underpinning mathematics. These methods are broadly divided into tools which detect molecules and those which manipulate them. For the former we discuss the progress of super-resolution microscopy, transformative for addressing many longstanding questions in the life sciences, and for the latter we include progress in 'force spectroscopy' techniques that mechanically perturb molecules. We also consider in silico progress of single-molecule computational physics, and how simulation and experimentation may be drawn together to give a more complete understanding. Increasingly, combinatorial techniques are now used, including
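The Boltzmann-factor arithmetic mentioned above is easy to make concrete: for two metastable states separated by a free energy gap of a few k_BT, the equilibrium occupancy ratio is exp(ΔG/k_BT). A small sketch with illustrative values:

```python
import math

# Two metastable states separated by a free energy gap delta_G, expressed
# in units of k_B*T: equilibrium occupancy follows the Boltzmann factor.
def occupancy_ratio(delta_G_in_kBT: float) -> float:
    """p_low / p_high for states separated by delta_G (units of k_B*T)."""
    return math.exp(delta_G_in_kBT)

for dG in (1.0, 2.0, 3.0):
    print(dG, round(occupancy_ratio(dG), 2))
```

A gap of only 2-3 k_BT still leaves the higher state populated a few percent of the time, which is why single-molecule methods, unlike ensemble averages, can resolve such minority states directly.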

  12. [Application progress of minimally invasive technique in treatment of calcaneus fractures].

    Science.gov (United States)

    Yu, Tao; Yang, Yunfeng; Yu, Guangrong

    2013-02-01

    To review progress in the application of minimally invasive techniques to the treatment of calcaneus fractures, to analyze the advantages and disadvantages of each method, and to predict development trends in the field. Domestic and foreign literature concerning minimally invasive techniques applied to calcaneus fractures in recent years was reviewed extensively and analyzed thoroughly. Each minimally invasive technique, including percutaneous reduction and fixation, limited incision, external fixation, arthroscopically assisted reduction, and balloon expansion reduction, has both advantages and limitations, but every technique is developing rapidly and becoming more effective. The various minimally invasive techniques can be used independently or applied jointly to complement one another. How to improve effectiveness and expand indications needs further study, and more evidence-based support needs to be provided.

  13. Risk Analysis Group annual progress report 1984

    International Nuclear Information System (INIS)

    1985-06-01

    The activities of the Risk Analysis Group at Risoe during 1984 are presented. These include descriptions in some detail of work on general development topics and risk analysis performed as contractor. (author)

  14. THEMATIC PROGRESSION PATTERN : A TECHNIQUE TO IMPROVE STUDENTS’ WRITING SKILL VIEWED FROM WRITING APPREHENSION

    Directory of Open Access Journals (Sweden)

    Fitri Nurdianingsih

    2017-10-01

    Full Text Available The objectives of this research were to find out: (1) whether or not the use of the thematic progression pattern is more effective than direct instruction in teaching writing to second semester students at the English Education Department; (2) whether students who have low writing apprehension have better writing skill than those who have high writing apprehension; and (3) whether there is an interaction between teaching technique and writing apprehension in teaching writing skill. This research used an experimental design. The population was the second semester students at the English Education Department of IKIP PGRI Bojonegoro, and the sample was selected using cluster random sampling. The instruments of data collection were a writing test and a writing apprehension questionnaire. The findings of this study are: (1) the thematic progression pattern is more effective than direct instruction in teaching writing; (2) students with low writing apprehension have better writing skill than those with high writing apprehension; and (3) there is an interaction between teaching technique and writing apprehension in teaching writing skill. It can be concluded that the thematic progression pattern is an effective technique for teaching writing skill to second semester students of the English Education Department at IKIP PGRI Bojonegoro. The effectiveness of the technique is affected by writing apprehension.

  15. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, to the subjective probability distributions assigned to the input parameters, and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity, and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their applicability to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends the immediate development of a method for evaluating the derivative of dose with respect to parameter value, and extension of the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to its input parameter values be examined. (author)
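The derivative-of-dose recommendation can be sketched as a one-at-a-time central finite difference; the three-parameter "dose" model below is hypothetical and stands in for a code like SYVAC only as an illustration:

```python
# One-at-a-time sensitivity sketch: estimate the partial derivative of a
# toy "dose" model with respect to each input parameter by central
# finite differences. The model is invented for illustration only.
def dose(params):
    leach, travel, uptake = params
    return leach * uptake / travel

def sensitivities(f, params, h=1e-6):
    """Central-difference estimate of df/dp_i for each parameter p_i."""
    out = []
    for i, p in enumerate(params):
        up = list(params); up[i] = p + h
        dn = list(params); dn[i] = p - h
        out.append((f(up) - f(dn)) / (2 * h))
    return out

print([round(s, 4) for s in sensitivities(dose, [2.0, 4.0, 0.5])])
# [0.125, -0.0625, 0.5]
```

Local derivatives like these complement, rather than replace, the sampling-based techniques discussed in the report: they are cheap near a nominal point but say nothing about interactions across the full parameter space.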

  16. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  17. Systems Analysis Department annual progress report 1998

    DEFF Research Database (Denmark)

    1999-01-01

    The report describes the work of the Systems Analysis Department at Risø National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine Interaction, and Technology Scenarios. The report includes lists of publications, lectures, committees and staff members.

  18. Systems Analysis Department annual progress report 1999

    DEFF Research Database (Denmark)

    2000-01-01

    This report describes the work of the Systems Analysis Department at Risø National Laboratory during 1999. The department is undertaking research within Energy Systems Analysis, Energy, Environment and Development Planning - UNEP Centre, Safety, Reliability and Human Factors, and Technology Scenarios.

  19. Systems Analysis department. Annual progress report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Petersen, Kurt E.

    1998-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1997. The department is undertaking research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 110 refs.

  20. Systems Analysis Department annual progress report 1998

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif (eds.)

    1999-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine Interaction and Technology Scenarios. The report includes lists of publications, lectures, committees and staff members. (au) 111 refs.

  1. Systems Analysis Department. Annual Progress Report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif (eds.)

    2000-03-01

    This report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1999. The department is undertaking research within Energy Systems Analysis, Energy, Environment and Development Planning-UNEP Centre, Safety, Reliability and Human Factors, and Technology Scenarios. The report includes summary statistics and lists of publications, committees and staff members. (au)

  2. Progress report on the AMT analysis

    International Nuclear Information System (INIS)

    Wood, B.

    1992-01-01

    ICF Resources Incorporated's analysis of the Alternative Minimum Tax (AMT) has examined its effect on the US oil and gas industry from several different perspectives, to estimate the effect of the three relief proposals and to better understand the source of the outcry about the AMT's ''inequities.'' This report is a brief summary of the methodology and results to date. The complexity of the accounting mechanisms that comprise the AMT, and the disparity between these analytical conclusions and claims made by the oil and gas industry (principally the IPAA), have led this analysis through several distinct phases: project-level analysis, firm-level analysis, and demographic analysis. These analyses are described in detail

  3. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis

  4. Progress in nuclear measuring and experimental techniques by application of microelectronics. 1

    International Nuclear Information System (INIS)

    Meiling, W.

    1984-01-01

    In the past decade considerable progress has been made in nuclear measuring and experimental techniques by developing position-sensitive detector systems and widely using integrated circuits and microcomputers for data acquisition and processing as well as for automation of measuring processes. In this report which will be published in three parts those developments are reviewed and demonstrated on selected examples. After briefly characterizing microelectronics, the use of microelectronic elements for radiation detectors is reviewed. (author)

  5. Systems Analysis Department. Annual progress report 1996

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, H.; Olsson, C.; Petersen, K.E. (eds.)

    1997-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1996. The department is undertaking research within Simulation and Optimisation of Energy Systems, Energy and Environment in Developing Countries - UNEP Centre, Integrated Environmental and Risk Management and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 131 refs.

  6. A review of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Mathematical models are utilized to approximate various highly complex engineering, physical, environmental, social, and economic phenomena. Model parameters exerting the most influence on model results are identified through a "sensitivity analysis." A comprehensive review is presented of more than a dozen sensitivity analysis methods. The most fundamental of sensitivity techniques utilizes partial differentiation, whereas the simplest approach requires varying parameter values one at a time. Correlation analysis is used to determine relationships between independent and dependent variables. Regression analysis provides the most comprehensive sensitivity measure and is commonly utilized to build response surfaces that approximate complex models.
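
    The one-at-a-time approach described above is the simplest to sketch. Below is a minimal illustration: each parameter is perturbed by a fixed fraction while the others are held at their base values, and the normalized change in the model output is reported. The model and its parameter names are hypothetical, not taken from the review.

```python
import math

def model(params):
    # Hypothetical illustrative model, not one from the review.
    return params["intake"] * params["dose_factor"] * math.exp(-params["decay"])

def oat_sensitivity(model, base, delta=0.10):
    """One-at-a-time sensitivity: perturb each parameter by +delta (fractional)
    and report the relative change in output per relative change in input."""
    y0 = model(base)
    sens = {}
    for name in base:
        perturbed = dict(base)
        perturbed[name] = base[name] * (1.0 + delta)
        sens[name] = (model(perturbed) - y0) / (y0 * delta)  # normalized index
    return sens

base = {"intake": 2.0, "dose_factor": 0.5, "decay": 0.1}
print(oat_sensitivity(model, base))
```

    A normalized index of 1.0 flags a purely multiplicative parameter; smaller magnitudes indicate weaker local influence. The method is local and ignores parameter interactions, which is why the review ranks regression-based response surfaces as more comprehensive.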

  7. Dynamics and vibrations progress in nonlinear analysis

    CERN Document Server

    Kachapi, Seyed Habibollah Hashemi

    2014-01-01

    Dynamical and vibratory systems are basically an application of mathematics and applied sciences to the solution of real world problems. Before being able to solve real world problems, it is necessary to carefully study dynamical and vibratory systems and solve all available problems in case of linear and nonlinear equations using analytical and numerical methods. It is of great importance to study nonlinearity in dynamics and vibration; because almost all applied processes act nonlinearly, and on the other hand, nonlinear analysis of complex systems is one of the most important and complicated tasks, especially in engineering and applied sciences problems. There are probably a handful of books on nonlinear dynamics and vibrations analysis. Some of these books are written at a fundamental level that may not meet ambitious engineering program requirements. Others are specialized in certain fields of oscillatory systems, including modeling and simulations. In this book, we attempt to strike a balance between th...

  8. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
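
    The measurement underlying the technique is the Beer-Lambert attenuation law, I = I0·exp(-μm·ρx). A minimal sketch of how a measured transmission yields a mass attenuation coefficient, plus a simple two-component mixture rule for the gold mass fraction, is given below; all numbers are illustrative, not values from the study.

```python
import math

def mass_attenuation(I0, I, areal_density):
    """Mass attenuation coefficient (cm^2/g) from Beer-Lambert:
    I = I0 * exp(-mu_m * rho * x), where areal_density = rho*x in g/cm^2."""
    return math.log(I0 / I) / areal_density

def gold_fraction(mu_alloy, mu_gold, mu_matrix):
    """Mixture rule for a binary alloy: mu_alloy = w*mu_gold + (1-w)*mu_matrix."""
    return (mu_alloy - mu_matrix) / (mu_gold - mu_matrix)

# Illustrative counts and coefficients only (not measured values):
mu = mass_attenuation(I0=10000.0, I=2500.0, areal_density=0.5)
print(mu)                                        # ln(4)/0.5, about 2.77 cm^2/g
print(gold_fraction(mu, mu_gold=4.0, mu_matrix=1.0))
```

    In practice the calibration curve from alloys of known content plays the role of the mixture rule here, and the coefficients are measured around the Au K-edge where the contrast is largest.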

  9. Effectiveness of a Daily Class Progress Assessment Technique in Introductory Chemistry

    Science.gov (United States)

    Rogerson, Brian J.

    2003-02-01

    To improve student learning in an introductory chemistry course, a daily class progress assessment was developed. At the end of every class period students answered, in writing, brief questions about material that had just been discussed in class. Student answers were not graded but were always discussed at the beginning of the following class. The intent was to continuously survey all students for their understanding of basic ideas and to correct misconceptions. Student performance during five semesters was examined. The assessment technique was used during two of the semesters. Use of this assessment technique resulted in a significant drop in freshman withdrawal frequencies from 26.7% to 6.7% (p assessment technique improved freshman performance. Contrary to what at first might be believed, the assessment technique is simple and quick. It allowed the immediate identification of difficulties and thus, corrective measures, before students were formally tested. Surveys revealed that students believed the assessments helped them gauge their progress in understanding the material and suggested that such daily feedback should be more widely used.

  10. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  11. Fourier Spectroscopy: A Simple Analysis Technique

    Science.gov (United States)

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
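
    The point-by-point integration described in the article can be mimicked numerically with a trapezoidal evaluation of the Fourier integral, here in the cosine form appropriate for a symmetric interferogram. The synthetic interferogram below is an assumption for demonstration: a single 5 cycles/unit cosine, whose transform should peak at f = 5.

```python
import math

def fourier_cosine_transform(x, y, freqs):
    """Point-by-point (trapezoidal) evaluation of the cosine transform
    of a sampled interferogram y(x) at each requested frequency."""
    out = []
    for f in freqs:
        s = 0.0
        for i in range(len(x) - 1):
            dx = x[i + 1] - x[i]
            s += 0.5 * dx * (y[i] * math.cos(2 * math.pi * f * x[i])
                             + y[i + 1] * math.cos(2 * math.pi * f * x[i + 1]))
        out.append(s)
    return out

xs = [i / 200.0 for i in range(401)]                   # path difference 0..2
ys = [math.cos(2 * math.pi * 5.0 * x) for x in xs]     # synthetic interferogram
spectrum = fourier_cosine_transform(xs, ys, [3.0, 5.0, 7.0])
print(spectrum)   # large value at f=5, near zero at f=3 and f=7
```

    This is exactly the manual procedure the article advocates, just automated: no FFT machinery is needed, only the Fourier integral that undergraduates can derive.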

  12. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure. The microextraction techniques are dominant. Metabolomic studies also require application of proper analytical technique for the determination of endogenic metabolites present in biological matrix on trace concentration levels. Due to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of direct combination of those techniques with other methods (combination types on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of new synthesized sorbents as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Organic analysis progress report FY 1997

    International Nuclear Information System (INIS)

    Clauss, S.A.; Grant, K.E.; Hoopes, V.; Mong, G.M.; Steele, R.; Bellofatto, D.; Sharma, A.

    1998-04-01

    The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify the amount of particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at Pacific Northwest National Laboratory (PNNL) and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivation gas chromatography/mass spectrometry (GC/MS) and GC/FID for quantitation, ion-pair chromatography (IPC), IC, and the cation exchange procedure for reducing the radioactivity of samples. The documentation of analytical procedures is included here and discussed in Section 6.0 and Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples

  14. Organic analysis progress report FY 1997

    Energy Technology Data Exchange (ETDEWEB)

    Clauss, S.A.; Grant, K.E.; Hoopes, V.; Mong, G.M.; Steele, R.; Bellofatto, D.; Sharma, A.

    1998-04-01

    The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify the amount of particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at Pacific Northwest National Laboratory (PNNL) and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivation gas chromatography/mass spectrometry (GC/MS) and GC/FID for quantitation, ion-pair chromatography (IPC), IC, and the cation exchange procedure for reducing the radioactivity of samples. The documentation of analytical procedures is included here and discussed in Section 6.0 and Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples.

  15. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    Magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plant operates control rods in response to electrical signals from a reactor control system. CRDM operability is evaluated by quantifying armature's response of closed/opened time which means interval time between coil energizing/de-energizing points and armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants so far. However, CRDM operational data has wide variation depending on their characteristics such as plant condition, plant, etc. In the existing motion analysis, there is an issue of analysis accuracy for applying a single analysis technique to all plant conditions, plants, etc. In this study, MHI investigated motion analysis using machine learning (Random Forests) which is flexibly accommodated to CRDM operational data with wide variation, and is improved analysis accuracy. (author)

  16. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade, and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to bring the methodology to production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; to the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system
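
    The quantification step of a fault tree reduces to combining basic-event probabilities through AND and OR gates. A minimal sketch, assuming independent basic events and a hypothetical two-valve/one-pump tree (not a system from the text):

```python
def and_gate(probs):
    """AND gate: all inputs must fail; for independent events P = product."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """OR gate: at least one input fails; P = 1 - prod(1 - q_i)."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

# Hypothetical tree: TOP = OR(pump_fails, AND(valve_a_fails, valve_b_fails))
pump, valve_a, valve_b = 1e-3, 1e-2, 1e-2
top = or_gate([pump, and_gate([valve_a, valve_b])])
print(top)   # about 1.1e-3: the single pump dominates the redundant valve pair
```

    Real automated tools add minimal cut set extraction, common-cause models and time dependence on top of this arithmetic, which is exactly where the paper argues the deeper analysis effort belongs.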

  17. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from analytical blank and independence of the chemical state of elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235 U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques
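
    The sensitivity quoted for reactor-based NAA follows from the standard activation equation, A = φσN(1 - e^(-λ·t_irr))·e^(-λ·t_decay). A short sketch with illustrative, round numbers (not parameters of any specific reactor or the CERT facility):

```python
import math

def induced_activity(flux, sigma_cm2, n_atoms, half_life_s, t_irr_s, t_decay_s=0.0):
    """Induced activity (Bq) from neutron activation:
    A = phi * sigma * N * (1 - exp(-lambda*t_irr)) * exp(-lambda*t_decay)."""
    lam = math.log(2.0) / half_life_s
    return (flux * sigma_cm2 * n_atoms
            * (1.0 - math.exp(-lam * t_irr_s))
            * math.exp(-lam * t_decay_s))

# Illustrative: 1e12 n/cm^2/s flux, 1 barn cross section, 1e18 target atoms,
# a 15-minute half-life product, irradiated for one half-life, counted at once.
A = induced_activity(flux=1e12, sigma_cm2=1e-24, n_atoms=1e18,
                     half_life_s=900.0, t_irr_s=900.0)
print(A)   # half the saturation activity of 1e6 Bq, i.e. about 5e5 Bq
```

    The (1 - e^(-λt)) saturation term is why irradiating much beyond a few half-lives buys little extra signal, a routine trade-off when scheduling multi-element reactor irradiations.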

  18. Progress of nuclide tracing technique in the study of soil erosion in recent decade

    International Nuclear Information System (INIS)

    Liu Gang; Yang Mingyi; Liu Puling; Tian Junliang

    2007-01-01

    In the last decade the nuclide tracing technique has been widely employed in the investigation of soil erosion, which has brought studies of soil erosion into a new period of rapid development. This paper reviews the recent progress in using 137 Cs, 210 Pb(ex), 7 Be, composite tracers and REE-INAA to study soil erosion rates, sedimentation rates, sediment sources and soil erosion processes, together with the existing research results. Trends for future development and open questions are also discussed. (authors)

  19. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  20. ANALYSIS OF COMPUTER AIDED PROCESS PLANNING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Salim A. Saleh

    2013-05-01

    Full Text Available Computer Aided Process Planning (CAPP) has been recognized as playing a key role in Computer Integrated Manufacturing (CIM). It is used as a bridge to link CAD with CAM systems, giving the possibility of full integration, in agreement with computer engineering, to realize CIM. The benefits of CAPP in the real industrial environment are still to be achieved. Due to different manufacturing applications, many different CAPP systems have been developed. The development of CAPP techniques needs a summarized classification and a descriptive analysis. This paper presents the most important and well-known techniques of the available CAPP systems, which are based on variant, generative or semi-generative methods, and a descriptive analysis of their application possibilities.

  1. Progress on retinal image analysis for age related macular degeneration.

    Science.gov (United States)

    Kanagasingam, Yogesan; Bhuiyan, Alauddin; Abràmoff, Michael D; Smith, R Theodore; Goldschmidt, Leonard; Wong, Tien Y

    2014-01-01

    Age-related macular degeneration (AMD) is the leading cause of vision loss in those over the age of 50 years in the developed countries. The number is expected to increase by ∼1.5 fold over the next ten years due to an increase in aging population. One of the main measures of AMD severity is the analysis of drusen, pigmentary abnormalities, geographic atrophy (GA) and choroidal neovascularization (CNV) from imaging based on color fundus photograph, optical coherence tomography (OCT) and other imaging modalities. Each of these imaging modalities has strengths and weaknesses for extracting individual AMD pathology and different imaging techniques are used in combination for capturing and/or quantification of different pathologies. Current dry AMD treatments cannot cure or reverse vision loss. However, the Age-Related Eye Disease Study (AREDS) showed that specific anti-oxidant vitamin supplementation reduces the risk of progression from intermediate stages (defined as the presence of either many medium-sized drusen or one or more large drusen) to late AMD which allows for preventative strategies in properly identified patients. Thus identification of people with early stage AMD is important to design and implement preventative strategies for late AMD, and determine their cost-effectiveness. A mass screening facility with teleophthalmology or telemedicine in combination with computer-aided analysis for large rural-based communities may identify more individuals suitable for early stage AMD prevention. In this review, we discuss different imaging modalities that are currently being considered or used for screening AMD. In addition, we look into various automated and semi-automated computer-aided grading systems and related retinal image analysis techniques for drusen, geographic atrophy and choroidal neovascularization detection and/or quantification for measurement of AMD severity using these imaging modalities. We also review the existing telemedicine studies which

  2. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has active, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  3. Accelerometer Data Analysis and Presentation Techniques

    Science.gov (United States)

    Rogers, Melissa J. B.; Hrovat, Kenneth; McPherson, Kevin; Moskowitz, Milton E.; Reckart, Timothy

    1997-01-01

    The NASA Lewis Research Center's Principal Investigator Microgravity Services project analyzes Orbital Acceleration Research Experiment and Space Acceleration Measurement System data for principal investigators of microgravity experiments. Principal investigators need a thorough understanding of data analysis techniques so that they can request appropriate analyses to best interpret accelerometer data. Accelerometer data sampling and filtering is introduced along with the related topics of resolution and aliasing. Specific information about the Orbital Acceleration Research Experiment and Space Acceleration Measurement System data sampling and filtering is given. Time domain data analysis techniques are discussed and example environment interpretations are made using plots of acceleration versus time, interval average acceleration versus time, interval root-mean-square acceleration versus time, trimmean acceleration versus time, quasi-steady three dimensional histograms, and prediction of quasi-steady levels at different locations. An introduction to Fourier transform theory and windowing is provided along with specific analysis techniques and data interpretations. The frequency domain analyses discussed are power spectral density versus frequency, cumulative root-mean-square acceleration versus frequency, root-mean-square acceleration versus frequency, one-third octave band root-mean-square acceleration versus frequency, and power spectral density versus frequency versus time (spectrogram). Instructions for accessing NASA Lewis Research Center accelerometer data and related information using the internet are provided.
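
    Several of the time-domain quantities listed above are straightforward to compute; interval root-mean-square acceleration versus time is a representative one. The sketch below uses a synthetic 10 Hz, 1 mg vibration record (an assumption for illustration, not OARE or SAMS data):

```python
import math

def interval_rms(samples, rate_hz, interval_s):
    """Interval root-mean-square acceleration versus time: split the record
    into fixed-length windows and return the RMS of each window."""
    n = int(rate_hz * interval_s)
    return [math.sqrt(sum(v * v for v in samples[i:i + n]) / n)
            for i in range(0, len(samples) - n + 1, n)]

# 1 s of a 10 Hz, 1e-3 g amplitude sinusoid sampled at 1000 Hz (synthetic):
rate = 1000.0
acc = [1e-3 * math.sin(2 * math.pi * 10.0 * i / rate) for i in range(1000)]
rms = interval_rms(acc, rate, 0.1)    # ten 0.1 s windows
print(rms[0])                         # amplitude / sqrt(2), about 7.07e-4 g
```

    Plotting these window values against time gives exactly the "interval RMS acceleration versus time" display described above; interval average and trimmean displays differ only in the statistic applied per window.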

  4. Comparison of analysis techniques for electromyographic data.

    Science.gov (United States)

    Johnson, J C

    1978-01-01

    Electromyography has been effectively employed to estimate the stress encountered by muscles in performing a variety of functions in the static environment. Such analysis provides the basis for modification of a man-machine system in order to optimize the performances of individual tasks by reducing muscle stress. Myriad analysis methods have been proposed and employed to convert raw electromyographic data into numerical indices of stress and, more specifically, muscle work. However, the type of analysis technique applied to the data can significantly affect the outcome of the experiment. In this study, four methods of analysis are employed to simultaneously process electromyographic data from the flexor muscles of the forearm. The methods of analysis include: 1) integrated EMG (three separate time constants), 2) root mean square voltage, 3) peak height discrimination (three level), and 4) turns counting (two methods). Mechanical stress input as applied to the arm of the subjects includes static load and vibration. The results of the study indicate the comparative sensitivity of each of the techniques to changes in EMG resulting from changes in static and dynamic load on the muscle.
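
    Two of the four analysis methods compared, root-mean-square voltage and turns counting, can be sketched directly. The threshold and sample values below are illustrative assumptions, not the study's settings (a clinical turns criterion typically uses a fixed amplitude threshold such as 100 µV).

```python
def rms_voltage(signal):
    """Root-mean-square voltage of an EMG epoch."""
    return (sum(v * v for v in signal) / len(signal)) ** 0.5

def count_turns(signal, threshold):
    """Turns counting: a 'turn' is a local peak or trough whose amplitude
    differs from the previous accepted turn by more than the threshold."""
    turns, last = 0, signal[0]
    for i in range(1, len(signal) - 1):
        is_peak = signal[i] > signal[i - 1] and signal[i] > signal[i + 1]
        is_trough = signal[i] < signal[i - 1] and signal[i] < signal[i + 1]
        if (is_peak or is_trough) and abs(signal[i] - last) > threshold:
            turns += 1
            last = signal[i]
    return turns

emg = [0.0, 0.5, -0.4, 0.6, 0.1, 0.2, -0.7, 0.3]   # toy epoch, arbitrary units
print(rms_voltage(emg), count_turns(emg, threshold=0.2))
```

    The study's point is visible even in this toy: RMS tracks overall signal power, while turns counting tracks waveform complexity, so the two indices can respond differently to the same change in static load or vibration.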

  5. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess operators' physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information, and for the 79 cases induced by human errors we time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  6. User-Defined Material Model for Progressive Failure Analysis

    Science.gov (United States)

    Knight, Norman F. Jr.; Reeder, James R. (Technical Monitor)

    2006-01-01

    An overview of different types of composite material system architectures and a brief review of progressive failure material modeling methods used for structural analysis, including failure initiation and material degradation, are presented. Different failure initiation criteria and material degradation models are described that define progressive failure formulations. These progressive failure formulations are implemented in a user-defined material model (or UMAT) for use with the ABAQUS/Standard nonlinear finite element analysis tool. The failure initiation criteria include the maximum stress criteria, maximum strain criteria, the Tsai-Wu failure polynomial, and the Hashin criteria. The material degradation model is based on the ply-discounting approach where the local material constitutive coefficients are degraded. Applications and extensions of the progressive failure analysis material model address two-dimensional plate and shell finite elements and three-dimensional solid finite elements. Implementation details and use of the UMAT subroutine are described in the present paper. Parametric studies for composite structures are discussed to illustrate the features of the progressive failure modeling methods that have been implemented.
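
    The two building blocks named above, a failure initiation check and ply-discounting degradation, can be sketched outside any FE code. The stresses, strengths and knockdown factor below are illustrative assumptions, not values from the UMAT described in the paper:

```python
def max_stress_index(stress, strength):
    """Maximum stress criterion per ply: failure index is the largest ratio of
    a stress component to its allowable (tensile or compressive strength).
    An index >= 1.0 signals failure initiation."""
    fi = 0.0
    for s, (tens, comp) in zip(stress, strength):
        fi = max(fi, s / tens if s >= 0.0 else -s / comp)
    return fi

def degrade(moduli, failed, knockdown=1e-3):
    """Ply-discounting degradation: a failed ply keeps a small residual
    stiffness so the global stiffness matrix stays non-singular."""
    return [m * knockdown if failed else m for m in moduli]

# Illustrative in-plane ply state (sigma_11, sigma_22, tau_12, MPa) and
# (tensile, compressive) allowables per component:
ply_stress = [120.0, 8.0, -30.0]
ply_strength = [(1500.0, 1200.0), (40.0, 200.0), (70.0, 70.0)]
fi = max_stress_index(ply_stress, ply_strength)
print(fi)                     # about 0.43: shear governs, ply still intact
print(degrade([130000.0, 9000.0], fi >= 1.0))
```

    In the actual UMAT this check runs at every integration point each increment, and the degraded coefficients feed back into the next equilibrium iteration; the residual-stiffness knockdown is a common numerical device rather than a physical constant.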

  7. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have not yet reached a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electron bombardment of atoms or molecules (gas ion source) and thermal effect (thermoionic source) - are compared, revealing an inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is 10^5 to 10^6 times greater. This shows that almost the entire sample is not needed for the measurement itself; it is only required by the introduction system of the gas spectrometer. The new analysis technique, referred to as ''microfluorination'', corrects this anomaly and exploits the advantages of the electron bombardment method of ionization

  8. Forensic Analysis using Geological and Geochemical Techniques

    Science.gov (United States)

    Hoogewerff, J.

    2009-04-01

    Due to the globalisation of legal (and illegal) trade there is an increasing demand for techniques which can verify the geographical origin and transfer routes of many legal and illegal commodities and products. Although geological techniques have been used in forensic investigations since the emergence of forensics as a science in the late eighteen hundreds, the last decade has seen a marked increase in geo-scientists initiating concept studies using the latest analytical techniques, including studying natural abundance isotope variations, micro analysis with laser ablation ICPMS and geochemical mapping. Most of the concept studies have shown a good potential but uptake by the law enforcement and legal community has been limited due to concerns about the admissibility of the new methods. As an introduction to the UGU2009 session "Forensic Provenancing using Geological and Geochemical Techniques" I will give an overview of the state of the art of forensic geology and the issues that concern the admissibility of geological forensic evidence. I will use examples from the NITECRIME and FIRMS networks, the EU TRACE project and other projects and literature to illustrate the important issues at hand.

  9. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
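
The contrast-extraction step can be sketched as follows. The definition used here (sound-region reference, normalization by the peak contrast) and the synthetic cooling curves are assumptions for illustration, not NASA's exact IR Contrast formulation.

```python
def normalized_contrast(defect, reference):
    """Contrast evolution: (T_defect - T_sound) / T_sound per frame,
    normalized by its own peak so different shots can be compared.
    Inputs are per-frame surface temperatures (or pixel intensities)
    over a suspected anomaly and over sound material."""
    raw = [(d - r) / r for d, r in zip(defect, reference)]
    peak = max(abs(c) for c in raw)
    return [c / peak for c in raw] if peak > 0 else raw

# Hypothetical post-flash cooling curves: the region over a
# delamination retains heat longer than sound material.
frames = range(10)
sound  = [300.0 + 50.0 * 0.7**t for t in frames]
defect = [300.0 + 50.0 * 0.8**t for t in frames]
contrast = normalized_contrast(defect, sound)
```

Measurement features such as the peak contrast and the time at which it occurs are the quantities that the technique relates to anomaly depth and width.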

  10. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Full Text Available Based on the movement process of throwing, and in order to further improve throwing technique in our country, this paper first illustrates the main factors that influence the shot distance through a combination of the equations of motion and geometrical analysis. It then gives the equation for the force that throwing athletes must exert during the throwing movement, and derives the speed relationship between the joints during throwing and release based on a kinetic analysis of the athletes' arms. The paper obtains the momentum relationship of each joint by means of rotational-inertia analysis, and then establishes a constrained particle dynamics equation from the Lagrange equation. The result shows that the momentum of the throw depends on the momentum of the athlete's wrist joint at release.
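
For context, the dependence of the shot distance on release speed, release angle and release height follows from the standard projectile-range relation. This is a textbook result stated here for orientation, not an equation taken from the paper:

```latex
% Range R for release speed v, release angle \theta, release height h,
% and gravitational acceleration g (flat landing, no air resistance):
R \;=\; \frac{v\cos\theta}{g}\left(v\sin\theta + \sqrt{v^{2}\sin^{2}\theta + 2gh}\right)
```

The factor in parentheses is $g$ times the flight time, so the range grows roughly with $v^{2}$, which is why the wrist-joint momentum at release dominates the result above.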

  11. Managing the Classroom with Technology. On Progress Reports and Online Communications, and How To Manage the Two Different Communication Techniques.

    Science.gov (United States)

    Kasprowicz, Tim

    2002-01-01

    Describes how one teacher bridged the communications gap among teachers, parents, and students through the use of technology in managing his classroom. Discusses progress reports and online communications and how to manage the two different communication techniques. (JOW)

  12. Clustering Analysis within Text Classification Techniques

    Directory of Open Access Journals (Sweden)

    Madalina ZURINI

    2011-01-01

    Full Text Available The paper presents a personal approach to the main applications of classification in the knowledge-based society, by means of methods and techniques widely covered in the literature. Text classification is addressed in chapter two, where the main techniques used are described along with an integrated taxonomy. The transition is made through the concept of spatial representation. Starting from elementary geometry and artificial-intelligence analysis, spatial representation models are presented. Using a parallel approach, the spatial dimension is introduced into the classification process. The main clustering methods are described in an aggregated taxonomy. As an example, spam and ham words are clustered and spatially represented, and the concepts of spam, ham, common and linkage words are explained in the xOy space representation.
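
The clustering-plus-spatial-representation idea can be sketched with a plain k-means on 2-D word coordinates. The points and the two "spam-like"/"ham-like" axes are hypothetical stand-ins for whatever features embed words in the xOy space.

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means on 2-D points (e.g. words placed in an xOy space
    by two features). Returns final centroids and per-point labels.
    Deterministic seeding: the first k points start as centroids."""
    centroids = points[:k]
    labels = [0] * len(points)
    for _ in range(iters):
        # assign each point to its nearest centroid
        labels = [min(range(k), key=lambda c: math.dist(p, centroids[c]))
                  for p in points]
        # move each centroid to the mean of its members
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = tuple(sum(x) / len(members)
                                     for x in zip(*members))
    return centroids, labels

# Two loose groups: "spam-like" words near (9, 1), "ham-like" near (1, 9).
points = [(9, 1), (8, 2), (9, 2), (1, 9), (2, 8), (1, 8)]
centroids, labels = kmeans(points, k=2)
```

Words falling between the two centroids would play the role of the "common" and "linkage" words discussed in the paper.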

  13. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  14. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)
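
The gate logic that such a package automates can be illustrated with a minimal quantification sketch. Independent basic events are assumed, and the tree below is a made-up example, not the HIFAR Containment Isolation System model.

```python
def top_event_probability(gate):
    """Evaluate a fault tree given as nested tuples:
    ('OR' | 'AND', child, child, ...), where a bare float is a
    basic-event failure probability. Assumes independent events."""
    if isinstance(gate, float):
        return gate
    op, *children = gate
    probs = [top_event_probability(c) for c in children]
    result = 1.0
    if op == 'AND':
        for p in probs:          # all children must fail
            result *= p
        return result
    if op == 'OR':
        for p in probs:          # at least one child fails
            result *= (1.0 - p)
        return 1.0 - result
    raise ValueError(f"unknown gate type: {op}")

# Hypothetical system: fails if both redundant valves fail (AND),
# or if the shared control power supply fails.
tree = ('OR', ('AND', 1e-3, 1e-3), 1e-5)
p_top = top_event_probability(tree)
```

An interactive package adds on top of this kernel the graphical tree editing, generic failure-rate data, and report generation described in the abstract.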

  15. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture the image of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. An easier approach was made by using the MathCAD program as the programming tool, which is nonetheless powerful enough for such calculation, plotting and file transfer. (Author)
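
A discrete Abel inversion can be sketched with the onion-peeling variant below, a simple stand-in for the authors' MathCAD implementation; the shell geometry and test profile are illustrative only.

```python
import math

def onion_peel(projection, radii):
    """Discrete Abel inversion by onion peeling: recover the radial
    profile f(r) of an axisymmetric field from its line-of-sight
    projection F(y). `radii` are shell edges with
    len(radii) == len(projection) + 1, and projection[i] is measured
    along the chord at y = radii[i]."""
    n = len(projection)
    f = [0.0] * n
    for i in range(n - 1, -1, -1):          # peel from the outside in
        s = projection[i]
        for j in range(i + 1, n):           # subtract outer shells' share
            s -= f[j] * 2.0 * (math.sqrt(radii[j + 1]**2 - radii[i]**2)
                               - math.sqrt(radii[j]**2 - radii[i]**2))
        f[i] = s / (2.0 * math.sqrt(radii[i + 1]**2 - radii[i]**2))
    return f

# Uniform disk of radius 3 sampled in three shells: its projection is
# F(y) = 2*sqrt(9 - y^2), and onion peeling recovers f(r) = 1.
radii = [0.0, 1.0, 2.0, 3.0]
F = [2.0 * math.sqrt(9.0 - y * y) for y in radii[:-1]]
profile = onion_peel(F, radii)
```

In the interferometric application, F(y) would be the optical path difference obtained from the fringe shift, and f(r) the radial refractive-index (hence pressure) profile.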

  16. Interferogram analysis using Fourier transform techniques

    Science.gov (United States)

    Roddier, Claude; Roddier, Francois

    1987-01-01

    A method of interferogram analysis is described in which Fourier transform techniques are used to map the complex fringe visibility in several types of interferograms. Algorithms are developed for estimation of both the amplitude and the phase of the fringes (yielding the modulus and the phase of the holographically recorded object Fourier transform). The algorithms were applied to the reduction of interferometric seeing measurements (i.e., the estimation of the fringe amplitude only), and the reduction of interferometric tests (i.e., estimation of the fringe phase only). The method was used to analyze scatter-plate interferograms obtained at NOAO.
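
The core of the Fourier-transform approach, isolating the fringe carrier in the spectrum and reading off the complex fringe visibility, can be sketched as follows. The synthetic fringe signal and its parameters are assumptions for illustration, not data from the paper.

```python
import cmath, math

def dft(signal):
    """Naive DFT, adequate for a short fringe scan."""
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n)) for k in range(n)]

# Synthetic fringe scan: carrier at 8 cycles across the frame with
# fringe modulation 0.5 and a constant test phase of 0.7 rad.
n, k0, phi = 64, 8, 0.7
fringes = [1.0 + 0.5 * math.cos(2 * math.pi * k0 * t / n + phi)
           for t in range(n)]

spectrum = dft(fringes)
# The positive-frequency sideband carries the complex fringe
# visibility: its magnitude gives the amplitude, its argument the phase.
amp = abs(spectrum[k0]) * 2 / n
phase = cmath.phase(spectrum[k0])
```

For a spatially varying phase, the sideband would be windowed out, shifted to zero frequency, and inverse-transformed, yielding amplitude and phase maps as in the seeing-measurement and interferometric-test applications above.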

  17. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. 
[Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. 
Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. 
[Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of {sup 130}Te. CUORE is also suitable to search for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, to conduct such sensitive searches requires improving the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)

  18. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, the driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude- and angle-demodulation circuitry has permitted remote status monitoring of several types of medium- and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized
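
The amplitude-demodulation step at the heart of CSA can be sketched in software with a DFT-based analytic signal (a digital analogue of the demodulation circuitry described above). The carrier and modulation frequencies below are illustrative, not those of the ORNL detectors.

```python
import cmath, math

def analytic_signal(x):
    """Analytic signal via the DFT: zero the negative-frequency bins,
    double the positive ones (DC and Nyquist kept as-is), invert."""
    n = len(x)
    X = [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
         for k in range(n)]
    for k in range(n):
        if k == 0 or (n % 2 == 0 and k == n // 2):
            continue
        X[k] = 2 * X[k] if k < n / 2 else 0
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]

# Motor line current: a carrier (16 cycles per record) amplitude-
# modulated by a slow load variation (4 cycles per record).
n = 128
current = [(1.0 + 0.3 * math.cos(2 * math.pi * 4 * t / n))
           * math.cos(2 * math.pi * 16 * t / n) for t in range(n)]
envelope = [abs(z) for z in analytic_signal(current)]
```

The magnitude of the analytic signal recovers the load-induced modulation envelope, whose time and frequency signatures are what the monitoring system classifies.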

  19. The effects of progressive muscular relaxation and breathing control technique on blood pressure during pregnancy

    Directory of Open Access Journals (Sweden)

    Mahboobeh Aalami

    2016-01-01

    Full Text Available Background: Hypertensive disorders in pregnancy are the main cause of maternal and fetal mortality; however, they have no definite effective treatment. The researchers aimed to study the effects of progressive muscular relaxation and breathing control technique on blood pressure (BP) during pregnancy. Materials and Methods: This three-group clinical trial was conducted in Mashhad health centers and governmental hospitals. Sixty pregnant women (after 20 weeks of gestational age) with systolic BP ≥ 135 mmHg or diastolic BP ≥ 85 mmHg were assigned to three groups. Progressive muscular relaxation and breathing control exercises were administered to the two experimental groups once a week in person, and on the remaining days through instructions given on a CD, for 4 weeks. BP was checked before and after the interventions. In the control group, BP was measured before and after a 15-min wait without any special intervention. Results: After 4 weeks of intervention, systolic BP (from a mean of 131.3 to 117.2, P = 0.001, and from 131.05 to 120.5, P = 0.004, respectively) and diastolic BP (from a mean of 79.2 to 72.3, P = 0.001, and from 80.1 to 76.5, P = 0.047, respectively) decreased significantly in the progressive muscular relaxation and breathing control groups, but the changes were not statistically significant in the control group. Conclusions: Both interventions were effective in decreasing systolic and diastolic BP to the normal range after 4 weeks. The effects of both interventions were more marked on systolic BP than on diastolic BP.

  20. Progress of Space Charge Research on Oil-Paper Insulation Using Pulsed Electroacoustic Techniques

    Directory of Open Access Journals (Sweden)

    Chao Tang

    2016-01-01

    Full Text Available This paper focuses on the space charge behavior in oil-paper insulation systems used in power transformers. It begins with the importance of understanding the space charge behavior in oil-paper insulation systems, followed by an introduction to the pulsed electroacoustic (PEA) technique. After that, the research progress on the space charge behavior of oil-paper insulation over the past twenty years is critically reviewed. Some important aspects, such as the environmental conditions and the acoustic wave recovery, need to be addressed to acquire more accurate space charge measurement results. Some breakthroughs on the space charge behavior of oil-paper insulation materials by the research team at the University of Southampton are presented. Finally, future work on space charge measurement of oil-paper insulation materials is proposed.

  1. Progress in the biosensing techniques for trace-level heavy metals.

    Science.gov (United States)

    Mehta, Jyotsana; Bhardwaj, Sanjeev K; Bhardwaj, Neha; Paul, A K; Kumar, Pawan; Kim, Ki-Hyun; Deep, Akash

    2016-01-01

    Diverse classes of sensors have been developed over the past few decades for on-site detection of heavy metals. Most of these sensor systems have exploited optical, electrochemical, piezoelectric, and ion-selective (electrode) measurement techniques. As such, numerous efforts have been made to explore the role of biosensors in the detection of heavy metals based on well-known interactions between heavy metals and biomolecules (e.g. proteins, peptides, enzymes, antibodies, whole cells, and nucleic acids). In this review, we cover the recent progress made on different types of biosensors for the detection of heavy metals. Our major focus is the use of biomolecules for constructing these biosensors. The discussion is extended further to cover the biosensors' performance along with challenges and opportunities for practical utilization. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Recent progress in the melt-process technique of high-temperature superconductors

    CERN Document Server

    Ikuta, H; Mizutani, U

    1999-01-01

    Recently, the performance of high-temperature superconductors prepared by the melt-process technique has been greatly improved. This progress was accomplished by the addition of Ag into the starting materials of the Sm-Ba-Cu-O system, which prevents the formation of severe macro-sized cracks in the finished samples. The magnetic flux density trapped by this material has now reached 9 T at 25 K, which is comparable to the magnetic flux density produced by ordinary superconducting magnets. The amount of magnetic flux density that can be trapped by the sample is limited by the mechanical strength rather than the superconducting properties of the material. The increase in the mechanical strength of the material is important both for further improvement of the material properties and for ensuring reliability of the material in practical applications. (20 refs).

  3. Analysis Methods for Progressive Damage of Composite Structures

    Science.gov (United States)

    Rose, Cheryl A.; Davila, Carlos G.; Leone, Frank A.

    2013-01-01

    This document provides an overview of recent accomplishments and lessons learned in the development of general progressive damage analysis methods for predicting the residual strength and life of composite structures. These developments are described within their State-of-the-Art (SoA) context and the associated technology barriers. The emphasis of the authors is on developing these analysis tools for application at the structural level. Hence, modeling of damage progression is undertaken at the mesoscale, where the plies of a laminate are represented as a homogeneous orthotropic continuum. The aim of the present effort is to establish the ranges of validity of available models, to identify technology barriers, and to establish the foundations of future investigation efforts. Such are the necessary steps towards accurate and robust simulations that can replace some of the expensive and time-consuming "building block" tests that are currently required for the design and certification of aerospace structures.

  4. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues, such as route selection and design, are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in the risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at a block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors -- ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis
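
The corridor population estimate can be sketched with a toy buffer computation. The route, block centroids and populations below are hypothetical, and a real analysis would perform GIS overlays on TIGER block polygons rather than simple centroid-distance tests.

```python
import math

def point_segment_dist(p, a, b):
    """Distance from point p to segment ab in planar coordinates."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.dist(p, a)
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy)
                     / (dx * dx + dy * dy)))
    return math.dist(p, (ax + t * dx, ay + t * dy))

def corridor_population(route, blocks, half_width):
    """Total population of blocks whose centroids fall within
    `half_width` of the route polyline."""
    return sum(pop for centroid, pop in blocks
               if min(point_segment_dist(centroid, route[i], route[i + 1])
                      for i in range(len(route) - 1)) <= half_width)

# Hypothetical route (projected miles) and census-block centroids.
route = [(0, 0), (10, 0), (10, 10)]
blocks = [((1, 0.3), 120), ((5, 4.0), 80), ((10.2, 9.0), 60)]
total = corridor_population(route, blocks, half_width=0.5)
```

Re-running the same computation for a range of half-widths (0.5 to 20 miles in the paper) gives the corridor sensitivity analysis described above.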

  5. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  6. Closure of round cutaneous defects progressively with the purse string suture technique.

    Science.gov (United States)

    Küçükdurmaz, Fatih; Agir, Ismail; Gümüstas, Seyitali; Kivilcim, Hakan; Tetik, Cihangir

    2015-01-01

    There are many closure techniques available to cutaneous surgeons. One of them is the purse-string suture, which is used to provide complete or partial closure of round skin defects. In our animal study, we closed skin defects using a subcuticular purse-string suture technique, progressively cinching the wound, aiming for faster healing than secondary healing. After anaesthesia, we created a 4 cm diameter circular full-thickness skin defect on the dorsal area of rats. In group 1, a subcuticular purse-string suture was applied using a nonabsorbable monofilament suture, and a sliding arthroscopic knot was applied to both ends. The suture was advanced 1 cm every day. In group 2, the skin defect was left open and dressed daily. In both groups, defect diameters were measured every day and recorded. The skin defects were closed completely after 15 days in group 1, whereas in group 2 the defects were reduced in size but still showed a mean 1.5-cm diameter circular defect. Closing a large circular wound with a purse-string suture and gradual tightening decreases the healing time and expands the skin tissue without using any tissue expander.

  7. Data Analysis Techniques for Ligo Detector Characterization

    Science.gov (United States)

    Valdes Sanchez, Guillermo A.

    Gravitational-wave astronomy is a branch of astronomy which aims to use gravitational waves to collect observational data about astronomical objects and events such as black holes, neutron stars, supernovae, and processes including those of the early universe shortly after the Big Bang. Einstein first predicted gravitational waves in the early twentieth century, but it was not until September 14, 2015, that the Laser Interferometer Gravitational-Wave Observatory (LIGO) directly observed the first gravitational waves in history. LIGO consists of two twin detectors, one in Livingston, Louisiana and another in Hanford, Washington. Instrumental and sporadic noises limit the sensitivity of the detectors. Scientists conduct data quality studies to distinguish a gravitational-wave signal from the noise, and new techniques are continuously developed to identify, mitigate, and veto unwanted noise. This work presents the application of data analysis techniques, such as the Hilbert-Huang transform (HHT) and Kalman filtering (KF), in LIGO detector characterization. We investigated the application of the HHT to characterize the gravitational-wave signal of the first detection, we demonstrated the ability of the HHT to identify noise originating from light scattered by perturbed surfaces, and we estimated thermo-optical aberration using KF. We paid particular attention to the scattering application, for which a tool was developed to identify disturbed surfaces that originate scattering noise. The results considerably reduced the time needed to search for the scattering surface and helped LIGO commissioners mitigate the noise.
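
The recursive estimation idea behind Kalman filtering can be illustrated with a minimal scalar filter; the random-walk state model, noise parameters and data below are illustrative, not LIGO's thermo-optical aberration estimator.

```python
def kalman_1d(measurements, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Scalar Kalman filter with a random-walk state model:
    q = process-noise variance, r = measurement-noise variance,
    (x0, p0) = initial state estimate and its variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p += q                  # predict: state uncertainty grows
        k = p / (p + r)         # Kalman gain
        x += k * (z - x)        # update with the innovation z - x
        p *= (1.0 - k)          # shrink the posterior variance
        estimates.append(x)
    return estimates

# Noisy measurements of a slowly varying (here constant) level of 1.0.
zs = [1.2, 0.9, 1.1, 0.8, 1.05, 1.0, 0.95, 1.1]
est = kalman_1d(zs)
```

Each update blends the prediction with the new measurement in proportion to their uncertainties, which is what makes the filter suitable for tracking slowly drifting quantities buried in noise.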

  8. Gene expression analysis of relapsing– remitting, primary progressive and secondary progressive multiple sclerosis

    DEFF Research Database (Denmark)

    Ratzer, R; Søndergaard, Helle Bach; Christensen, Jeppe Romme

    2013-01-01

    Previous studies of multiple sclerosis (MS) have indicated differences in the pathogenesis in relapsing-remitting (RRMS), secondary progressive (SPMS) and primary progressive (PPMS) disease.

  9. Chromatographic screening techniques in systematic toxicological analysis.

    Science.gov (United States)

    Drummer, O H

    1999-10-15

    A review of techniques used to screen biological specimens for the presence of drugs was conducted with particular reference to systematic toxicological analysis. Extraction systems of both the liquid-liquid and solid-phase type show little apparent difference in their relative ability to extract a range of drugs according to their physico-chemical properties, although mixed-phase SPE extraction is preferred for GC-based applications, and liquid-liquid extraction is preferred for HPLC-based applications. No single chromatographic system has been shown to be capable of detecting a full range of common drugs of abuse and common ethical drugs; hence two or more assays are required for laboratories wishing to cover a reasonably comprehensive range of drugs of toxicological significance. While immunoassays are invariably used to screen for drugs of abuse, chromatographic systems relying on derivatization and capable of extracting both acidic and basic drugs would be capable of screening a limited range of targeted drugs. Drugs most difficult to detect in systematic toxicological analysis include LSD, psilocin, THC and its metabolites, fentanyl and its designer derivatives, some potent opiates, potent benzodiazepines and some potent neuroleptics, many of the newer anti-convulsants, the alkaloids colchicine and amanitins, aflatoxins, antineoplastics, coumarin-based anti-coagulants, and a number of cardiovascular drugs. The widespread use of LC-MS and LC-MS-MS for specific drug detection and the emergence of capillary electrophoresis linked to MS and MS-MS provide an exciting possibility for the future to increase the range of drugs detected in any one chromatographic screening system.

  10. [Clinical comparative analysis for pulmonary histoplasmosis and progressive disseminated histoplasmosis].

    Science.gov (United States)

    Zhang, Yan; Su, Xiaoli; Li, Yuanyuan; He, Ruoxi; Hu, Chengping; Pan, Pinhua

    2016-12-28

    To compare clinical features, diagnosis and therapeutic effect between pulmonary histoplasmosis and progressive disseminated histoplasmosis.
 Methods: A retrospective analysis of 12 hospitalized patients with histoplasmosis, admitted to Xiangya Hospital, Central South University from February 2009 to October 2015, was carried out. Four cases of pulmonary histoplasmosis and 8 cases of progressive disseminated histoplasmosis were included. Differences in clinical features, imaging tests, means of diagnosis and prognosis were analyzed between the two types of histoplasmosis.
 Results: The clinical manifestations of pulmonary histoplasmosis were mild, such as dry cough. However, the main clinical symptoms of progressive disseminated histoplasmosis were severe, including recurrent high fever, superficial lymph node enlargement over the whole body, and hepatosplenomegaly, accompanied by cough, abdominal pain, joint pain, skin changes, etc. Laboratory examination showed pancytopenia, abnormal liver function and abnormal coagulation function. One pulmonary case received a left lower lung lobectomy; 3 cases of pulmonary histoplasmosis and 6 cases of progressive disseminated histoplasmosis were given deoxycholate amphotericin B, itraconazole, voriconazole or fluconazole as antifungal therapy. One disseminated case was discharged from the hospital without treatment after diagnosis of histoplasmosis, and one disseminated case, complicated by severe pneumonia and active tuberculosis, ultimately died.
 Conclusion: As a rare fungal infection, histoplasmosis is easily misdiagnosed. Diagnosis depends on etiological evidence from bone marrow smears and tissue biopsy. Liposomal amphotericin B, deoxycholate amphotericin B and itraconazole are recommended to treat Histoplasma capsulatum infection.

  11. [Research progresses of anabolic steroids analysis in doping control].

    Science.gov (United States)

    Long, Yuanyuan; Wang, Dingzhong; Li, Ke'an; Liu, Feng

    2008-07-01

    Anabolic steroids, a kind of physiologically active substance, are widely abused to improve athletic performance in human sports. They have been forbidden in sports by the International Olympic Committee since 1983. Since then, many researchers have focused their attention on establishing reliable detection methods. In this paper, we review the research progress of different analytical methods for anabolic steroids since 2002, such as gas chromatography-mass spectrometry, liquid chromatography-mass spectrometry, immunoassay, electrochemical analysis and mass spectrometry. The development prospects of anabolic steroid analysis are also discussed.

  12. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics is important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
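    The FFT/PSD-based stability check described above can be sketched numerically. The snippet below is a minimal illustration, not the authors' implementation: it estimates a periodogram-style PSD for a synthetic flame-intensity signal carrying a hypothetical 120 Hz thermoacoustic oscillation, and locates the dominant spectral peak.

```python
import numpy as np

def flame_psd(signal, fs):
    """Estimate the power spectral density of a flame-intensity
    time series via a simple Hann-windowed periodogram."""
    n = len(signal)
    window = np.hanning(n)
    # Remove the mean so the DC component does not dominate the spectrum.
    x = (signal - signal.mean()) * window
    spectrum = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = (np.abs(spectrum) ** 2) / (fs * (window ** 2).sum())
    return freqs, psd

# Synthetic flame signal: a 120 Hz thermoacoustic oscillation plus noise.
fs = 2000.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 120 * t) + 0.3 * rng.standard_normal(t.size)

freqs, psd = flame_psd(signal, fs)
dominant = freqs[np.argmax(psd)]
print(f"dominant oscillation: {dominant:.0f} Hz")
```

    In a real diagnostic pipeline, `signal` would be the mean pixel intensity of successive infrared frames rather than a synthetic sine wave.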

  13. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an Ion Beam Analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations for each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) is used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. The pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter.
As a result, elemental concentrations were found for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  14. Extracellular Vesicle Heterogeneity: Subpopulations, Isolation Techniques, and Diverse Functions in Cancer Progression

    Directory of Open Access Journals (Sweden)

    Eduard Willms

    2018-04-01

    Full Text Available Cells release membrane enclosed nano-sized vesicles termed extracellular vesicles (EVs that function as mediators of intercellular communication by transferring biological information between cells. Tumor-derived EVs have emerged as important mediators in cancer development and progression, mainly through transfer of their bioactive content which can include oncoproteins, oncogenes, chemokine receptors, as well as soluble factors, transcripts of proteins and miRNAs involved in angiogenesis or inflammation. This transfer has been shown to influence the metastatic behavior of primary tumors. Moreover, tumor-derived EVs have been shown to influence distant cellular niches, establishing favorable microenvironments that support growth of disseminated cancer cells upon their arrival at these pre-metastatic niches. It is generally accepted that cells release a number of major EV populations with distinct biophysical properties and biological functions. Exosomes, microvesicles, and apoptotic bodies are EV populations most widely studied and characterized. They are discriminated based primarily on their intracellular origin. However, increasing evidence suggests that even within these EV populations various subpopulations may exist. This heterogeneity introduces an extra level of complexity in the study of EV biology and function. For example, EV subpopulations could have unique roles in the intricate biological processes underlying cancer biology. Here, we discuss current knowledge regarding the role of subpopulations of EVs in cancer development and progression and highlight the relevance of EV heterogeneity. The position of tetraspanins and integrins therein will be highlighted. Since addressing EV heterogeneity has become essential for the EV field, current and novel techniques for isolating EV subpopulations will also be discussed. Further dissection of EV heterogeneity will advance our understanding of the critical roles of EVs in health and

  15. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    Neergaard, Helle; Leitch, Claire

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  16. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  17. Visual Evaluation Techniques for Skill Analysis.

    Science.gov (United States)

    Brown, Eugene W.

    1982-01-01

    Visual evaluation techniques provide the kinesiologist with a method of evaluating physical skill performance. The techniques are divided into five categories: (1) vantage point; (2) movement simplification; (3) balance and stability; (4) movement relationships; and (5) range of movement. (JN)

  18. Detection of progression of glaucomatous retinal nerve fibre layer defects using optical coherence tomography-guided progression analysis.

    Science.gov (United States)

    Hwang, Young Hoon; Kim, Min Kyung; Wi, Jae Min; Chung, Jae Keun; Lee, Kwan Bok

    2018-01-01

    The aim was to investigate the agreement for detection of progression of glaucomatous retinal nerve fibre layer defects (RNFLD) between optical coherence tomography-guided progression analysis (OCT GPA) and conventional red-free fundus photography. Four hundred and fifteen glaucomatous eyes that underwent at least four serial red-free photographic and OCT examinations were included in the study. Based on the inspection of the red-free fundus photographs and GPA maps, RNFLD progression was defined as the development of a new defect, widening or deepening of a pre-existing RNFLD in red-free fundus photography (photographic progression) or 'Likely Loss' on a GPA map (GPA progression). The agreement of photographic and OCT GPA progression and the factors influencing it, including refractive error, severity of glaucoma (mean deviation of the visual field), type of RNFLD (localised versus diffuse), width of the baseline RNFLD, type of RNFLD progression (new defect, widening, deepening) and location of RNFLD progression (clock-hour sector) were assessed. Among the 415 eyes, 82 (19.8 per cent) showed photographic or GPA progression. Among the 82 eyes with progression, progression was detected only in red-free fundus photography in nine (11.0 per cent) eyes and only in GPA in 32 (39.0 per cent) eyes. In 41 eyes (50.0 per cent), progression was detected with both methods. Detection of RNFLD progression only in GPA was associated with a higher myopia, diffuse RNFLD, deepening of the RNFLD and RNFLD progression at the 6, 9 and 12 o'clock positions (p < 0.05). OCT GPA may be a useful supplement to conventional red-free fundus photography for detecting RNFLD progression. © 2017 Optometry Australia.
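    As a rough illustration of the agreement figures reported above, the sketch below reconstructs the 2×2 contingency table from the stated counts (41 eyes progressing on both methods, 9 on photography only, 32 on GPA only, out of 415) and computes the observed agreement and Cohen's kappa; the kappa statistic is our addition and is not reported in the abstract.

```python
# 2x2 agreement table reconstructed from the reported counts.
both_pos, photo_only, gpa_only, total = 41, 9, 32, 415
both_neg = total - (both_pos + photo_only + gpa_only)

po = (both_pos + both_neg) / total           # observed agreement
photo_pos = both_pos + photo_only            # positives by photography
gpa_pos = both_pos + gpa_only                # positives by OCT GPA
pe = (photo_pos * gpa_pos +                  # agreement expected by chance
      (total - photo_pos) * (total - gpa_pos)) / total ** 2
kappa = (po - pe) / (1 - pe)
print(f"observed agreement {po:.3f}, Cohen's kappa {kappa:.3f}")
```

    On these numbers the two methods agree on about 90 per cent of eyes, but most of that agreement comes from the large number of non-progressing eyes, which is why a chance-corrected statistic such as kappa is informative.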

  19. Progress Report On Techniques Deriving Land Cover And Earth Surface Deformation Information From Polarimetric SAR Interferometry

    Science.gov (United States)

    Pottier, E.; Chen, E.; Li, Z.; Hong, W.; Xiang, M.; Cloude, S. R.; Papathanassiou, K.; Cao, F.; Zhang, H.

    2010-10-01

    In this paper we provide an update of activities carried out under the DRAGON collaborative program in a project concerned with the application of Pol-InSAR to deriving land cover and Earth surface deformation information. This project (ID. 5344) is based around four main scientific topics: Land Cover Analysis, Earth Surface Deformation Monitoring and DEM Extraction, Forest Vertical Structure Parameter Extraction, and PolSARpro Software Development. We propose a brief summary of the project objectives and the progress to date of each Work Package, concentrating on recent developments, original results and important highlights that were presented during the Dragon2 Mid-Term Results Symposium, held on 17-21 May 2010 in Yangshuo, Guilin, P.R. China.

  20. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics, and computer vision as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the importance of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  1. Progress report on a fast, particle-identifying trigger based on ring-imaging Cherenkov techniques

    International Nuclear Information System (INIS)

    Carroll, J.; Igo, G.; Jacobs, P.; Matis, H.; Naudet, C.; Schroeder, L.S.; Seidl, P.A.; Hallman, T.J.

    1990-01-01

    Experiments which require a large sample of relatively rare events need an efficient (low dead time) trigger that does more than select central collisions. The authors propose to develop a trigger that will permit sophisticated multi-particle identification on a time scale appropriate for the interaction rates expected at RHIC. The visible component of the ring-image produced by an appropriate Cherenkov-radiator-mirror combination is focused onto an array of fast photo-detectors. The output of the photo-array is coupled to a fast pattern recognition system that will identify events containing particles of specified types and angular configurations. As a parallel effort, they propose to develop a spectrum-splitting mirror that will permit the ring-image from a single radiator to be used both in this trigger (the visible component of the image) and in a TMAE containing gas detector (the UV component). The gas detector will provide higher resolution information on particle ID and direction with a delay of a few microseconds. This technique will enable nearly optimal use of the information contained in the Cherenkov spectrum. The authors report progress on the three goals set forth in the proposal: 1. the development of a fast photo-array; 2. the development of a spectrum splitting mirror; and 3. the development and simulation of fast parallel algorithms for ring finding

  2. Recent progress in genome engineering techniques in the silkworm, Bombyx mori.

    Science.gov (United States)

    Daimon, Takaaki; Kiuchi, Takashi; Takasu, Yoko

    2014-01-01

    Rapid advances in genome engineering tools, such as zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly interspaced palindromic repeats/CRISPR-associated (CRISPR/Cas) system, have enabled efficient gene knockout experiments in a wide variety of organisms. Here, we review the recent progress in targeted gene disruption techniques in the silkworm, Bombyx mori. Although efficiency of targeted mutagenesis was very low in an early experiment using ZFNs, recent studies have shown that TALENs can induce highly efficient mutagenesis of desired target genes in Bombyx. Notably, mutation frequencies induced by TALENs can reach more than 50% of G0 gametes. Thus, TALENs can now be used as a standard tool for gene targeting studies, even when mutant phenotypes are unknown. We also propose guidelines for experimental design and strategy for knockout experiments in Bombyx. Genome editing technologies will greatly increase the usefulness of Bombyx as a model for lepidopteran insects, the major agricultural pests, and lead to sophisticated breeding of Bombyx for use in sericulture and biotechnology. © 2013 The Authors Development, Growth & Differentiation © 2013 Japanese Society of Developmental Biologists.

  3. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    Energy Technology Data Exchange (ETDEWEB)

    Mirizzi, F. [Associazione EURATOM-ENEA sulla Fusione, Consorzio CREATE, Università degli Studi di Napoli Federico II, Via Claudio 21, 80125, Napoli (Italy)

    2014-02-12

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to assure satisfactory performance and lifetime. In the design of experimental facilities such as tokamaks, mainly aimed at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed for steadily delivering electrical energy to commercial grids, so the RAMI aspects will assume absolute relevance from the initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in the paper.

  4. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    International Nuclear Information System (INIS)

    Mirizzi, F.

    2014-01-01

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to assure satisfactory performance and lifetime. In the design of experimental facilities such as tokamaks, mainly aimed at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed for steadily delivering electrical energy to commercial grids, so the RAMI aspects will assume absolute relevance from the initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in the paper

  5. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    Science.gov (United States)

    Mirizzi, F.

    2014-02-01

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to assure satisfactory performance and lifetime. In the design of experimental facilities such as tokamaks, mainly aimed at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed for steadily delivering electrical energy to commercial grids, so the RAMI aspects will assume absolute relevance from the initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in the paper.

  6. Nano-Aptasensing in Mycotoxin Analysis: Recent Updates and Progress

    Directory of Open Access Journals (Sweden)

    Amina Rhouati

    2017-10-01

    Full Text Available Recent years have witnessed an overwhelming integration of nanomaterials in the fabrication of biosensors. Nanomaterials have been incorporated with the objective of achieving better analytical figures of merit in terms of limit of detection, linear range, assay stability, low production cost, etc. Nanomaterials can act as immobilization supports, signal amplifiers, mediators and artificial enzyme labels in the construction of aptasensors. We aim in this work to review the recent progress in mycotoxin analysis. This review emphasizes the function of the different nanomaterials in aptasensor architecture. We subsequently relate their features to the analytical performance of the given aptasensor towards mycotoxin monitoring. In the same context, a critical analysis of the level of success of each nano-aptasensing design will be given. Finally, current challenges in nano-aptasensing design for mycotoxin analysis will be highlighted.

  7. Hurdles run technique analysis in the 400m hurdles

    OpenAIRE

    Drtina, Martin

    2010-01-01

    Hurdles run technique analysis in the 400m hurdles Thesis objectives: The main objective is to compare hurdle-clearance technique at race tempo over the 400 m hurdles for the selected probands. The tasks are to identify kinematic parameters separately for each proband and to identify their weaknesses in technique. Method: Analysis of hurdle-clearance technique was done using 3D kinematic analysis. The observed space-time events were recorded with two digital cameras. The records were transferred to a suitable...

  8. Computer simulation, nuclear techniques and surface analysis

    Directory of Open Access Journals (Sweden)

    Reis, A. D.

    2010-02-01

    Full Text Available This article is about computer simulation and surface analysis by nuclear techniques, which are non-destructive. The “energy method of analysis” for nuclear reactions is used. Energy spectra are computer simulated and compared with experimental data, giving target composition and concentration profile information. Details of the prediction stages are given for thick flat target yields. Predictions are made for non-flat targets having asymmetric triangular surface contours. The method is successfully applied to depth profiling of 12C and 18O nuclei in thick targets, by deuteron (d,p) and proton (p,α) induced reactions, respectively.


  9. Cochlear implant simulator for surgical technique analysis

    Science.gov (United States)

    Turok, Rebecca L.; Labadie, Robert F.; Wanna, George B.; Dawant, Benoit M.; Noble, Jack H.

    2014-03-01

    Cochlear Implant (CI) surgery is a procedure in which an electrode array is inserted into the cochlea. The electrode array is used to stimulate auditory nerve fibers and restore hearing for people with severe to profound hearing loss. The primary goals when placing the electrode array are to fully insert the array into the cochlea while minimizing trauma to the cochlea. Studying the relationship between surgical outcome and various surgical techniques has been difficult since trauma and electrode placement are generally unknown without histology. Our group has created a CI placement simulator that combines an interactive 3D visualization environment with a haptic-feedback-enabled controller. Surgical techniques and patient anatomy can be varied between simulations so that outcomes can be studied under varied conditions. With this system, we envision that through numerous trials we will be able to statistically analyze how outcomes relate to surgical techniques. As a first test of this system, in this work, we have designed an experiment in which we compare the spatial distribution of forces imparted to the cochlea in the array insertion procedure when using two different but commonly used surgical techniques for cochlear access, called round window and cochleostomy access. Our results suggest that CIs implanted using round window access may cause less trauma to deeper intracochlear structures than cochleostomy techniques. This result is of interest because it challenges traditional thinking in the otological community but might offer an explanation for recent anecdotal evidence that suggests that round window access techniques lead to better outcomes.

  10. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)

    Abstract. Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.
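    The multidimensional scaling technique referred to above can be sketched with a minimal classical (Torgerson) MDS implementation; the toy distance matrix below is hypothetical and merely stands in for the inter-observatory data analysed in the paper.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) multidimensional scaling: embed points in
    k dimensions from a matrix of pairwise distances d."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    b = -0.5 * j @ (d ** 2) @ j                  # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)
    order = np.argsort(vals)[::-1][:k]           # largest eigenvalues first
    scale = np.sqrt(np.clip(vals[order], 0, None))
    return vecs[:, order] * scale

# Toy example: distances between four points on a line at 0, 1, 2, 3.
pts = np.array([0.0, 1.0, 2.0, 3.0])
d = np.abs(pts[:, None] - pts[None, :])
coords = classical_mds(d, k=1)
# The recovered 1-D coordinates reproduce the original pairwise distances.
recovered = np.abs(coords[:, 0][:, None] - coords[:, 0][None, :])
print(np.allclose(recovered, d))
```

    The embedding is recovered only up to rotation and sign, which is why the check compares pairwise distances rather than raw coordinates.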

  11. Posttraumatic progressive cubitus varus deformity managed by lateral column shortening: A novel surgical technique.

    Science.gov (United States)

    Srivastava, Amit; Jain, Anil-Kumar; Dhammi, Ish Kumar; Haq, Rehan-Ul

    2016-08-01

    Cubitus varus deformity is an outward angulation of the elbow with the forearm supinated. This deformity is often seen as a sequela of malunited supracondylar fracture of the humerus in the paediatric age group of 5-8 years. The deformity is usually non-progressive, but in cases of physeal injury or congenital bony bar formation in the medial condyle of the humerus, the deformity is progressive and can be grotesque in appearance. Various types of osteotomies are described for the standard non-progressive cubitus varus deformity, while multiple surgeries are required for the progressive deformity until skeletal maturity. In this study we describe a novel surgical approach and osteotomy of the distal humerus in a 5-year-old boy with a grotesque progressive cubitus varus deformity, achieving a good surgical outcome.

  12. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications-now expanded and revised.This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.This edi

  13. Systematic Epigenomic Analysis Reveals Chromatin States Associated with Melanoma Progression.

    Science.gov (United States)

    Fiziev, Petko; Akdemir, Kadir C; Miller, John P; Keung, Emily Z; Samant, Neha S; Sharma, Sneha; Natale, Christopher A; Terranova, Christopher J; Maitituoheti, Mayinuer; Amin, Samirkumar B; Martinez-Ledesma, Emmanuel; Dhamdhere, Mayura; Axelrad, Jacob B; Shah, Amiksha; Cheng, Christine S; Mahadeshwar, Harshad; Seth, Sahil; Barton, Michelle C; Protopopov, Alexei; Tsai, Kenneth Y; Davies, Michael A; Garcia, Benjamin A; Amit, Ido; Chin, Lynda; Ernst, Jason; Rai, Kunal

    2017-04-25

    The extent and nature of epigenomic changes associated with melanoma progression is poorly understood. Through systematic epigenomic profiling of 35 epigenetic modifications and transcriptomic analysis, we define chromatin state changes associated with melanomagenesis by using a cell phenotypic model of non-tumorigenic and tumorigenic states. Computation of specific chromatin state transitions showed loss of histone acetylations and H3K4me2/3 on regulatory regions proximal to specific cancer-regulatory genes in important melanoma-driving cell signaling pathways. Importantly, such acetylation changes were also observed between benign nevi and malignant melanoma human tissues. Intriguingly, only a small fraction of chromatin state transitions correlated with expected changes in gene expression patterns. Restoration of acetylation levels on deacetylated loci by histone deacetylase (HDAC) inhibitors selectively blocked excessive proliferation in tumorigenic cells and human melanoma cells, suggesting functional roles of observed chromatin state transitions in driving hyperproliferative phenotype. Through these results, we define functionally relevant chromatin states associated with melanoma progression. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  14. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)


    the amplitude of H decreases progressively with increasing latitudes at the Indian chain of observatories (Rastogi et al 1997). The aim of this study is to apply the method of multidimensional scaling technique to examine the accuracy of results in comparison with the conventional method of correlation coefficients in the ...

  15. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  16. Defect analysis program for LOFT. Progress report, 1977

    International Nuclear Information System (INIS)

    Doyle, R.E.; Scoonover, T.M.

    1978-03-01

    In order to alleviate problems encountered while performing previous defect analyses on components of the LOFT system, regions of LOFT most likely to require defect analysis have been identified. A review of available documentation has been conducted to identify shapes, sizes, materials, and welding procedures and to compile mechanical property data. The LOFT Reactor Vessel Material Surveillance Program has also been reviewed, and a survey of available literature describing existing techniques for conducting elastic-plastic defect analysis was initiated. While large amounts of mechanical property data were obtained from the available documentation and the literature, much information was not available, especially for weld heat-affected zones. Therefore, a program of mechanical property testing is recommended for FY-78 as well as continued literature search. It is also recommended that fatigue-crack growth-rate data be sought from the literature and that evaluation of the various techniques of elastic-plastic defect analysis be continued. Review of additional regions of the LOFT system in the context of potential defect analysis will be conducted as time permits

  17. Recent trends in particle size analysis techniques

    Science.gov (United States)

    Kang, S. H.

    1984-01-01

    Recent advances and developments in particle-sizing technologies are briefly reviewed according to three operating principles, including particle size and shape descriptions. Significant trends in recently developed particle size analysis equipment show that compact electronic circuitry and rapid data processing systems were mainly adopted in instrument design. Some newly developed techniques for characterizing particulate systems are also introduced.

  18. Inverse Filtering Techniques in Speech Analysis | Nwachuku ...

    African Journals Online (AJOL)

    'inverse filtering' has been applied. The unifying features of these techniques are presented, namely: 1. a basis in the source-filter theory of speech production, 2. the use of a network whose transfer function is the inverse of the transfer function of ...

  19. Adhesive Characterization and Progressive Damage Analysis of Bonded Composite Joints

    Science.gov (United States)

    Girolamo, Donato; Davila, Carlos G.; Leone, Frank A.; Lin, Shih-Yung

    2014-01-01

    The results of an experimental/numerical campaign aimed to develop progressive damage analysis (PDA) tools for predicting the strength of a composite bonded joint under tensile loads are presented. The PDA is based on continuum damage mechanics (CDM) to account for intralaminar damage, and cohesive laws to account for interlaminar and adhesive damage. The adhesive response is characterized using standard fracture specimens and digital image correlation (DIC). The displacement fields measured by DIC are used to calculate the J-integrals, from which the associated cohesive laws of the structural adhesive can be derived. A finite element model of a sandwich conventional splice joint (CSJ) under tensile loads was developed. The simulations indicate that the model is capable of predicting the interactions of damage modes that lead to the failure of the joint.

  20. Progressive Damage and Failure Analysis of Composite Laminates

    Science.gov (United States)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance and material property tailorability. To fully exploit the capability of composites, it is required to know the load carrying capacity of the parts made of them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them hard to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon and component level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon level tests to fully characterize the behavior. This makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools which can predict the complex failure mechanisms accurately. This reduces the cost to the associated computational expense, making for significant savings. Some of the most desired features in a virtual testing tool are: (1) Accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must match that observed in experiments, and a tool has to be assessed based on the mechanisms it can capture. (2) Computational efficiency: the greatest advantages of virtual tools are the savings in time and money, and hence computational efficiency is one of the most needed features. (3) Applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions, including static, dynamic and fatigue conditions, and a good virtual testing tool should be able to make good predictions for all of them. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis

  1. TU-EF-BRD-02: Indicators and Technique Analysis

    International Nuclear Information System (INIS)

    Carlone, M.

    2015-01-01

    peer-reviewed research will be used to highlight the main points. Historically, medical physicists have leveraged many areas of applied physics, engineering and biology to improve radiotherapy. Research on quality and safety is another area where physicists can have an impact. The key to further progress is to clearly define what constitutes quality and safety research, both for those interested in doing such research and for the reviewers of that research. Learning Objectives: List several tools of quality and safety with references to peer-reviewed literature. Describe effects of mental workload on performance. Outline research in quality and safety indicators and technique analysis. Understand what quality and safety research needs to be going forward. Understand the links between cooperative group trials and quality and safety research

  2. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  3. An analysis of induction motor testing techniques

    International Nuclear Information System (INIS)

    Soergel, S.

    1996-01-01

    There are two main failure mechanisms in induction motors: bearing-related and stator-related. The Electric Power Research Institute (EPRI) conducted a study, completed in 1985, which found that nearly 37% of all failures were attributed to stator problems. Another data source for motor failures is the Nuclear Plant Reliability Data System (NPRDS). This database reveals that approximately 55% of all motors were identified as being degraded before failure occurred. Of these, approximately 35% were due to electrical faults. These are the faults which this paper will attempt to identify through testing techniques. This paper is a discussion of the current techniques used to predict incipient failure of induction motors. In the past, the main tests were those to assess the integrity of the ground insulation. However, most insulation failures are believed to involve turn or strand insulation, which makes traditional tests alone inadequate for condition assessment. Furthermore, these tests have several limitations which need consideration when interpreting the results. This paper will concentrate on predictive maintenance techniques which detect electrical problems. It will present appropriate methods and tests, and discuss the strengths and weaknesses of each

  4. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; hide

    2015-01-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used an 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is better by a factor of two than achievable with digital optimal filters.
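
    The decomposition described above can be sketched in a few lines. This is a minimal illustration of PCA on pulse records, not the authors' pipeline; the exponential pulse template and function name are assumptions for the toy example.

```python
import numpy as np

def pca_scores(pulses, n_comp):
    """Project pulse records onto their leading principal components.

    `pulses` is an (n_records, n_samples) array.  After removing the
    mean pulse, the leading right singular vectors are the orthogonal
    components with the largest variance; the scores on them encode
    pulse height, arrival time, shape changes, etc.
    """
    X = pulses - pulses.mean(axis=0)            # remove the mean pulse
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_comp].T                    # scores on leading components

# toy pulses: an exponential template scaled by a random pulse height
rng = np.random.default_rng(0)
t = np.arange(200)
template = np.exp(-t / 40.0)
amps = rng.uniform(1.0, 2.0, size=100)
scores = pca_scores(amps[:, None] * template[None, :], 2)
# for this rank-one toy data the first score tracks pulse height
```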

  5. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...

  6. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-01-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches. PMID:18496587

  7. A Dimensionality Reduction Technique for Efficient Time Series Similarity Analysis.

    Science.gov (United States)

    Wang, Qiang; Megalooikonomou, Vasileios

    2008-03-01

    We propose a dimensionality reduction technique for time series analysis that significantly improves the efficiency and accuracy of similarity searches. In contrast to piecewise constant approximation (PCA) techniques that approximate each time series with constant value segments, the proposed method--Piecewise Vector Quantized Approximation--uses the closest (based on a distance measure) codeword from a codebook of key-sequences to represent each segment. The new representation is symbolic and it allows for the application of text-based retrieval techniques into time series similarity analysis. Experiments on real and simulated datasets show that the proposed technique generally outperforms PCA techniques in clustering and similarity searches.
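
    The segment-to-codeword assignment at the heart of this representation can be sketched as follows. The two-entry codebook here is hand-made for illustration; in the method it would be built from key-sequences of the data.

```python
import numpy as np

def pvqa_encode(series, seg_len, codebook):
    """Encode a time series as a sequence of codeword indices.

    Each consecutive segment of length `seg_len` is replaced by the
    index of the nearest codeword (Euclidean distance), yielding a
    symbolic, low-dimensional representation of the series.
    """
    n_segs = len(series) // seg_len
    segs = np.asarray(series[: n_segs * seg_len], dtype=float).reshape(n_segs, seg_len)
    # distance from every segment to every codeword, then nearest index
    dists = np.linalg.norm(segs[:, None, :] - codebook[None, :, :], axis=2)
    return dists.argmin(axis=1)

# codebook with a "low" and a "high" key-sequence
codebook = np.array([[0.0, 0.0], [1.0, 1.0]])
symbols = pvqa_encode([0.1, -0.1, 0.9, 1.1, 0.0, 0.2], 2, codebook)  # [0 1 0]
```

    Because the output is symbolic, two series can then be compared with text-retrieval machinery (e.g., comparing their symbol strings) instead of raw Euclidean distance.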

  8. Trend Filtering Techniques for Time Series Analysis

    OpenAIRE

    López Arias, Daniel

    2016-01-01

    Time series can be found almost everywhere in our lives and because of this being capable of analysing them is an important task. Most of the time series we can think of are quite noisy, being this one of the main problems to extract information from them. In this work we use Trend Filtering techniques to try to remove this noise from a series and understand the underlying trend of the series, that gives us information about the behaviour of the series aside from the particular...
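
    As a minimal illustration of the goal (separating a noisy series into trend plus residual), a centered moving average already does the job crudely; the trend-filtering estimators studied in the thesis are more sophisticated, and this sketch is not one of them.

```python
import numpy as np

def moving_average_trend(series, window):
    """Smooth a series with a centered moving average of width `window`."""
    kernel = np.ones(window) / window
    return np.convolve(series, kernel, mode="same")

# noisy sinusoid: the smoother recovers the slow trend, leaving noise behind
rng = np.random.default_rng(1)
noisy = np.sin(np.linspace(0.0, 3.0, 200)) + 0.1 * rng.normal(size=200)
trend = moving_average_trend(noisy, 21)
residual = noisy - trend
```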

  9. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...


  10. Dynamic speckle analysis using multivariate techniques

    International Nuclear Information System (INIS)

    López-Alonso, José M; Alda, Javier; Rabal, Héctor; Grumel, Eduardo; Trivi, Marcelo

    2015-01-01

    In this work we use principal components analysis to characterize dynamic speckle patterns. This analysis quantitatively identifies different dynamics that could be associated to physical phenomena occurring in the sample. We also found the contribution explained by each principal component, or by a group of them. The method analyzes the paint drying process over a hidden topography. It can be used for fast screening and identification of different dynamics in biological or industrial samples by means of dynamic speckle interferometry. (paper)

  11. Advanced Imaging Techniques for Multiphase Flows Analysis

    Science.gov (United States)

    Amoresano, A.; Langella, G.; Di Santo, M.; Iodice, P.

    2017-08-01

    Advanced numerical techniques, such as fuzzy logic and neural networks, have been applied in this work to digital images acquired from two applications, a centrifugal pump and a stationary spray, in order to define, in a stochastic way, the evolution of the gas-liquid interface. Starting from the numeric matrix representing the image, it is possible to characterize geometrical parameters and the time evolution of the jet. The algorithm uses fuzzy logic to binarize the chromaticity of the pixels, exploiting the difference in light scattering between the gas and the liquid phase. Starting from a primary fixed threshold, the technique can separate 'gas' pixels from 'liquid' pixels, making it possible to define the most probable boundary lines of the spray. By acquiring images continuously at a fixed frame rate, a finer threshold can be selected and, in the limit, the most probable geometrical parameters of the jet can be detected.
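
    In its simplest crisp form, the pixel classification reduces to thresholding; this sketch omits the fuzzy-logic refinement of the threshold described above, and the toy frame is hypothetical.

```python
import numpy as np

def binarize(image, threshold):
    """Label each pixel as gas (0) or liquid (1) by simple thresholding.

    Pixels scattering more light than `threshold` are taken as liquid;
    the boundary of the liquid region then approximates the gas-liquid
    interface.
    """
    return (np.asarray(image, dtype=float) > threshold).astype(np.uint8)

mask = binarize([[0.1, 0.8], [0.9, 0.2]], 0.5)  # [[0 1], [1 0]]
```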

  12. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  13. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  14. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
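
    The Monte Carlo parameter variation can be sketched as follows; the lambda below is a toy stand-in for the real unfold algorithm, which is not reproduced here, and the channel values are hypothetical.

```python
import numpy as np

def mc_flux_uncertainty(voltages, sigma, unfold, n_trials=1000, seed=0):
    """Monte Carlo error propagation through an unfold algorithm.

    Each trial perturbs the measured channel voltages with Gaussian
    errors (one-sigma widths `sigma`) and runs the unfold; the mean and
    spread of the resulting fluxes give the value and its error bar.
    """
    rng = np.random.default_rng(seed)
    trials = voltages + rng.normal(0.0, sigma, size=(n_trials, len(voltages)))
    fluxes = np.array([unfold(v) for v in trials])
    return fluxes.mean(), fluxes.std()

# 18 channels, 5% absolute-calibration error, toy unfold = summed signal
mean_flux, flux_err = mc_flux_uncertainty(
    np.ones(18), 0.05 * np.ones(18), lambda v: v.sum()
)
```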

  15. 48 CFR 215.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for...

  16. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... ensure a fair and reasonable price. Examples of such techniques include, but are not limited to, the... to the cost or price analysis of the service or product being proposed should also be included in the... techniques. (a) General. The objective of proposal analysis is to ensure that the final agreed-to price is...

  17. Recent Progress in Application of Internal Oxidation Technique in Nb3Sn Strands

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Xingchen [Fermilab; Peng, Xuan [Hyper Tech Research Inc.; Sumption, Michael [Ohio State U.; Collings, E. W. [Ohio State U.

    2016-10-13

    The internal oxidation technique can generate ZrO2 nano particles in Nb3Sn strands, which markedly refine the Nb3Sn grain size and boost the high-field critical current density (Jc). This article summarizes recent efforts on implementing this technique in practical Nb3Sn wires and adding Ti as a dopant. It is demonstrated that this technique can be readily incorporated into the present Nb3Sn conductor manufacturing technology. Powder-in-tube (PIT) strands with fine subelements (~25 µm) based on this technique were successfully fabricated, and proper heat treatments for oxygen transfer were explored. Future work for producing strands ready for applications is proposed.

  18. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray fluorescence (XRF) techniques. These cigarettes were found to contain the elements Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements, with concentrations of more than 1% by weight, were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements with concentrations below 0.1% by weight were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb, and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  19. Combining the Use of Progressive Writing Techniques and Popular Movies in Introductory Psychology.

    Science.gov (United States)

    Hemenover, Scott H.; Caster, Jeffrey B.; Mizumoto, Ayumi

    1999-01-01

    Examines whether the use of progressive writing for a psychology paper assignment affects students' writing and motivation when used to discuss course material illustrated in popular movies. Reveals that the students felt their writing improved and 44% of the students earned 90% of the overall points; student motivation was lower than expected.…

  20. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  1. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    features in the speech process: (i) the resonant structure of the vocal-tract transfer function, i.e., formant analysis, (ii) the glottal wave, (iii) the fundamental frequency or pitch of the sound. During the production of speech, the configuration of the articulators (the vocal tract, tongue, teeth, lips, etc.) changes from one sound to.

  2. Microstructure analysis using SAXS/USAXS techniques

    International Nuclear Information System (INIS)

    Okuda, Hiroshi; Ochiai, Shojiro

    2010-01-01

    Introduction to small-angle X-ray scattering (SAXS) and ultra small-angle X-ray scattering (USAXS) is presented. SAXS is useful for microstructure analysis of age-hardenable alloys containing precipitates with sizes from several to several tens of nanometers. On the other hand, USAXS is appropriate for examining much larger microstructural heterogeneities, such as inclusions, voids, and large precipitates whose size is typically around one micrometer. Combining these two scattering methods, and sometimes also diffraction, it is possible to assess the hierarchical structure of samples in-situ and nondestructively, ranging from phase identification and quantitative analysis of precipitation structures up to their mesoscopic aggregates, large voids and inclusions. From a technical viewpoint, USAXS requires some specific instrumentation for its optics. However, once a reasonable measurement is made, the intensity analysis is the same as that for conventional SAXS. In the present article, a short introduction to conventional SAXS is presented, and the analysis is then applied to USAXS data obtained for well-defined oxide particles whose average diameters are expected to be about 0.3 micrometers. (author)

  3. Diffusion tensor analysis of corpus callosum in progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Shoichi; Makino, Takahiro; Shirai, Wakako; Hattori, Takamichi [Department of Neurology, Graduate School of Medicine, Chiba University (Japan)

    2008-11-15

    Progressive supranuclear palsy (PSP) is a neurodegenerative disease featuring parkinsonism, supranuclear ophthalmoplegia, dysphagia, and frontal lobe dysfunction. The corpus callosum, which consists of many commissure fibers, probably reflects cerebral cortical function. Several previous reports showed atrophy or diffusion abnormalities of the anterior corpus callosum in PSP patients, but the partitioning method used in those studies was based on data obtained in nonhuman primates. In this study, we performed a diffusion tensor analysis using a new partitioning method for the human corpus callosum. Seven consecutive patients with PSP were compared with 29 age-matched patients with Parkinson's Disease (PD) and 19 age-matched healthy control subjects. All subjects underwent diffusion tensor magnetic resonance imaging, and the corpus callosum was partitioned into five areas on the mid-sagittal plane according to a recently established topography of the human corpus callosum (CC1-prefrontal area, CC2-premotor and supplementary motor area, CC3-motor area, CC4-sensory area, CC5-parietal, temporal, and occipital area). Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) were measured in each area and differences between groups were analyzed. In the PSP group, FA values were significantly decreased in CC1 and CC2, and ADC values were significantly increased in CC1 and CC2. Receiver operating characteristic analysis showed excellent reliability of FA and ADC analyses of CC1 for differentiating PSP from PD. The anterior corpus callosum corresponding to the prefrontal, premotor, and supplementary motor cortices is affected in PSP patients. This analysis can be an additional test for further confirmation of the diagnosis of PSP.

  4. Diffusion tensor analysis of corpus callosum in progressive supranuclear palsy

    International Nuclear Information System (INIS)

    Ito, Shoichi; Makino, Takahiro; Shirai, Wakako; Hattori, Takamichi

    2008-01-01

    Progressive supranuclear palsy (PSP) is a neurodegenerative disease featuring parkinsonism, supranuclear ophthalmoplegia, dysphagia, and frontal lobe dysfunction. The corpus callosum, which consists of many commissure fibers, probably reflects cerebral cortical function. Several previous reports showed atrophy or diffusion abnormalities of the anterior corpus callosum in PSP patients, but the partitioning method used in those studies was based on data obtained in nonhuman primates. In this study, we performed a diffusion tensor analysis using a new partitioning method for the human corpus callosum. Seven consecutive patients with PSP were compared with 29 age-matched patients with Parkinson's Disease (PD) and 19 age-matched healthy control subjects. All subjects underwent diffusion tensor magnetic resonance imaging, and the corpus callosum was partitioned into five areas on the mid-sagittal plane according to a recently established topography of the human corpus callosum (CC1-prefrontal area, CC2-premotor and supplementary motor area, CC3-motor area, CC4-sensory area, CC5-parietal, temporal, and occipital area). Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) were measured in each area and differences between groups were analyzed. In the PSP group, FA values were significantly decreased in CC1 and CC2, and ADC values were significantly increased in CC1 and CC2. Receiver operating characteristic analysis showed excellent reliability of FA and ADC analyses of CC1 for differentiating PSP from PD. The anterior corpus callosum corresponding to the prefrontal, premotor, and supplementary motor cortices is affected in PSP patients. This analysis can be an additional test for further confirmation of the diagnosis of PSP
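
    The receiver operating characteristic analysis used above comes down to computing an AUC for a scalar discriminator (e.g., FA in CC1). A sketch via the rank-sum identity, with hypothetical scores:

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity.

    AUC equals the probability that a randomly chosen positive case
    scores higher than a randomly chosen negative one (ties count half).
    """
    pos = np.asarray(scores_pos, dtype=float)[:, None]
    neg = np.asarray(scores_neg, dtype=float)[None, :]
    return (pos > neg).mean() + 0.5 * (pos == neg).mean()

auc = roc_auc([0.9, 0.8, 0.7], [0.4, 0.6])  # perfect separation -> 1.0
```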

  5. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  6. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  7. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  8. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  9. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  10. New analytical techniques for cuticle chemical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schulten, H.R. [Fachhochschule Fresenius, Dept. of Trace Analysis, Wiesbaden (Germany)

    1994-12-31

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FI) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  11. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FI) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  12. Work in progress. Flashing tomosynthesis: a tomographic technique for quantitative coronary angiography.

    Science.gov (United States)

    Woelke, H; Hanrath, P; Schlueter, M; Bleifeld, W; Klotz, E; Weiss, H; Waller, D; von Weltzien, J

    1982-11-01

    Flashing tomosynthesis, a procedure that consists of a recording step and a reconstruction step, facilitates the tomographic imaging of coronary arteries. In a comparative study 10 postmortem coronary arteriograms were examined with the 35-mm cine technique and with flashing tomosynthesis. The degrees of stenosis found with both of these techniques were compared with morphometrically obtained values. A higher correlation coefficient existed for the degrees of stenosis obtained with tomosynthesis and morphometry (r = 0.92, p < 0.001, SEE = 9%) than for those obtained with cine technique and morphometry (r = 0.82, p < 0.001, SEE = 16%). The technique has also been successfully carried out in 5 patients with coronary artery disease.

  13. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  14. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions

  15. Wavelet transform techniques and signal analysis

    International Nuclear Information System (INIS)

    Perez, R.B.; Mattingly, J.K.; Tennessee Univ., Knoxville, TN; Perez, J.S.

    1993-01-01

    Traditionally, the most widely used signal analysis tool is the Fourier transform which, by producing power spectral densities (PSDs), allows time dependent signals to be studied in the frequency domain. However, the Fourier transform is global -- it extends over the entire time domain -- which makes it ill-suited to study nonstationary signals which exhibit local temporal changes in the signal's frequency content. To analyze nonstationary signals, the family of transforms commonly designated as short-time Fourier transforms (STFTs), capable of identifying temporally localized changes in the signal's frequency content, were developed by employing window functions to isolate temporal regions of the signal. For example, the Gabor STFT uses a Gaussian window. However, the applicability of STFTs is limited by various inadequacies. The wavelet transform (WT), recently developed by Grossman and Morlet and explored in depth by Daubechies and Mallat, remedies the inadequacies of STFTs. Like the Fourier transform, the WT can be implemented as a discrete transform (DWT) or as a continuous (integral) transform (CWT). This paper briefly illustrates some of the potential applications of the wavelet transform algorithms to signal analysis
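    The discrete transform (DWT) mentioned above can be illustrated with the simplest wavelet, the Haar basis. The sketch below is a generic textbook construction, not code from the paper: each coefficient is computed from only two neighbouring samples, which is exactly the temporal locality that the global Fourier transform lacks.

    ```python
    def haar_dwt(signal):
        """One-level Haar DWT: returns (approximation, detail) coefficients."""
        assert len(signal) % 2 == 0, "signal length must be even"
        s = 2 ** -0.5  # orthonormal scaling factor 1/sqrt(2)
        approx = [(a + b) * s for a, b in zip(signal[0::2], signal[1::2])]
        detail = [(a - b) * s for a, b in zip(signal[0::2], signal[1::2])]
        return approx, detail

    def haar_idwt(approx, detail):
        """Inverse one-level Haar DWT, reconstructing the original signal."""
        s = 2 ** -0.5
        out = []
        for a, d in zip(approx, detail):
            out.extend([(a + d) * s, (a - d) * s])
        return out
    ```

    Because the Haar basis is orthonormal, the transform is exactly invertible and preserves signal energy; multi-level decompositions simply re-apply `haar_dwt` to the approximation coefficients.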

  16. Recent progress on HYSPEC, and its polarization analysis capabilities

    Directory of Open Access Journals (Sweden)

    Winn Barry

    2015-01-01

    Full Text Available HYSPEC is a high-intensity, direct-geometry time-of-flight spectrometer at the Spallation Neutron Source, optimized for measurement of excitations in small single-crystal specimens with optional polarization analysis capabilities. The incident neutron beam is monochromated using a Fermi chopper with short, straight blades, and is then vertically focused by Bragg scattering onto the sample position by either a highly oriented pyrolytic graphite (unpolarized) or a Heusler (polarized) crystal array. Neutrons are detected by a bank of 3He tubes that can be positioned over a wide range of scattering angles about the sample axis. HYSPEC entered the user program in February 2013 for unpolarized experiments, and is already experiencing a vibrant research program. Polarization analysis will be accomplished by using the Heusler crystal array to polarize the incident beam, and either a 3He spin filter or a supermirror wide-angle polarization analyser to analyse the scattered beam. The 3He spin filter employs the spin-exchange optical pumping technique. A 60∘ wide-angle 3He cell that matches the detector coverage will be used for polarization analysis. The polarized gas in the post-sample wide-angle cell is designed to be periodically and automatically refreshed with an adjustable pressure of polarized gas, optically pumped in a separate cell and then transferred to the wide-angle cell. The supermirror analyser has 960 supermirror polarizers distributed over 60∘, and has been characterized at the Swiss Spallation Neutron Source. The current status of the instrument and the development of its polarization analysis capabilities are presented.

  17. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes

    Science.gov (United States)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    A tool for portfolio analysis of NASA's aeronautics research progress toward planned community strategic Outcomes is described. The strategic planning process for determining the community Outcomes is also briefly outlined. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecasts are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  18. Multivariate techniques of analysis for ToF-E recoil spectrometry data

    Energy Technology Data Exchange (ETDEWEB)

    Whitlow, H.J.; Bouanani, M.E.; Persson, L.; Hult, M.; Jonsson, P.; Johnston, P.N. [Lund Institute of Technology, Solvegatan, (Sweden), Department of Nuclear Physics; Andersson, M. [Uppsala Univ. (Sweden). Dept. of Organic Chemistry; Ostling, M.; Zaring, C. [Royal institute of Technology, Electrum, Kista, (Sweden), Department of Electronics; Johnston, P.N.; Bubb, I.F.; Walker, B.R.; Stannard, W.B. [Royal Melbourne Inst. of Tech., VIC (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Multivariate statistical methods are being developed by the Australian-Swedish Recoil Spectrometry Collaboration for quantitative analysis of the wealth of information in Time of Flight (ToF) and energy dispersive Recoil Spectrometry. An overview is presented of progress made in the use of multivariate techniques for energy calibration, separation of mass-overlapped signals and simulation of ToF-E data. 6 refs., 5 figs.

  19. Techniques in micromagnetic simulation and analysis

    Science.gov (United States)

    Kumar, D.; Adeyeye, A. O.

    2017-08-01

    Advances in nanofabrication now allow us to manipulate magnetic material at micro- and nanoscales. As the steps of design, modelling and simulation typically precede that of fabrication, these improvements have also granted a significant boost to the methods of micromagnetic simulations (MSs) and analyses. The increased availability of massive computational resources has been another major contributing factor. Magnetization dynamics at micro- and nanoscale is described by the Landau-Lifshitz-Gilbert (LLG) equation, which is an ordinary differential equation (ODE) in time. Several finite difference method (FDM) and finite element method (FEM) based LLG solvers are now widely used to solve different kinds of micromagnetic problems. In this review, we present a few patterns in the ways MSs are being used in the pursuit of new physics. An important objective of this review is to allow one to make a well-informed decision on the details of simulation and analysis procedures needed to accomplish a given task using computational micromagnetics. We also examine the effect of different simulation parameters to underscore and extend some best practices. Lastly, we examine different methods of micromagnetic analyses which are used to process simulation results in order to extract physically meaningful and valuable information.
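    The LLG equation named in this record can be integrated for a single macrospin in a few lines. The following is a hedged illustration only (unit-free parameters, explicit Euler stepping with renormalization, all values invented), not code from the review: it uses the Landau-Lifshitz form dm/dt = -γ/(1+α²) [m×H + α m×(m×H)].

    ```python
    def cross(a, b):
        """Cross product of two 3-vectors."""
        return [a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0]]

    def llg_step(m, H, gamma, alpha, dt):
        """One explicit Euler step of the LLG equation (Landau-Lifshitz form)."""
        mxH = cross(m, H)
        mxmxH = cross(m, mxH)
        pref = -gamma / (1.0 + alpha * alpha)
        dm = [pref * (mxH[i] + alpha * mxmxH[i]) for i in range(3)]
        m = [m[i] + dt * dm[i] for i in range(3)]
        norm = sum(c * c for c in m) ** 0.5  # renormalize: |m| must stay 1
        return [c / norm for c in m]

    # Damped precession: a spin tilted away from a +z field relaxes toward +z.
    m = [1.0, 0.0, 0.1]
    H = [0.0, 0.0, 1.0]
    for _ in range(20000):
        m = llg_step(m, H, gamma=1.0, alpha=0.1, dt=0.01)
    ```

    Production FDM/FEM solvers integrate this same ODE at every mesh cell, with an effective field that adds exchange, anisotropy and demagnetizing contributions to the applied field used here.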

  20. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not sufficient to allow satisfactory analysis. Complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase fluorescence intensities. As a preliminary experiment for the separation of individual rare earth elements and uranium, the distribution coefficients (%S here) were obtained on Dowex 50 W against HCl concentration by a batch method. These %S data are utilized to obtain elution curves. The %S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl- on Dowex 50 W is examined as a function of HCl concentration and found to be decreasing while the %S of the rare earths increases. It is interpreted that Cl- and rare earth ions move into the resin phase separately and that the charges and charge densities of these ions are responsible for the different %S curves. Dehydration appears to play an important role in the upturn of the %S curves at higher HCl concentrations

  1. Securing safe and informative thoracic CT examinations—Progress of radiation dose reduction techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, Takeshi, E-mail: tkubo@kuhp.kyoto-u.ac.jp [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan); Ohno, Yoshiharu [Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017 (Japan); Advanced Biomedical Imaging Research Center, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017 (Japan); Seo, Joon Beom [Department of Radiology, University of Ulsan College of Medicine, Asan Medical Center, 88 Olympic-ro 43-gil, Songpa-gu, Seoul 05505 (Korea, Republic of); Yamashiro, Tsuneo [Department of Radiology, Graduate School of Medical Science, University of the Ryukyus, 207 Uehara, Nishinara, Okinawa 903-0215 (Japan); Kalender, Willi A. [Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nürnberg, Henkestr. 91, 91052 Erlangen (Germany); Lee, Chang Hyun [Department of Radiology, Seoul National University Hospital, 28 Yeongeon-dong, Jongno-gu, Seoul (Korea, Republic of); Lynch, David A. [Department of Radiology, National Jewish Health, 1400 Jackson St, A330 Denver, Colorado 80206 (United States); Kauczor, Hans-Ulrich [Diagnostic and Interventional Radiology, University Hospital Heidelberg, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Translational Lung Research Center Heidelberg (TLRC), Member of the German Center for Lung Research (DZL), Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Hatabu, Hiroto, E-mail: hhatabu@partners.org [Center for Pulmonary Functional Imaging, Department of Radiology, Brigham and Women's Hospital, 75 Francis Street, Boston, MA 02115 (United States)

    2017-01-15

    Highlights: • Various techniques have led to substantial radiation dose reduction of chest CT. • Automatic modulation of tube current has been shown to reduce radiation dose. • Iterative reconstruction makes significant radiation dose reduction possible. • Processing time currently limits full iterative reconstruction. • Validation of diagnostic accuracy is desirable for routine use of low-dose protocols. - Abstract: The increase in radiation exposure from CT examinations has prompted investigation of various dose-reduction techniques. Significant dose reduction has been achieved, and the radiation exposure of thoracic CT is expected to reach a level equivalent to that of several chest X-ray examinations. With more scanners with advanced dose-reduction capability deployed, knowledge of radiation dose reduction methods has become essential to clinical practice as well as academic research. This article reviews the history of dose reduction techniques, ongoing changes brought by newer technologies, and areas of further investigation.

  2. 48 CFR 815.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Proposal analysis techniques. 815.404-1 Section 815.404-1 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... techniques. (a) Contracting officers are responsible for the technical and administrative sufficiency of the...

  3. Canalplasty: the technique and the analysis of its results

    NARCIS (Netherlands)

    van Spronsen, Erik; Ebbens, Fenna A.; Mirck, Peter G. B.; van Wettum, Cathelijne H. M.; van der Baan, Sieberen

    2013-01-01

    To describe the technique for canalplasty as performed in the Academic Medical Center, Amsterdam, the Netherlands and to present the results of this technique. Retrospective chart analysis. Charts of patients who underwent a canalplasty procedure between 2001 and 2010 were reviewed for indication

  4. Progress in emerging techniques for characterization of immobilized viable whole-cell biocatalysts

    Czech Academy of Sciences Publication Activity Database

    Bučko, M.; Vikartovská, A.; Schenkmayerová, A.; Tkáč, J.; Filip, J.; Chorvát Jr., D.; Neděla, Vilém; Ansorge-Schumacher, M.B.; Gemeiner, P.

    2017-01-01

    Roč. 71, č. 11 (2017), s. 2309-2324 ISSN 0366-6352 Institutional support: RVO:68081731 Keywords : bioelectrocatalysis * imaging techniques * immobilized whole-cell biocatalyst * multienzyme cascade reactions * online kinetics Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering OBOR OECD: Bioprocessing technologies (industrial processes relying on biological agents to drive the process) biocatalysis, fermentation Impact factor: 1.258, year: 2016

  5. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  6. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  7. New analysis technique for K-edge densitometry spectra

    International Nuclear Information System (INIS)

    Hsue, Sin-Tao; Collins, M.L.

    1995-01-01

    A method for simulating absorption edge densitometry has been developed. The program enables one to simulate spectra containing any combination of special nuclear materials (SNM) in solution. The method has been validated with an analysis method using a single SNM in solution or a combination of two types of SNM whose atomic numbers differ by two. A new analysis technique for mixed solutions has been developed. This new technique has broader applications and eliminates the need for bias correction
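    The relation underlying K-edge densitometry can be sketched compactly: transmissions measured just below and just above an element's K absorption edge give its areal density via rho_t = ln(T_below / T_above) / (mu_above - mu_below), where mu is the mass attenuation coefficient on each side of the edge. The sketch below is a generic illustration, not the record's program, and the mu values are invented placeholders, not tabulated data.

    ```python
    import math

    def areal_density(T_below, T_above, mu_below, mu_above):
        """Areal density (g/cm^2) of the edge element from transmissions
        measured just below and just above its K-edge."""
        return math.log(T_below / T_above) / (mu_above - mu_below)

    # Round trip with made-up coefficients: simulate transmissions for a known
    # areal density via the Beer-Lambert law, then recover it from the edge jump.
    mu_below, mu_above = 1.0, 4.0  # cm^2/g, illustrative only
    rho_t = 0.25                   # g/cm^2, the "unknown" to be recovered
    T_below = math.exp(-mu_below * rho_t)
    T_above = math.exp(-mu_above * rho_t)
    recovered = areal_density(T_below, T_above, mu_below, mu_above)
    ```

    Because only the attenuation *jump* at the edge enters the formula, matrix absorbers without an edge at that energy cancel out, which is what makes the measurement element-specific.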

  8. No evidence of real progress in treatment of acute pain, 1993–2012: scientometric analysis

    Directory of Open Access Journals (Sweden)

    Correll DJ

    2014-04-01

    Full Text Available Darin J Correll, Kamen V Vlassakov, Igor Kissin Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA Abstract: Over the past 2 decades, many new techniques and drugs for the treatment of acute pain have achieved widespread use. The main aim of this study was to assess the progress in their implementation using scientometric analysis. The following scientometric indices were used: 1) popularity index, representing the share of articles on a specific technique (or a drug) relative to all articles in the field of acute pain; 2) index of change, representing the degree of growth in publications on a topic compared to the previous period; and 3) index of expectations, representing the ratio of the number of articles on a topic in the top 20 journals relative to the number of articles in all (>5,000) biomedical journals covered by PubMed. Publications on specific topics (ten techniques and 21 drugs) were assessed during four time periods (1993–1997, 1998–2002, 2003–2007, and 2008–2012). In addition, to determine whether the status of routine acute pain management has improved over the past 20 years, we analyzed surveys designed to be representative of the national population that reflected direct responses of patients reporting pain scores. By the 2008–2012 period, popularity index had reached a substantial level (≥5%) only with techniques or drugs that were introduced 30–50 years ago or more (epidural analgesia, patient-controlled analgesia, nerve blocks, epidural analgesia for labor or delivery, bupivacaine, and acetaminophen). In 2008–2012, promising (although modest) changes of index of change and index of expectations were found only with dexamethasone. Six national surveys conducted for the past 20 years demonstrated an unacceptably high percentage of patients experiencing moderate or severe pain with not even a trend toward outcome improvement. Thus
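    The three scientometric indices defined in this record reduce to simple ratios. The sketch below spells them out; the counts in the usage are invented placeholders, not data from the study.

    ```python
    def popularity_index(topic_articles, all_field_articles):
        """Share of articles on a topic among all acute-pain articles (%)."""
        return 100.0 * topic_articles / all_field_articles

    def index_of_change(current_period, previous_period):
        """Relative growth in publications versus the previous period (%)."""
        return 100.0 * (current_period - previous_period) / previous_period

    def index_of_expectations(top20_articles, all_journal_articles):
        """Share of a topic's articles appearing in the top 20 journals (%)."""
        return 100.0 * top20_articles / all_journal_articles

    # Illustrative counts only: 50 of 1,000 field articles mention a technique,
    # up from 40 in the previous period, 3 of 60 appearing in top-20 journals.
    pop = popularity_index(50, 1000)      # 5.0 %, at the "substantial" threshold
    change = index_of_change(50, 40)      # 25.0 % growth
    expect = index_of_expectations(3, 60) # 5.0 %
    ```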

  9. Application of accident progression event tree technology to the Savannah River Site Defense Waste Processing Facility SAR analysis

    International Nuclear Information System (INIS)

    Brandyberry, M.D.; Baker, W.H.; Wittman, R.S.; Amos, C.N.

    1993-01-01

    The Accident Analysis in the Safety Analysis Report (SAR) for the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) has recently undergone an upgrade. Non-reactor SARs at SRS (and other Department of Energy (DOE) sites) use probabilistic techniques to assess the frequency of accidents at their facilities. This paper describes the application of an extension of the Accident Progression Event Tree (APET) approach to accidents at the SRS DWPF. The APET technique allows an integrated model of the facility risk to be developed, where previous probabilistic accident analyses have been limited to the quantification of the frequency and consequences of individual accident scenarios treated independently. Use of an APET allows a more structured approach, incorporating both the treatment of initiators that are common to more than one accident, and of accident progression at the facility

  10. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, and improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, the main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, and pole and zero identification. The preliminary general scheme of a digital MCA is discussed, as well as other important techniques for its engineering design. All these lay the foundation for developing homemade digital nuclear spectrometers. (authors)
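    Trapezoidal shaping, the first algorithm named in this record, can be sketched in a few lines. This is a hedged, simplified illustration (assumed parameters, step-like input, no pole-zero term for exponential decay), not the authors' MATLAB code: a double difference d[n] = x[n] - x[n-k] - x[n-l] + x[n-k-l], followed by a running sum, turns a step pulse into a trapezoid with rise time k and flat top of length l - k.

    ```python
    def trapezoidal_shaper(x, k, l):
        """Shape a step-like pulse into a trapezoid (requires l > k)."""
        def at(n):
            # Treat samples outside the record as zero baseline.
            return x[n] if 0 <= n < len(x) else 0.0
        out, acc = [], 0.0
        for n in range(len(x)):
            d = at(n) - at(n - k) - at(n - l) + at(n - k - l)
            acc += d          # running sum of the double difference
            out.append(acc)
        return out

    # A unit step becomes a trapezoid whose flat-top amplitude equals k.
    step = [0.0] * 4 + [1.0] * 40
    shaped = trapezoidal_shaper(step, k=5, l=12)
    ```

    The flat top is what makes the filter robust to ballistic deficit: the peak amplitude can be sampled anywhere on the top, relaxing timing requirements compared with sharp (e.g. CR-RC) shaping.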

  11. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  12. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  13. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
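    The encryption scheme analysed in this record can be modelled in one dimension with a naive DFT. The toy sketch below is illustrative only (not the authors' system, which is two-dimensional and optical): the input is multiplied by a random phase mask, Fourier transformed, multiplied by a second random phase mask, and inverse transformed; decryption reverses the steps with the conjugate keys.

    ```python
    import cmath
    import random

    def dft(x, inverse=False):
        """Naive O(N^2) discrete Fourier transform (fine for tiny examples)."""
        N = len(x)
        sign = 1 if inverse else -1
        out = [sum(x[n] * cmath.exp(sign * 2j * cmath.pi * k * n / N)
                   for n in range(N)) for k in range(N)]
        return [v / N for v in out] if inverse else out

    def phase_mask(N, rng):
        """Random unit-modulus phase mask: one key of the key-space."""
        return [cmath.exp(2j * cmath.pi * rng.random()) for _ in range(N)]

    def encrypt(f, m1, m2):
        spectrum = dft([a * b for a, b in zip(f, m1)])
        return dft([a * b for a, b in zip(spectrum, m2)], inverse=True)

    def decrypt(psi, m1, m2):
        spectrum = dft(psi)
        masked = [a * b.conjugate() for a, b in zip(spectrum, m2)]
        field = dft(masked, inverse=True)
        return [a * b.conjugate() for a, b in zip(field, m1)]

    rng = random.Random(0)
    m1, m2 = phase_mask(8, rng), phase_mask(8, rng)
    image = [complex(v) for v in range(8)]
    cipher = encrypt(image, m1, m2)
    plain = decrypt(cipher, m1, m2)   # recovers `image` with the correct keys
    ```

    A brute-force attack in this model means searching over the continuous phases of `m2`, which is the key-space whose error distribution the paper characterizes.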

  14. Visual and statistical analysis of {sup 18}F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA was high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. In cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)

  15. Visual and statistical analysis of 18F-FDG PET in primary progressive aphasia

    International Nuclear Information System (INIS)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge; Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis

    2015-01-01

    Diagnosing primary progressive aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA was high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. In cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)

  16. A brain impact stress analysis using advanced discretization meshless techniques.

    Science.gov (United States)

    Marques, Marco; Belinha, Jorge; Dinis, Lúcia Maria Js; Natal Jorge, Renato

    2018-03-01

    This work compares the mechanical behaviour of a brain under impact predicted by an alternative meshless numerical technique with that predicted by finite element analysis. A discrete geometrical model of a brain was constructed from medical images. This approach yields a discretization with realistic geometry and allows the mechanical properties to be defined locally according to the colour scale of the medical images. After the discrete geometrical model of the brain was defined, the essential and natural boundary conditions were imposed to reproduce a sudden impact force. The analysis was performed using both the finite element method and the radial point interpolation method, an advanced discretization technique, and the results of the two techniques are compared. It was verified that, compared with finite element analysis, meshless methods possess a higher convergence rate and are capable of producing smoother variable fields.
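
    The radial point interpolation method mentioned above builds shape functions from radial basis functions centred on scattered nodes. The following is a minimal one-dimensional sketch of radial-basis interpolation; the multiquadric basis and the synthetic nodes are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def rbf_interpolant(nodes, values, c=1.0):
    """Build a 1-D radial point interpolant using multiquadric RBFs
    centred at the nodes (c is a shape parameter)."""
    nodes = np.asarray(nodes, dtype=float)
    # Moment matrix R[i, j] = phi(|x_i - x_j|)
    r = np.abs(nodes[:, None] - nodes[None, :])
    R = np.sqrt(r**2 + c**2)          # multiquadric basis
    coeffs = np.linalg.solve(R, np.asarray(values, dtype=float))

    def u(x):
        phi = np.sqrt((x - nodes) ** 2 + c**2)
        return float(phi @ coeffs)
    return u

# Interpolate u(x) = x^2 from five scattered nodes (hypothetical data).
xs = [0.0, 0.3, 0.5, 0.8, 1.0]
u = rbf_interpolant(xs, [x * x for x in xs])
print(u(0.5))   # reproduces the nodal value (up to round-off)
```

    Because the interpolant passes exactly through the nodes, nodal values can carry locally assigned material properties, which is the feature the abstract exploits for image-based discretization.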

  17. Glaucomatous progression in the retinal nerve fibre and retinal ganglion cell-inner plexiform layers determined using optical coherence tomography-guided progression analysis.

    Science.gov (United States)

    Hwang, Young Hoon; Kim, Yeji; Chung, Jae Keun; Lee, Kwan Bok

    2018-02-01

    To investigate the characteristics of glaucomatous progression in circumpapillary retinal nerve fibre layer (RNFL) and macular retinal ganglion cell-inner plexiform layer (GCIPL) determined using optical coherence tomography-guided progression analysis (OCT-GPA). Serial OCT images of 527 glaucomatous eyes with more than four OCT tests were screened. Among them, 106 (20.1 per cent) eyes with progression in either RNFL or GCIPL determined using OCT-GPA were included. Based on the agreement of progression detection between RNFL and GCIPL, the eyes were classified into the 'RNFL progression earlier group', 'GCIPL progression earlier group', or 'simultaneous progression group'. The type of progression was classified as diffuse, localised or mixed. Among the 106 eyes with progression, 100 (94.3 per cent) showed RNFL progression and 83 (78.3 per cent) showed GCIPL progression. Fifty-four (50.9 per cent), 13 (12.3 per cent), and 39 (36.8 per cent) eyes were classified into the RNFL progression earlier group, GCIPL progression earlier group, and simultaneous progression group, respectively. Diffuse-type progression was found in three (three per cent) eyes with RNFL progression and 32 (38.6 per cent) eyes with GCIPL progression. The most common location of progression was the 7 o'clock sector (42.0 per cent) in the RNFL and the inferotemporal sector (39.8 per cent) in the GCIPL. The most common characteristic of RNFL and GCIPL progression determined using OCT-GPA was localised thinning in the inferotemporal area. Progression was more frequently found in the RNFL than in the GCIPL, and diffuse-type progression was more frequent in the GCIPL than in the RNFL. © 2018 Optometry Australia.

  18. African violet (Saintpaulia ionantha H. Wendl.): classical breeding and progress in the application of biotechnological techniques

    Directory of Open Access Journals (Sweden)

    Silva Jaime A. Teixeira da

    2017-12-01

    Full Text Available As a result of its domestication, breeding and subsequent commercialization, African violet (Saintpaulia ionantha H. Wendl.) has become the most famous and popular Saintpaulia species. There is interest in producing cultivars that have increased resistance to pests and low temperature, in the introduction of novel horticultural characteristics such as leaf shape, flower colour, size and form, and in improved productivity and enhanced flower duration in planta. In African violet, techniques such as the application of chemical mutagens (ethyl methanesulfonate, N-nitroso-N-methylurea), radiation (gamma (γ) rays, X-rays, carbon ion beams) and colchicine have been successfully applied to induce mutants. Among these techniques, γ radiation and colchicine have been the most commonly applied mutagens. This review offers a short synthesis of the advances made in African violet breeding, including studies on mutation and somaclonal variation caused by physical and chemical factors, as well as transgenic strategies using Agrobacterium-mediated transformation and particle bombardment. In African violet, Agrobacterium-mediated transformation is affected by the Agrobacterium strain, selection marker, and cutting-induced wounding stress. Somaclonal variation, which arises in tissue cultures, can be problematic in maintaining true-to-type clonal material, but may be a useful tool for obtaining variation in flower colour. The only transgenic African violet plants generated to date with horticulturally useful traits are tolerant to boron (heavy metal) stress, or bear a glucanase-chitinase gene.

  19. Progress of new label-free techniques for biosensors: a review.

    Science.gov (United States)

    Sang, Shengbo; Wang, Yajun; Feng, Qiliang; Wei, Ye; Ji, Jianlong; Zhang, Wendong

    2016-01-01

    The detection techniques used in biosensors can be broadly classified into label-based and label-free. Label-based detection relies on the specific properties of labels for detecting a particular target. In contrast, label-free detection is suitable for target molecules that are not labeled or for the screening of analytes that are not easy to tag. More types of label-free biosensors have emerged with developments in biotechnology. The latest techniques in label-free biosensing are discussed, including field-effect transistor-based biosensors (carbon nanotube, graphene and silicon nanowire field-effect transistor biosensors), magnetoelastic biosensors, optical biosensors, surface stress-based biosensors and other types of biosensors based on nanotechnology. The sensing principles, configurations, sensing performance, applications, advantages and restrictions of the different label-free biosensors are considered and discussed in this review. Most concepts included in this survey could certainly be applied to the development of this kind of biosensor in the future.

  20. Analysis of molecular changes during human melanocytic tumor progression.

    NARCIS (Netherlands)

    Wit, N.J. de

    2005-01-01

    Melanoma is one of the most aggressive types of cancer, due to its potency to disseminate early in tumor progression. The incidence is still rising, even though the rate of change has leveled off in the last decade. As melanoma cells are relatively insensitive to classical systemic therapies, like

  1. Progressive failure analysis of fibrous composite materials and structures

    Science.gov (United States)

    Bahei-El-din, Yehia A.

    1990-01-01

    A brief description is given of the modifications implemented in the PAFAC finite element program for the simulation of progressive failure in fibrous composite materials and structures. Details of the memory allocation, input data, and the new subroutines are given. Also, built-in failure criteria for homogeneous and fibrous composite materials are described.

  2. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and contain a variety of information on provenance and age. Analysing ancient ceramics with modern analytical methods is the scientific foundation for studying Chinese porcelain. The functions and applications of nuclear analysis techniques in this context are discussed in light of their properties. (authors)

  3. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  4. Regional environmental analysis and management: New techniques for current problems

    Science.gov (United States)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  5. [TXRF technique and quantitative analysis of mollusc teeth].

    Science.gov (United States)

    Tian, Y; Liu, K; Wu, X; Zheng, S

    1999-06-01

    Total reflection X-ray fluorescence (TXRF) analysis and an instrument with a short path, high efficiency, low power and small volume are briefly presented. The detection limits of the system are at the pg level for Cu and Mo target excitation. Teeth of a marine mollusc were measured quantitatively, and the spectrum and analysis results are given.

  6. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  7. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  8. Assessment of Intralaminar Progressive Damage and Failure Analysis Using an Efficient Evaluation Framework

    Science.gov (United States)

    Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl

    2017-01-01

    Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further exacerbated when attempting to integrate new fiber-reinforced composite materials, due to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry since material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework which uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.

  9. Guidelines for VCCT-Based Interlaminar Fatigue and Progressive Failure Finite Element Analysis

    Science.gov (United States)

    Deobald, Lyle R.; Mabson, Gerald E.; Engelstad, Steve; Prabhakar, M.; Gurvich, Mark; Seneviratne, Waruna; Perera, Shenal; O'Brien, T. Kevin; Murri, Gretchen; Ratcliffe, James; et al.

    2017-01-01

    This document is intended to detail the theoretical basis, equations, references and data that are necessary to enhance the functionality of commercially available Finite Element codes, with the objective of having functionality better suited for the aerospace industry in the area of composite structural analysis. The specific area of focus will be improvements to composite interlaminar fatigue and progressive interlaminar failure. Suggestions are biased towards codes that perform interlaminar Linear Elastic Fracture Mechanics (LEFM) using Virtual Crack Closure Technique (VCCT)-based algorithms [1,2]. All aspects of the science associated with composite interlaminar crack growth are not fully developed and the codes developed to predict this mode of failure must be programmed with sufficient flexibility to accommodate new functional relationships as the science matures.

  10. Kinematic analysis of the fouetté 720° technique in classical ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult its technique. The fouetté at 720° is one of the most difficult types of fouetté, and its execution demands a high level of technique throughout the performer's rotation. Performing this element requires not only good physical condition but also correct technique from the dancer. On the basis of the relevant kinematic theory, this study presents a qualitative analysis and quantitative assessment of fouettés at 720° performed by top Chinese dancers. The analysis was based on stereoscopic imaging and theoretical analysis.

  11. Progress report on reversal and substitute element technique for thread calibration on CMMs

    DEFF Research Database (Denmark)

    Carmignato, Simone; Larsen, Erik; Sobiecki, Rene

    This report is made as a part of the project EASYTRAC, an EU project under the programme Competitive and Sustainable Growth: Contract No. G6RD-CT-2000-00188, coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low uncertainty calibrations on coordinate measuring machines. The work was carried out in collaboration with … - Germany and Tampere University of Technology (TUT) - Finland. The present report describes feasibility and preliminary results of a reversal and substitute element technique application for thread calibration.

  12. Managing Software Project Risks (Analysis Phase) with Proposed Fuzzy Regression Analysis Modelling Techniques with Fuzzy Concepts

    OpenAIRE

    Elzamly, Abdelrafe; Hussin, Burairah

    2014-01-01

    The aim of this paper is to propose new mining techniques with which to study the impact of different risk management techniques and different software risk factors on software analysis development projects. The new mining technique uses fuzzy multiple regression analysis with fuzzy concepts to manage the risks in a software project and to mitigate risk through software process improvement. Top ten software risk factors in the analysis phase and thirty risk management techni...

  13. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  14. Survival trees: an alternative non-parametric multivariate technique for life history analysis.

    Science.gov (United States)

    De Rose, A; Pallara, A

    1997-01-01

    "In this paper an extension of tree-structured methodology to cover censored survival analysis is discussed.... The tree-shaped diagram...can be used to draw meaningful patterns of behaviour throughout the individual life history.... The fundamentals of tree methodology are outlined; [then] an application of the technique to real data from a survey on the progression to marriage among adult women in Italy is illustrated; [and] some comments are presented on the main advantages and problems related to tree-structured methodology for censored survival analysis." (EXCERPT)

  15. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically....... The efficiency of the Random Decrement technique for the estimation of correlation functions is compared to other equivalent methods (FFT, Direct method). It is shown that the Random Decrement technique can be as much as a hundred times faster than other methods. The theory behind the Random Decrement technique...... is expanded to include both a vector formulation that increases speed considerably, and a new method for the prediction of the variance of the estimated Random Decrement functions. The thesis closes with a number of examples of modal analysis of bridges exposed to natural (ambient) load....
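
    A Random Decrement function is estimated by averaging the signal segments that follow a triggering condition. A minimal sketch, assuming a simple level up-crossing trigger and synthetic data (not the thesis's own implementation):

```python
import math, random

def random_decrement(x, trigger_level, length):
    """Estimate the Random Decrement signature of signal x by averaging
    the segments that follow each up-crossing of trigger_level."""
    segments = []
    for i in range(1, len(x) - length):
        # Level up-crossing triggering condition
        if x[i - 1] < trigger_level <= x[i]:
            segments.append(x[i:i + length])
    if not segments:
        raise ValueError("no triggering points found")
    n = len(segments)
    return [sum(seg[k] for seg in segments) / n for k in range(length)]

# Toy usage: a noisy decaying oscillation (hypothetical data).
random.seed(0)
sig = [math.exp(-0.01 * t) * math.cos(0.2 * t) + 0.3 * (random.random() - 0.5)
       for t in range(2000)]
rd = random_decrement(sig, trigger_level=0.3, length=100)
print(len(rd))
```

    Averaging cancels the random part of the response, so the signature approximates a free decay from which modal frequency and damping can be extracted; the vector formulation mentioned in the thesis batches this averaging over many channels at once.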

  16. Thermodynamic Activity-Based Progress Curve Analysis in Enzyme Kinetics.

    Science.gov (United States)

    Pleiss, Jürgen

    2018-03-01

    Macrokinetic Michaelis-Menten models based on thermodynamic activity provide insights into enzyme kinetics because they separate substrate-enzyme from substrate-solvent interactions. Kinetic parameters are estimated from experimental progress curves of enzyme-catalyzed reactions. Three pitfalls are discussed: deviations between thermodynamic and concentration-based models, product effects on the substrate activity coefficient, and product inhibition. Copyright © 2017 Elsevier Ltd. All rights reserved.
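
    Progress-curve fitting rests on the integrated Michaelis-Menten rate law. A minimal concentration-based sketch (deliberately ignoring the activity coefficients that are the paper's focus; all parameters are hypothetical) simulates a progress curve and checks it against the closed-form integrated law:

```python
import math

def progress_curve(S0, Vmax, Km, t_end, dt=1e-3):
    """Integrate dS/dt = -Vmax*S/(Km + S) with a simple Euler scheme."""
    S, t, out = S0, 0.0, [(0.0, S0)]
    while t < t_end:
        S -= dt * Vmax * S / (Km + S)
        t += dt
        out.append((t, S))
    return out

# Hypothetical parameters (concentration-based, e.g. mM and mM/min).
S0, Vmax, Km = 10.0, 1.0, 2.0
curve = progress_curve(S0, Vmax, Km, t_end=5.0)
t, S = curve[-1]

# The integrated rate law Vmax*t = (S0 - S) + Km*ln(S0/S) must hold.
lhs = Vmax * t
rhs = (S0 - S) + Km * math.log(S0 / S)
print(round(lhs, 3), round(rhs, 3))
```

    In practice, Vmax and Km are estimated by fitting simulated curves like this one to measured substrate depletion; the paper's point is that the substrate term should be a thermodynamic activity rather than a concentration.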

  17. Analysis of interventional therapy for progressing stage gastric cancer

    International Nuclear Information System (INIS)

    Zhu Mingde; Zhang Zijing; Ji Hongsheng; Ge Chenlin; Hao Gang; Wei Kongming; Yuan Yuhou; Zhao Xiuping

    2008-01-01

    Objective: To investigate interventional therapy and its curative effect for progressing stage gastric cancer. Methods: Two hundred and twelve patients with progressing stage gastric cancer were treated with arterial perfusion and arterial embolization. Gastric cardia cancer was treated through the left gastric artery and the left inferior phrenic artery or splenic artery. Cancers of the lesser and greater gastric curvature were treated either through the left and right gastric arteries or common hepatic artery, or through the gastroduodenal artery, right gastroomental artery or splenic artery. Gastric antrum cancers were perfused through the gastroduodenal artery or after middle segmental embolization of the right gastroomental artery. Results: One hundred and ninety-three cases who underwent interventional management were followed up. The CR + PR of gastric cardia cancer was 53.13%; gastric body cancer, 44.44%; gastric antrum cancer, 10%; recurrent cancer and remnant gastric cancer, 0. There was no significant difference in outcome between gastric cardia cancer and gastric body cancer (P>0.05), but significant differences were shown both between gastric cardia cancer and gastric antrum cancer, and between gastric body cancer and gastric antrum cancer (P<0.05), with 1-year and 2-year survival rates of 81% and 56%, respectively. Conclusion: The interventional therapeutic effect in progressing stage gastric cancer differs with the site of the lesion in the gastric tissue. The curative effect for gastric cardia cancer and gastric body cancer is better than that for gastric antrum cancer, recurrent cancer and remnant gastric cancer. (authors)

  18. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  19. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  20. Image analysis techniques for the study of turbulent flows

    Science.gov (United States)

    Ferrari, Simone

    In this paper, a brief review of Digital Image Analysis techniques employed in Fluid Mechanics for the study of turbulent flows is given. The focus is particularly on the techniques developed by the research teams the Author worked in, which can be considered relatively "low cost" techniques. Digital Image Analysis techniques have the advantage, when compared to traditional techniques employing physical point probes, of being non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they allow one to obtain two-dimensional or three-dimensional fields of the measured quantity in less time. Traditionally, the disadvantages have been related to the frequency of acquisition, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 kHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can consequently be considered a flexible and powerful tool for measurements on turbulent flows.

  1. Image analysis techniques for the study of turbulent flows

    Directory of Open Access Journals (Sweden)

    Ferrari Simone

    2017-01-01

    Full Text Available In this paper, a brief review of Digital Image Analysis techniques employed in Fluid Mechanics for the study of turbulent flows is given. The focus is particularly on the techniques developed by the research teams the Author worked in, which can be considered relatively “low cost” techniques. Digital Image Analysis techniques have the advantage, when compared to traditional techniques employing physical point probes, of being non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe: consequently, they allow one to obtain two-dimensional or three-dimensional fields of the measured quantity in less time. Traditionally, the disadvantages have been related to the frequency of acquisition, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 kHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can consequently be considered a flexible and powerful tool for measurements on turbulent flows.

  2. Critical analysis of procurement techniques in construction management sectors

    Science.gov (United States)

    Tiwari, Suman Tiwari Suresh; Chan, Shiau Wei; Faraz Mubarak, Muhammad

    2018-04-01

    Over the last three decades, procurement techniques have been one of the highlights of Construction Management (CM) for ventures, administration contracting, venture management, and design and construct. Following the development and use of these techniques, various researchers have explored the criteria for their selection and their performance in terms of time, cost and quality. Nevertheless, there is little account of the relationship between procurement techniques and related emerging issues such as supply chain, sustainability, innovation and technology development, lean construction, constructability, value management, Building Information Modelling (BIM) and e-procurement. Through papers chosen from reputable CM-related academic journals, the specified scopes of these issues are methodically assessed with the objective of exploring the status and trends in procurement-related research. The results of this paper contribute theoretically and practically, helping researchers and industrialists to appreciate the development of procurement techniques.

  3. An integrated technique for the analysis of skin bite marks.

    Science.gov (United States)

    Bernitz, Herman; Owen, Johanna H; van Heerden, Willie F P; Solheim, Tore

    2008-01-01

    The high number of murder, rape, and child abuse cases in South Africa has led to increased numbers of bite mark cases being heard in high courts. Objective analysis to match perpetrators to bite marks at crime scenes must be able to withstand vigorous cross-examination to be of value in conviction of perpetrators. An analysis technique is described in four stages, namely determination of the mark to be a human bite mark, pattern association analysis, metric analysis and comparison with the population data, and illustrated by a real case study. New and accepted techniques are combined to determine the likelihood ratio of guilt expressed as one of a range of conclusions described in the paper. Each stage of the analysis adds to the confirmation (or rejection) of concordance between the dental features present on the victim and the dentition of the suspect. The results illustrate identification to a high degree of certainty.

  4. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
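
    Of the unsupervised techniques listed, principal component analysis is the most readily sketched. A minimal NumPy implementation on synthetic profile data (the shape and structure of real ADCP data are assumptions here, used only for illustration):

```python
import numpy as np

def pca(X, n_components):
    """Principal component analysis via the covariance eigendecomposition.
    Rows of X are observations (e.g. one current profile per time step)."""
    Xc = X - X.mean(axis=0)                     # centre each depth bin
    cov = np.cov(Xc, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)      # returned in ascending order
    order = np.argsort(eigvals)[::-1]           # sort descending
    components = eigvecs[:, order[:n_components]]
    scores = Xc @ components
    explained = eigvals[order[:n_components]] / eigvals.sum()
    return scores, explained

# Synthetic "profiles": 200 time steps x 10 depth bins, dominated by one mode.
rng = np.random.default_rng(1)
mode = np.linspace(1.0, 0.1, 10)                # shear-like vertical structure
amplitude = rng.normal(size=(200, 1))
X = amplitude * mode + 0.05 * rng.normal(size=(200, 10))

scores, explained = pca(X, n_components=2)
print(scores.shape)
```

    The low-dimensional scores produced this way are what a clustering step (e.g. fuzzy c-means or a self-organizing map, as in the abstract) would then partition into current regimes.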

  5. Mass spectrometric analysis of gingival crevicular fluid biomarkers can predict periodontal disease progression.

    Science.gov (United States)

    Ngo, L H; Darby, I B; Veith, P D; Locke, A G; Reynolds, E C

    2013-06-01

    Gingival crevicular fluid has been suggested as a possible source of biomarkers for periodontal disease progression. This paper describes a technique for the analysis of gingival crevicular fluid from individual sites using mass spectrometry. It explores the novel use of mass spectrometry to examine the relationship between the relative amounts of proteins and peptides in gingival crevicular fluid and their relationship with clinical indices and periodontal attachment loss in periodontal maintenance patients. The aim of this paper was to assess whether the mass spectrometric analysis of gingival crevicular fluid may allow for the site-specific prediction of periodontal disease progression. Forty-one periodontal maintenance subjects were followed over 12 mo, with clinical measurements taken at baseline and every 3 mo thereafter. Gingival crevicular fluid was collected from subjects at each visit and was analysed using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) mass spectrometry. Samples were classified based upon pocket depth, modified gingival index (MGI), plaque index and attachment loss, and were analysed within these groups. A genetic algorithm was used to create a model based on pattern analysis to predict sites undergoing attachment loss. Three hundred and eighty-five gingival crevicular fluid samples were analysed. Twenty-five sites under observation in 14 patients exhibited attachment loss of > 2 mm over the 12-mo period. The clinical indices pocket depth, MGI, plaque levels and bleeding on probing served as poor discriminators of gingival crevicular fluid mass spectra. Models generated from the gingival crevicular fluid mass spectra could predict attachment loss at a site with a high specificity (97% recognition capability and 67% cross-validation). Gingival crevicular fluid mass spectra could be used to predict sites with attachment loss. The use of algorithm-generated models based on gingival crevicular fluid mass spectra may

  6. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  7. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text : With their wide scope, particularly in the areas of environment, geology, mining, industry and the life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conferences on Techniques of Nuclear and Conventional Analysis and Applications (TANCA) are part of the national strategy of opening the University and national research centres to their local, national and international environments. This conference aims to: promote nuclear and conventional analytical techniques; contribute to creating synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and their impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  8. Supracapsular glued intraocular lens in progressive subluxated cataracts: Technique to retain an intact vitreous face.

    Science.gov (United States)

    Jacob, Soosan; Narasimhan, Smita; Agarwal, Amar; Mazzotta, Cosimo; Rechichi, Miguel; Agarwal, Athiya

    2017-03-01

    We describe a technique to prevent late intraocular lens (IOL) subluxation and dislocation that can be associated with progressive zonulopathy. Supracapsular glued IOL fixation is done to retain an intact anterior hyaloid face and avoid vitreous disturbance while providing stable long-term IOL fixation. Phacoemulsification is followed by glued IOL implantation above intact anterior and posterior capsules. Sclerotomies are created ab interno in a supracapsular plane under diametrically opposite lamellar scleral flaps without entering the vitreous cavity. Haptics are externalized in the supracapsular plane and tucked into intrascleral tunnels. Intraoperative or postoperative posterior capsulorhexis or capsulotomy and anterior capsule relaxing cuts can prevent capsule phimosis. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  9. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  10. In silico regulatory analysis for exploring human disease progression

    Directory of Open Access Journals (Sweden)

    DeLisi Charles

    2008-06-01

    Full Text Available Abstract Background An important goal in bioinformatics is to unravel the network of transcription factors (TFs and their targets. This is important in the human genome, where many TFs are involved in disease progression. Here, classification methods are applied to identify new targets for 152 transcriptional regulators using publicly-available targets as training examples. Three types of sequence information are used: composition, conservation, and overrepresentation. Results Starting with 8817 TF-target interactions we predict an additional 9333 targets for 152 TFs. Randomized classifiers make few predictions (~2/18660 indicating that our predictions for many TFs are significantly enriched for true targets. An enrichment score is calculated and used to filter new predictions. Two case-studies for the TFs OCT4 and WT1 illustrate the usefulness of our predictions: • Many predicted OCT4 targets fall into the Wnt-pathway. This is consistent with known biology as OCT4 is developmentally related and Wnt pathway plays a role in early development. • Beginning with 15 known targets, 354 predictions are made for WT1. WT1 has a role in formation of Wilms' tumor. Chromosomal regions previously implicated in Wilms' tumor by cytological evidence are statistically enriched in predicted WT1 targets. These findings may shed light on Wilms' tumor progression, suggesting that the tumor progresses either by loss of WT1 or by loss of regions harbouring its targets. • Targets of WT1 are statistically enriched for cancer related functions including metastasis and apoptosis. Among new targets are BAX and PDE4B, which may help mediate the established anti-apoptotic effects of WT1. • Of the thirteen TFs found which co-regulate genes with WT1 (p ≤ 0.02, 8 have been previously implicated in cancer. The regulatory-network for WT1 targets in genomic regions relevant to Wilms' tumor is provided. Conclusion We have assembled a set of features for the targets of

  11. Multidimensional scaling technique for analysis of magnetic storms ...

    Indian Academy of Sciences (India)


    Multidimensional scaling is a powerful technique for analysis of data. The latitudinal dependence of geomagnetic field ..... at best an approximation of the real situation but still it may contain a surprising amount of useful .... (oscillations) is a function of latitude and local time. Close to the dip equator just south of Trivan-.
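
    As a concrete illustration of the method named in the title, the sketch below implements classical (Torgerson) multidimensional scaling in numpy: the squared-distance matrix is double-centred into a Gram matrix, and the leading eigenvectors give a low-dimensional configuration whose distances approximate the originals. The paper's own variant and data are not reproduced here, and the function name `classical_mds` is an assumption.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) multidimensional scaling.

    D : (n, n) matrix of pairwise dissimilarities.
    Returns an (n, k) configuration whose Euclidean distances
    approximate D.
    """
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centred Gram matrix
    w, V = np.linalg.eigh(B)                  # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]             # take the k largest
    L = np.sqrt(np.clip(w[idx], 0, None))
    return V[:, idx] * L                      # coordinates

# Example: three stations on a line embed collinearly in one dimension
pts = np.array([[0.0], [1.0], [3.0]])
D = np.abs(pts - pts.T)
X = classical_mds(D, k=1)
```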

  12. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
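
    For the two-predictor case, the commonality decomposition the article advocates reduces to simple R-squared arithmetic: each predictor's unique contribution is the increase in R-squared it brings over the other predictor alone, and the common component is what remains. A minimal numpy sketch (function names are illustrative, not from the article):

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an ordinary least-squares fit with intercept."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def commonality_two_predictors(x1, x2, y):
    """Unique and common variance components for two predictors."""
    r2_full = r_squared(np.column_stack([x1, x2]), y)
    r2_1 = r_squared(x1[:, None], y)
    r2_2 = r_squared(x2[:, None], y)
    return {
        "unique_x1": r2_full - r2_2,   # what x1 adds beyond x2
        "unique_x2": r2_full - r2_1,   # what x2 adds beyond x1
        "common":    r2_1 + r2_2 - r2_full,
        "total":     r2_full,
    }
```

    By construction the unique and common components sum exactly to the full-model R-squared, which is what makes the decomposition useful alongside beta weights.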

  13. Study and analysis of wavelet based image compression techniques ...

    African Journals Online (AJOL)

    This paper presented comprehensive study with performance analysis of very recent Wavelet transform based image compression techniques. Image compression is one of the necessities for such communication. The goals of image compression are to minimize the storage requirement and communication bandwidth.
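
    The transform-threshold-reconstruct scheme underlying wavelet image compression can be shown self-contained. The sketch below implements a single-level 2D Haar transform by hand (rather than whichever wavelet family or library the reviewed techniques use) and zeroes all but the largest coefficients; all function names are assumptions.

```python
import numpy as np

def _step(x):
    """One Haar analysis step along the last axis (orthonormal)."""
    s = (x[..., ::2] + x[..., 1::2]) / np.sqrt(2)   # approximation
    d = (x[..., ::2] - x[..., 1::2]) / np.sqrt(2)   # detail
    return np.concatenate([s, d], axis=-1)

def _istep(x):
    """Inverse of _step."""
    n = x.shape[-1] // 2
    s, d = x[..., :n], x[..., n:]
    out = np.empty_like(x)
    out[..., ::2] = (s + d) / np.sqrt(2)
    out[..., 1::2] = (s - d) / np.sqrt(2)
    return out

def haar2d(a):
    """One level of a 2D Haar wavelet transform (rows, then columns)."""
    return _step(_step(a).swapaxes(-1, -2)).swapaxes(-1, -2)

def ihaar2d(c):
    """Inverse 2D Haar transform (undo columns, then rows)."""
    return _istep(_istep(c.swapaxes(-1, -2)).swapaxes(-1, -2))

def compress(a, keep=0.1):
    """Keep only the largest `keep` fraction of wavelet coefficients."""
    c = haar2d(a)
    thresh = np.quantile(np.abs(c), 1.0 - keep)
    return ihaar2d(np.where(np.abs(c) >= thresh, c, 0.0))
```

    Storing only the surviving coefficients (plus their positions) is what reduces storage and bandwidth; practical codecs add multi-level decomposition and entropy coding on top.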

  14. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis by automation and easy data handling, in addition to providing the accuracy required by quality control and research applications [it]

  15. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Council. The planned...
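
    The core of the random decrement technique is easy to state: every segment of the measured response that starts when a triggering condition is met is extracted and averaged, so the random part of the response averages out and an estimate of the free decay remains, from which frequencies and damping can be fitted. A minimal sketch with a simple level-crossing trigger (trigger conditions vary between implementations, and this is not the project's code):

```python
import numpy as np

def random_decrement(x, level, seg_len):
    """Random decrement signature of a measured response record.

    Averages all segments of length `seg_len` that start where the
    trigger condition x[i] >= level holds.
    """
    starts = np.flatnonzero(x[:len(x) - seg_len] >= level)
    if starts.size == 0:
        raise ValueError("no trigger points found")
    segments = np.stack([x[i:i + seg_len] for i in starts])
    return segments.mean(axis=0)
```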

  16. Metric Distance Ranking Technique for Fuzzy Critical Path Analysis ...

    African Journals Online (AJOL)

    In this paper, fuzzy critical path analysis of a project network is carried out. Metric distance ranking technique is used to order fuzzy numbers during the forward and backward pass computations to obtain the earliest start, earliest finish, latest start and latest finish times of the project's activities. A numerical example is ...
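
    A hedged sketch of the forward pass with triangular fuzzy durations: fuzzy addition is componentwise, and, for illustration only, the maximum over predecessors is taken by a simple centroid ranking index standing in for the paper's metric distance ranking (which is not reproduced here). All names are assumptions.

```python
# Triangular fuzzy number represented as a tuple (a, b, c) with a <= b <= c.

def fadd(p, q):
    """Componentwise addition of triangular fuzzy numbers."""
    return tuple(x + y for x, y in zip(p, q))

def centroid(p):
    """Simple ranking index; a stand-in for metric distance ranking."""
    return sum(p) / 3.0

def fuzzy_forward_pass(activities, durations, preds):
    """Earliest-start/finish times with triangular fuzzy durations.

    `preds[a]` lists the predecessors of activity a; `activities`
    must be topologically ordered.
    """
    ES, EF = {}, {}
    for a in activities:
        if preds[a]:
            ES[a] = max((EF[p] for p in preds[a]), key=centroid)
        else:
            ES[a] = (0.0, 0.0, 0.0)
        EF[a] = fadd(ES[a], durations[a])
    return ES, EF
```

    A backward pass for latest times follows the same pattern with fuzzy subtraction and a ranked minimum.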

  17. Technologies and microstructures for separation techniques in chemical analysis

    NARCIS (Netherlands)

    Spiering, Vincent L.; Spiering, V.L.; Lammerink, Theodorus S.J.; Jansen, Henricus V.; van den Berg, Albert; Fluitman, J.H.J.

    1996-01-01

    The possibilities for microtechnology in chemical analysis and separation techniques are discussed. The combination of the materials and the dimensions of structures can limit the sample and waste volumes on the one hand, but also increases the performance of the chemical systems. Especially in high

  18. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  19. Evaluation of Analysis Techniques for Fluted-Core Sandwich Cylinders

    Science.gov (United States)

    Lovejoy, Andrew E.; Schultz, Marc R.

    2012-01-01

    Buckling-critical launch-vehicle structures require structural concepts that have high bending stiffness and low mass. Fluted-core, also known as truss-core, sandwich construction is one such concept. In an effort to identify an analysis method appropriate for the preliminary design of fluted-core cylinders, the current paper presents and compares results from several analysis techniques applied to a specific composite fluted-core test article. The analysis techniques are evaluated in terms of their ease of use and for their appropriateness at certain stages throughout a design analysis cycle (DAC). Current analysis techniques that provide accurate determination of the global buckling load are not readily applicable early in the DAC, such as during preliminary design, because they are too costly to run. An analytical approach that neglects transverse-shear deformation is easily applied during preliminary design, but the lack of transverse-shear deformation results in global buckling load predictions that are significantly higher than those from more detailed analysis methods. The current state of the art is either too complex to be applied for preliminary design, or is incapable of the accuracy required to determine global buckling loads for fluted-core cylinders. Therefore, it is necessary to develop an analytical method for calculating global buckling loads of fluted-core cylinders that includes transverse-shear deformations, and that can be easily incorporated in preliminary design.

  20. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  1. Techniques for getting the most from an evaluation: Review of methods and results for attributing progress, non-energy benefits, net to gross, and cost-benefit

    International Nuclear Information System (INIS)

    Skumatz, Lisa A.

    2005-01-01

    As background for several evaluation and attribution projects, the authors conducted research on best practices in a few key areas of evaluation. We focused on techniques used in measuring market progress, enhanced techniques in attributing net energy impacts, and examining omitted program effects, particularly net non-energy benefits. The research involved a detailed literature review, interviews with program managers and evaluators across the US, and refinements of techniques used by the authors in conducting evaluation work. The object of the research was to uncover successful (and unsuccessful) approaches being used for key aspects of evaluation work. The research uncovered areas of tracking that are becoming more commonly used by agencies to assess progress in the market. In addition, detailed research by the authors on a number of impact and attribution evaluations has also led to recommendations on key practices that we believe comprise elements of best practices for assessments of attributable program effects. Specifically, we have identified a number of useful steps to improve the attribution of impacts to program interventions. Information on techniques for attribution/causality work for a number of programs is presented - including market transformation programs that rely on marketing, advertising, training, and mid-stream incentives and work primarily with a network of participating mid-market actors. The project methods and results are presented and include: Theory-based evaluation, indicators, and hypothesis testing; Enhanced measurement of free riders, spillover, and other effects, and attribution of impacts using distribution and ranges of measure and intervention impacts, rather than less reliable point estimates; Attribution of program-induced non-energy benefits; Net to gross, benefit cost analysis, and incorporation of scenario/risk analysis of results; Comparison of net to gross results across program types to explore patterns and

  2. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi Unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time-history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident-reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and, in doing so, assess the applicability of traditional sensitivity analysis techniques.

  3. Extracted image analysis: a technique for deciphering mediated portrayals.

    Science.gov (United States)

    Berg, D H; Coutts, L B

    1995-01-01

    A technique for analyzing print media that we have developed as a consequence of our interest in the portrayal of women in menstrual product advertising is reported. The technique, which we call extracted image analysis, involves a unique application of grounded theory and the concomitant heuristic use of the concept of ideal type (Weber, 1958). It provides a means of heuristically conceptualizing the answer to a variant of the "What is going on here?" question asked in analysis of print communication, that is, "Who is being portrayed/addressed here?" Extracted image analysis involves the use of grounded theory to develop ideal typologies. Because the technique re-constructs the ideal types embedded in a communication, it possesses considerable potential as a means of identifying the profiles of members of identifiable groups held by the producers of the directed messages. In addition, the analysis of such portrayals over time would be particularly well suited to extracted image analysis. A number of other possible applications are also suggested.

  4. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
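
    Technique (i) above, locally weighted regression (LOESS), can be sketched compactly: each prediction refits a weighted straight line to the nearest neighbours of the query point under tricube weights. A minimal one-dimensional illustration (not the authors' implementation; `loess_1d` and its defaults are assumptions):

```python
import numpy as np

def loess_1d(x, y, x0, frac=0.5):
    """Locally weighted linear regression (LOESS) evaluated at x0.

    For each query point, the `frac` nearest observations are fitted
    with a straight line under tricube weights, and the fitted value
    at the query point is returned.
    """
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))
    out = np.empty(len(x0))
    for j, xq in enumerate(x0):
        d = np.abs(x - xq)
        idx = np.argsort(d)[:k]                        # local neighbourhood
        h = d[idx].max()
        w = (1.0 - (d[idx] / (h if h > 0 else 1.0)) ** 3) ** 3  # tricube weights
        A = np.column_stack([np.ones(k), x[idx] - xq])
        sw = np.sqrt(w)                                # weighted least squares
        beta, *_ = np.linalg.lstsq(sw[:, None] * A, sw * y[idx], rcond=None)
        out[j] = beta[0]                               # local fit at xq
    return out
```

    In a sampling-based sensitivity analysis, a smoother like this is fitted to model output versus each input in turn, and the variability explained by the smooth is used as the sensitivity measure.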

  5. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models (using, for example, regression techniques) for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method

  6. Total RNA Sequencing Analysis of DCIS Progressing to Invasive Breast Cancer

    Science.gov (United States)

    2015-09-01

    Assay Kits respectively on the Qubit 2.0 Fluorometer (Life Technologies). The BioRad Experion Automated Electrophoresis System RNA kit was used to... Award number: W81XWH-14-1-0080. Title: Total RNA Sequencing Analysis of DCIS Progressing to Invasive Breast Cancer. Principal investigator: ... Aug 2015.

  7. FINE-GRAINED CELLULAR CONCRETE CREEP ANALYSIS TECHNIQUE WITH CONSIDERATION FOR CARBONATION

    Directory of Open Access Journals (Sweden)

    M. A. Gaziev

    2015-01-01

    Full Text Available The article considers a technique for analysing creep and creep deformation in fine-grained cellular concrete, with consideration for carbonation and for the requirements on repair properties and seismic stability. A procedure for determining the creep of fine-grained cellular concrete is proposed that accounts for its carbonation by atmospheric carbon dioxide. It has been found theoretically and experimentally that the proposed technique yields reproducible results and can be recommended for determining the creep of fine-grained cellular concretes, including repair concretes, taking their carbonation into account.

  8. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in the MENT has been developed and tested. The capabilities of the MENT are demonstrated by applying it to doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of a Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for the MENT and only 0.1% for the Fourier algorithm

  9. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle induced x-ray emission and synchrotron produced x-ray fluorescence, are also briefly discussed

  10. Evaluation of Damping Using Frequency Domain Operational Modal Analysis Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Georgakis, Christos T.; Brincker, Rune

    2015-01-01

    Operational Modal Analysis (OMA) techniques provide in most cases reasonably accurate estimates of structural frequencies and mode shapes. In contrast though, they are known to often produce uncertain structural damping estimates, which is mainly due to inherent random and/or bias errors... separated and closely spaced modes. Finally, the results of the numerical study are presented, in which the error of the structural damping estimates obtained by each OMA technique is shown for a range of damping levels. From this, it is clear that there are notable differences in accuracy between...

  11. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  12. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database corresponding to each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system to diagnose an NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  13. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  14. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database corresponding to each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed, together with the database obtained from the fault simulations, can be used to establish a knowledge-based expert system to diagnose an NPP's abnormal conditions, and the portable reactor noise analysis system may be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  15. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, high cost, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  16. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
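    The static condensation compared in this book can be illustrated with a minimal sketch of Guyan reduction; the 3-DOF spring-chain stiffness matrix and the master/slave partition below are illustrative assumptions, not an example from the book:

```python
import numpy as np

def guyan_reduce(K, master):
    """Guyan (static) condensation of stiffness matrix K to the master DOFs."""
    n = K.shape[0]
    slave = [i for i in range(n) if i not in master]
    Kmm = K[np.ix_(master, master)]
    Kms = K[np.ix_(master, slave)]
    Ksm = K[np.ix_(slave, master)]
    Kss = K[np.ix_(slave, slave)]
    # Reduced stiffness: K_r = Kmm - Kms Kss^{-1} Ksm
    return Kmm - Kms @ np.linalg.solve(Kss, Ksm)

# 3-DOF spring chain (unit springs between neighbours); keep DOFs 0 and 2
K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])
Kr = guyan_reduce(K, [0, 2])  # 2x2 reduced model
```

    The reduced matrix reproduces the static response at the retained DOFs exactly; accuracy degrades as inertia at the condensed DOFs grows, which is what motivates the dynamic and iterative condensation variants the book compares.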

  17. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform under a complex combination of cyclic mechanical and thermal stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires, although such failures generally do not pose a great risk to personnel. The repairs needed to maintain the reliability of these vessels may require extensive interruptions to operation, which in turn considerably impact the profitability of the unit. Therefore the condition, progression and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D laser scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspect the equipment to generate maintenance or inspection recommendations, compare with previous results, and establish baseline data. Until recently, coke drum structural analysis was traditionally performed by analyzing Stress Concentration Factors (SCF) through finite element analysis methods; however, this technique has serious technical and practical limitations. To avoid these shortcomings, a new strain analysis technique, PSI (Plastic Strain Index), was developed. This method, whose failure limit is based on the API 579/ASME FFS standard, represents the state of the art in coke drum bulging severity assessment and has an excellent correlation with failure history. (author)

  18. Analysis of genetic copy number changes in cervical disease progression

    International Nuclear Information System (INIS)

    Policht, Frank A; Song, Minghao; Sitailo, Svetlana; O'Hare, Anna; Ashfaq, Raheela; Muller, Carolyn Y; Morrison, Larry E; King, Walter; Sokolova, Irina A

    2010-01-01

    Cervical dysplasia and tumorigenesis have been linked with numerous chromosomal aberrations. The goal of this study was to evaluate 35 genomic regions associated with cervical disease and to select those which were found to have the highest frequency of aberration for use as probes in fluorescent in-situ hybridization. The frequency of gains and losses using fluorescence in-situ hybridization were assessed in these 35 regions on 30 paraffin-embedded cervical biopsy specimens. Based on this assessment, 6 candidate fluorescently labeled probes (8q24, Xp22, 20q13, 3p14, 3q26, CEP15) were selected for additional testing on a set of 106 cervical biopsy specimens diagnosed as Normal, CIN1, CIN2, CIN3, and SCC. The data were analyzed on the basis of signal mean, % change of signal mean between histological categories, and % positivity. The study revealed that the chromosomal regions with the highest frequency of copy number gains and highest combined sensitivity and specificity in high-grade cervical disease were 8q24 and 3q26. The cytological application of these two probes was then evaluated on 118 ThinPrep™ samples diagnosed as Normal, ASCUS, LSIL, HSIL and Cancer to determine utility as a tool for less invasive screening. Using gains of either 8q24 or 3q26 as a positivity criterion yielded specificity (Normal +LSIL+ASCUS) of 81.0% and sensitivity (HSIL+Cancer) of 92.3% based on a threshold of 4 positive cells. The application of a FISH assay comprised of chromosomal probes 8q24 and 3q26 to cervical cytology specimens confirms the positive correlation between increasing dysplasia and copy gains and shows promise as a marker in cervical disease progression

  19. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important, and difficult, to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis, including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 microm and 100 microm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal, which destroys electrical continuity. Being able to determine the residual stress helps industry predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 microm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide-angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in an SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using EBSD and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  20. Progressive-Ratio Schedules and Applied Behavior Analysis

    Science.gov (United States)

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  1. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  2. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques generally focus on two things: obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group the methods in two: those which assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those which assume dependence between the variables and thus use an estimate of the within-class covariance matrix that also estimates the correlations between variables.
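    The two groups of methods can be sketched with a toy pooled-covariance discriminant score, where a `diagonal` switch toggles the independence assumption; the synthetic two-class data below are an illustrative assumption, not the paper's experiments:

```python
import numpy as np

# Synthetic two-class data: class 0 centred at (0,0), class 1 at (2,2)
rng = np.random.default_rng(0)
X0 = rng.normal([0.0, 0.0], 1.0, size=(50, 2))
X1 = rng.normal([2.0, 2.0], 1.0, size=(50, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 50 + [1] * 50)

def lda_scores(X, y, x_new, diagonal=False):
    """Linear discriminant scores using the pooled within-class covariance."""
    classes = np.unique(y)
    means = np.array([X[y == c].mean(axis=0) for c in classes])
    centered = np.vstack([X[y == c] - means[i] for i, c in enumerate(classes)])
    S = centered.T @ centered / (len(X) - len(classes))
    if diagonal:                      # independence assumption: drop correlations
        S = np.diag(np.diag(S))
    Sinv = np.linalg.inv(S)
    # score_c(x) = x' S^-1 mu_c - 0.5 mu_c' S^-1 mu_c  (equal priors assumed)
    return np.array([x_new @ Sinv @ m - 0.5 * m @ Sinv @ m for m in means])

scores = lda_scores(X, y, np.array([2.0, 2.0]))  # point near class 1 mean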

  3. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  4. Contributions to flow techniques and mass spectrometry in water analysis

    OpenAIRE

    Santos, Inês Carvalho dos

    2015-01-01

    In this thesis, the use of different flow systems was exploited along with the use of different detection techniques for the development of simple, robust, and automated analytical procedures. With the purpose to perform in-line sample handling and pretreatment operations, different separation units were used. The main target for these methods was waters samples. The first procedure was based on a sequential injection analysis (SIA) system for carbon speciation (alkalinity, dis...

  5. Analysis of Indian silver coins by EDXRF technique

    International Nuclear Information System (INIS)

    Tripathy, B.B.; Rautray, T.R.; Das, Satya R.; Das, Manas R.; Vijayan, V.

    2009-01-01

    Some Indian silver coins minted under British rule were analysed by the energy dispersive X-ray fluorescence technique. Eight elements, namely Cr, Fe, Ni, Cu, Zn, As, Ag and Pb, were estimated in this study, which also seems to indicate the fragmentation as well as the impoverishment of the power of the regimes that produced the studied coins. While Cu and Ag were present as major elements, the other elements were found to be present in minor concentrations. (author)

  6. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 {mu}m particle diameter cut off and runs for 24 hours every Sunday and Wednesday using one Gillman 25mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  7. [Progress on Determination and Analysis of Zopiclone in Biological Samples].

    Science.gov (United States)

    Shu, C X; Gong, D; Zhang, L P; Zhao, J X

    2017-12-01

    As a new hypnotic, zopiclone is widely used in clinical treatment. There are many methods for the determination of zopiclone, including spectrophotometry, chromatography and chromatography-mass spectrometry. This paper reviews the different kinds of biological samples associated with zopiclone, extraction and purification methods, and determination and analysis methods, aiming to provide references for the relevant research and practice. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  8. PROGRESS IN SIFT-MS: BREATH ANALYSIS AND OTHER APPLICATIONS

    Czech Academy of Sciences Publication Activity Database

    Španěl, Patrik; Smith, D.

    2011-01-01

    Roč. 30, č. 2 (2011), s. 236-267 ISSN 0277-7037 R&D Projects: GA MPO FT-TA4/124; GA ČR GA202/09/0800; GA ČR GA203/09/0256 Institutional research plan: CEZ:AV0Z40400503 Keywords : SIFT-MS * breath analysis * ion flow tube Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 10.461, year: 2011

  9. Meta-analysis of surgical techniques for preventing parotidectomy sequelae.

    Science.gov (United States)

    Curry, Joseph M; King, Nancy; Reiter, David; Fisher, Kyle; Heffelfinger, Ryan N; Pribitkin, Edmund A

    2009-01-01

    To conduct a meta-analysis of the literature on surgical methods for the prevention of Frey syndrome and concave facial deformity after parotidectomy. A PubMed search through February 2008 identified more than 60 English-language studies involving surgical techniques for prevention of these parameters. Analyzed works included 15 retrospective or prospective controlled studies reporting quantitative data for all included participants for 1 or more of the measured parameters in patients who had undergone parotidectomy. Report quality was assessed by the Strength of Recommendation Taxonomy (SORT) score. Data were directly extracted from reports and dichotomized into positive and negative outcomes. Statistical significance was then calculated. The mean SORT score for all studies was 2.34, and the mean SORT score for the analyzed studies was 1.88. Meta-analysis of multiple techniques to prevent symptomatic Frey syndrome, positive starch-iodine test results, and contour deformity favored intervention with cumulative odds ratios (OR) of 3.88 (95% confidence interval [CI], 2.81-5.34); 3.66 (95% CI, 2.32-5.77); and 5.25 (95% CI, 3.57-7.72), respectively. Meta-analysis of operative techniques to prevent symptomatic Frey syndrome, positive starch-iodine test results, and facial asymmetry suggests that such methods are likely to reduce the incidence of these complications after parotidectomy.
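    Cumulative odds ratios with 95% CIs like those reported above come from standard fixed-effect pooling; the sketch below shows the inverse-variance (Woolf) version with hypothetical 2x2 study counts, not the paper's data:

```python
import math

# Each hypothetical study: (events_treated, n_treated, events_control, n_control)
studies = [(5, 50, 15, 50), (8, 80, 20, 80), (3, 40, 10, 40)]

w_sum = wlog_sum = 0.0
for a, n1, c, n2 in studies:
    b, d = n1 - a, n2 - c                  # non-events in each arm
    log_or = math.log((a * d) / (b * c))   # log odds ratio of the 2x2 table
    var = 1/a + 1/b + 1/c + 1/d            # Woolf variance of the log OR
    w = 1 / var                            # inverse-variance weight
    w_sum += w
    wlog_sum += w * log_or

pooled = math.exp(wlog_sum / w_sum)        # pooled odds ratio
se = math.sqrt(1 / w_sum)
ci = (math.exp(wlog_sum / w_sum - 1.96 * se),
      math.exp(wlog_sum / w_sum + 1.96 * se))
```

    With these illustrative counts the pooled OR falls below 1 (intervention reduces events); in the paper's direction of coding, ORs above 1 favored intervention.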

  10. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature is described. The vision instruments for food analysis as well as the datasets of the food items used in this thesis are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis, and linear versus non-linear approaches. One supervised feature selection algorithm is proposed, together with simplification of the design of practical vision systems.

  11. ENEA initiatives in Southern Italy: Progress report, analysis, prospects

    International Nuclear Information System (INIS)

    Santandrea, E.

    1991-01-01

    In the past, technological development in Italy was concentrated in the country's heavily industrialized northern regions. The motive for this choice was the conception that to be successful in a highly competitive market, research investment had necessarily to favour those developed areas with an already proven capacity for guaranteed fast and high returns. Unfortunately this policy has created a technologically and economically depressed area, known as Mezzogiorno, in southern Italy. Within the framework of new national energy and economic policies calling for balanced economic and technological development, ENEA (Italian Commission for New Technologies, Energy and the Environment) has been entrusted with the planning and managing of research, commercialization and technology transfer programs designed to stimulate high-technology industrial activity in Italy's southern regions so as to allow them to become more competitive in the upcoming European free trade market. Small business concerns shall be favoured in this new development scheme which shall respect the existing local socio-economic framework. Emphasis shall be placed on privileging such elements as quality, flexibility and versatility, as opposed to low-cost mass production. Priority is to be given to the development of renewable energy sources, energy conservation techniques and environmentally compatible technologies

  12. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty
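    An overall figure such as the projected ±35% is typically built by combining independent component uncertainties in quadrature; the budget below is purely illustrative, with the material composition term dominant as the report suggests:

```python
import math

# Hypothetical relative (1-sigma) uncertainty budget for an activation estimate;
# the component names and values are illustrative, not the report's actual data.
components = {
    "material composition": 0.30,   # dominant term, as identified in the study
    "power history":        0.10,
    "neutron transport":    0.12,
    "cross sections":       0.08,
}

# Independent relative uncertainties combine in quadrature
total = math.sqrt(sum(u**2 for u in components.values()))
```

    Because the terms add in quadrature, the dominant 30% composition term fixes most of the total (~35% here), which is why the report points research at that component rather than the smaller ones.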

  13. Application of Microfluidic Techniques to Pyrochemical Salt Sampling and Analysis

    International Nuclear Information System (INIS)

    Pereira, C.; Launiere, C.; Smith, N.

    2015-01-01

    Microfluidic techniques enable production of micro-samples of molten salt for analysis by at-line and off-line sensors and detectors. These sampling systems are intended for implementation in an electrochemical used fuel treatment facility as part of the material balance and control system. Microfluidics may reduce the random statistical error associated with sampling inhomogeneity because a large number of uniform sub-microlitre droplets may be generated and successively analyzed. The approach combines two immiscible fluids in a microchannel under laminar flow conditions to generate slug flows. Because the slug flow regime is characterized by regularly sized and spaced droplets, it is commonly used in low-volume/high-throughput assays of aqueous and organic phases. This scheme is now being applied to high-temperature molten salts in combination with a second fluid that is stable at elevated temperatures. The microchip systems are being tested to determine the channel geometries and absolute and relative phase flow rates required to achieve stable slug flow. Because imaging is difficult at the 500 C process temperatures, the fluorescence of salt ions under ultraviolet illumination is used to discern flow regimes. As molten chloride melts are optically transparent, UV-visible light spectroscopy is also being explored as a spectroscopic technique for integration with at-line microchannel systems to overcome some of the current challenges to in situ analysis. A second technique that is amenable to droplet analysis is Laser-induced Breakdown Spectroscopy (LIBS). A pneumatic droplet generator is being interfaced with a LIBS system for analysis of molten salts at near-process temperatures. Tests of the pneumatic generator are being run using water and molten salts, in tandem with off-line analysis of the salt droplets with a LIBS spectrometer. (author)

  14. Progress of the DUPIC Fuel Compatibility Analysis (IV) - Fuel Performance

    International Nuclear Information System (INIS)

    Choi, Hang Bok; Ryu, Ho Jin; Roh, Gyu Hong; Jeong, Chang Joon; Park, Chang Je; Song, Kee Chan; Lee, Jung Won

    2005-10-01

    This study describes the mechanical compatibility of DUPIC fuel (direct use of spent pressurized water reactor (PWR) fuel in Canada deuterium uranium (CANDU) reactors) when it is loaded into a CANDU reactor. The mechanical compatibility can be assessed for the fuel management, primary heat transport system, fuel channel, and fuel handling system in the reactor core by both experimental and analytic methods. Because the physical dimensions of the DUPIC fuel bundle adopt the CANDU flexible (CANFLEX) fuel bundle design, which has already been demonstrated for commercial use in CANDU reactors, the experimental compatibility analyses focused on the generation of material property data and the irradiation tests of the DUPIC fuel, which are used for the computational analysis. The intermediate results of the mechanical compatibility analysis have shown that the integrity of the DUPIC fuel is mostly maintained under high power and high burnup conditions, even though some material properties, such as the thermal conductivity, are a little lower compared to uranium fuel. However, a slight change to the current DUPIC fuel design is required to accommodate the high internal pressure of the fuel element. It is also strongly recommended to perform more irradiation tests of the DUPIC fuel to accumulate a database for the demonstration of DUPIC fuel performance in the CANDU reactor

  15. Progress in studies of sex determination mechanisms and sex control techniques in Cynoglossus semilaevis (half-smooth tongue sole)

    Directory of Open Access Journals (Sweden)

    Qian ZHOU,Songlin CHEN

    2016-06-01

    The Cynoglossus semilaevis (half-smooth tongue sole) is a marine flatfish of great commercial value for fisheries and aquaculture in China. It has a female heterogametic sex determination system (ZW/ZZ), and environmental factors can induce sex-reversal of females to phenotypic males, suggesting that it is a promising model for the study of sex determination mechanisms. Additionally, females grow much faster than males and it is feasible to improve aquaculture production through sex control techniques. This paper reviews the progress of sex determination research in our laboratory. We have completed whole-genome sequencing and revealed the genome organization and sex chromosome evolution of C. semilaevis. A putative male determining gene, dmrt1, was identified, and DNA methylation was verified as having a crucial role in the sex reversal process. Genetic maps and sex-specific biomarkers have been used in a marker-assisted selection breeding program and for differentiation of fish sex. Development and improvement of sex control technologies, including artificial gynogenesis and production of breeding fry with a high proportion of females, is also reviewed. These research advances have provided insight into the regulation of sex determination and enabled efficient sex management in the artificial culturing of C. semilaevis.

  16. Application of unsupervised analysis techniques to lung cancer patient data.

    Science.gov (United States)

    Lynch, Chip M; van Berkel, Victor H; Frieboes, Hermann B

    2017-01-01

    This study applies unsupervised machine learning techniques for classification and clustering to a collection of descriptive variables from 10,442 lung cancer patient records in the Surveillance, Epidemiology, and End Results (SEER) program database. The goal is to automatically classify lung cancer patients into groups based on clinically measurable disease-specific variables in order to estimate survival. Variables selected as inputs for machine learning include Number of Primaries, Age, Grade, Tumor Size, Stage, and TNM, which are numeric or can readily be converted to numeric type. Minimal up-front processing of the data enables exploring the out-of-the-box capabilities of established unsupervised learning techniques, with little human intervention through the entire process. The output of the techniques is used to predict survival time, with the efficacy of the prediction representing a proxy for the usefulness of the classification. A basic single-variable linear regression against each unsupervised output is applied, and the associated Root Mean Squared Error (RMSE) value is calculated as a metric to compare between the outputs. The results show that self-organizing maps exhibit the best performance, while k-Means performs the best of the simpler classification techniques. Predicting against the full data set, it is found that their respective RMSE values (15.591 for self-organizing maps and 16.193 for k-Means) are comparable to supervised regression techniques, such as Gradient Boosting Machine (RMSE of 15.048). We conclude that unsupervised data analysis techniques may be of use to classify patients by defining the classes as effective proxies for survival prediction.
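    A minimal sketch of that pipeline, clustering followed by a single-variable linear regression scored by RMSE, using a tiny hand-rolled k-Means on synthetic stand-in data (not the SEER variables or the paper's implementation):

```python
import numpy as np

# Synthetic stand-in for patient records: two blobs of 3 numeric variables,
# with a fake "survival" target correlated with the variables.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(5, 1, (100, 3))])
survival = X.sum(axis=1) + rng.normal(0, 0.5, 200)

def kmeans(X, k, iters=50):
    """Lloyd's algorithm; assumes clusters stay non-empty on this toy data."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if (labels == j).any()
                            else centers[j] for j in range(k)])
    return labels

labels = kmeans(X, 2)
# Single-variable linear regression of survival on the cluster label,
# scored by RMSE as the proxy for clustering usefulness
A = np.vstack([labels, np.ones(len(labels))]).T
coef, *_ = np.linalg.lstsq(A, survival, rcond=None)
rmse = np.sqrt(np.mean((A @ coef - survival) ** 2))
```

    The same scoring loop can be repeated over any unsupervised output (cluster labels, map coordinates) so the RMSEs are directly comparable, which is the comparison scheme the abstract describes.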

  17. The analysis of composite laminated beams using a 2D interpolating meshless technique

    Science.gov (United States)

    Sadek, S. H. M.; Belinha, J.; Parente, M. P. L.; Natal Jorge, R. M.; de Sá, J. M. A. César; Ferreira, A. J. M.

    2018-02-01

    Laminated composite materials are widely implemented in several engineering constructions. For their relatively light weight, these materials are suitable for aerospace, military, marine, and automotive structural applications. To obtain safe and economical structures, the accuracy of the modelling analysis is highly relevant. Since meshless methods have in recent years achieved remarkable progress in computational mechanics, the present work uses one of the most flexible and stable interpolation meshless techniques available in the literature—the Radial Point Interpolation Method (RPIM). Here, a 2D approach is considered to numerically analyse composite laminated beams. Both the meshless formulation and the equilibrium equations ruling the studied physical phenomenon are presented in detail. Several benchmark beam examples are studied and the results are compared with exact solutions available in the literature and with results obtained from a commercial finite element software. The results show the efficiency and accuracy of the proposed numeric technique.

  18. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    Science.gov (United States)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics when compared to systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by stand-alone PV system needs, the techniques are also applicable to hybrid and grid connected systems.

  19. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Lubna Moin

    2009-04-01

    This research paper explores and compares different modeling and analysis techniques, and then examines the model order reduction approach and its significance. Traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. Bond graphs are also used for analyzing linear and non-linear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a technique for generating a genetic design from the tree-structured transfer function obtained from a Bond Graph. The work combines bond graphs for model representation with Genetic Programming for exploring different ideas in the design space; the tree-structured transfer function results from replacing each typical bond graph element with its impedance equivalent, specifying impedance laws for the Bond Graph multiport. The tree-structured form thus obtained from the Bond Graph is applied to generate the genetic tree. Application studies identify key issues important for advancing this approach toward becoming an effective and efficient design tool for synthesizing electrical system designs. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function are analyzed with the conventional and Bond Graph methods, and an approach to model order reduction is then examined. The suggested algorithm and other known modern model order reduction techniques are applied, with different approaches, to an 11th order high pass filter [1]. The model order reduction technique developed in this paper has the least reduction errors and, secondly, the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and Bond Graph methods are compared and

  20. Progress and challenges in the development and qualification of multi-level multi-physics coupled methodologies for reactor analysis

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.

    2007-01-01

    Current trends in nuclear power generation and regulation, as well as the design of next-generation reactor concepts along with continuing progress in computer technology, stimulate the development, qualification and application of multi-physics multi-scale coupled code systems. Efforts have been focused on extending the analysis capabilities by coupling models that simulate different phenomena or system components, as well as on refining the scale and level of detail of the coupling. This paper reviews the progress made in this area and outlines the remaining challenges. The discussion is illustrated with examples based on neutronics/thermal-hydraulics coupling in reactor core modeling. In both fields, recent advances and developments are towards more physics-based high-fidelity simulations, which require the implementation of improved and flexible coupling methodologies. First, progress in coupling different physics codes, along with advances in multi-level techniques for coupled code simulations, is discussed. Second, the issues related to the consistent qualification of coupled multi-physics and multi-scale code systems for design and safety evaluation are presented. The increased importance of uncertainty and sensitivity analysis is discussed, along with approaches to propagate uncertainty quantification between the codes. The upcoming OECD LWR Uncertainty Analysis in Modeling (UAM) benchmark, the first international activity to address this issue, is described in the paper. Finally, the remaining challenges with multi-physics coupling are outlined. (authors)

  1. Progress on Radiochemical Analysis for Nuclear Waste Management in Decommissioning

    DEFF Research Database (Denmark)

    Hou, Xiaolin; Qiao, Jixin; Shi, Keliang

    With the increased number of nuclear facilities that have been closed and are being, or are going to be, decommissioned, it is required to characterise the nuclear waste produced, for its treatment, by identifying the radionuclides and determining them quantitatively. Of the radionuclides related...... separation of radionuclides. In order to improve and maintain the Nordic competence in analysis of radionuclides in waste samples, an NKS-B project on this topic was launched in 2009. During the first phase of the NKS-B RadWaste project (2009-2010), good achievements were reached on the establishment...... of collaboration, identifying the requirements of the Nordic nuclear industries, and optimizing and developing some analytical methods (Hou et al. NKS-222, 2010). In 2011, this project (NKS-B RadWaste2011) continued. The major achievements of this project in 2011 include: (1) development of a method...

  2. Research progress and hotspot analysis of spatial interpolation

    Science.gov (United States)

    Jia, Li-juan; Zheng, Xin-qi; Miao, Jin-li

    2018-02-01

    In this paper, the literature related to spatial interpolation between 1982 and 2017, as indexed in the Web of Science core database, is used as the data source, and a visualization analysis is carried out on the co-country network, co-category network, co-citation network and keyword co-occurrence network. It is found that spatial interpolation has experienced three stages: slow development, steady development and rapid development. There are cross effects among 11 clustering groups, and the main areas of convergence are spatial interpolation theory research, practical applications and case studies of spatial interpolation, and research on the accuracy and efficiency of spatial interpolation. Finding the optimal spatial interpolation method is the frontier and hot spot of the research. Spatial interpolation research has formed a theoretical basis and a research system framework; it is strongly interdisciplinary and is widely used in various fields.
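
    Inverse distance weighting (IDW) is one of the classical spatial interpolation methods covered by this literature. As a minimal, illustrative sketch (not drawn from any of the reviewed papers), an IDW estimator in Python:

```python
import math

def idw_interpolate(points, query, power=2.0):
    """Inverse-distance-weighted estimate at `query`.

    points: list of ((x, y), value) samples
    power:  distance-decay exponent (2 is the common default)
    """
    num, den = 0.0, 0.0
    for (x, y), value in points:
        d = math.hypot(query[0] - x, query[1] - y)
        if d == 0.0:
            return value  # query coincides with a sample point
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den

samples = [((0.0, 0.0), 10.0), ((1.0, 0.0), 20.0), ((0.0, 1.0), 30.0)]
estimate = idw_interpolate(samples, (0.5, 0.5))  # equidistant samples -> plain mean, 20.0
```

    The accuracy/efficiency research mentioned above typically compares such deterministic estimators against geostatistical ones (e.g. kriging), which additionally model spatial autocorrelation.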

  3. Advanced grazing-incidence techniques for modern soft-matter materials analysis

    Directory of Open Access Journals (Sweden)

    Alexander Hexemer

    2015-01-01

    Full Text Available The complex nano-morphology of modern soft-matter materials is successfully probed with advanced grazing-incidence techniques. Based on grazing-incidence small- and wide-angle X-ray and neutron scattering (GISAXS, GIWAXS, GISANS and GIWANS), new possibilities arise which are discussed with selected examples. Due to instrumental progress, highly interesting possibilities for local structure analysis in this material class arise from the use of micro- and nanometer-sized X-ray beams in micro- or nanofocused GISAXS and GIWAXS experiments. The feasibility of very short data acquisition times down to milliseconds creates exciting possibilities for in situ and in operando GISAXS and GIWAXS studies. Tuning the energy of GISAXS and GIWAXS in the soft X-ray regime and in time-of-flight GISANS allows the tailoring of contrast conditions and thereby the probing of more complex morphologies. In addition, recent progress in software packages, useful for data analysis for advanced grazing-incidence techniques, is discussed.

  4. Dependency Coefficient in Computerized GALS Examination Utilizing Motion Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Hamed Shahidian

    2013-04-01

    Full Text Available Objectives: The GALS (Gait, Arms, Legs and Spine) examination is a compact version of standard procedures used by rheumatologists to determine musculoskeletal disorders in patients. Computerization of such a clinical procedure is necessary to ensure an objective evaluation. This article presents the first steps in such an approach by outlining a procedure that uses motion analysis techniques as a new method for the GALS examination. Methods: A 3D motion pattern was obtained from two subject groups using a six-camera motion analysis system. The range of motion associated with GALS was subsequently determined using a MATLAB program. Results: The range of motion (ROM) of the two subject groups was determined, the validity of the approach was outlined, and the symmetry of movement on both sides of the body was quantified through the introduction of a dependency coefficient. Discussion: Analysis of the GALS examination and diagnosis of musculoskeletal problems could be addressed more accurately and reliably by adopting motion analysis techniques. Furthermore, the introduction of a dependency coefficient offers a wide spectrum of prospective applications in neuromuscular studies.
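
    The abstract does not reproduce the paper's definition of the dependency coefficient, so the following Python sketch shows one plausible, purely hypothetical formulation of a left/right symmetry index over paired range-of-motion values:

```python
def dependency_coefficient(left_rom, right_rom):
    """Hypothetical symmetry index over paired left/right ROM values:
    1.0 means identical movement on both sides, values near 0 mean
    strong asymmetry. This is one illustrative formulation, not the
    paper's actual definition."""
    ratios = [
        1.0 - abs(l - r) / (abs(l) + abs(r))
        for l, r in zip(left_rom, right_rom)
        if (abs(l) + abs(r)) > 0
    ]
    return sum(ratios) / len(ratios)

symmetric = dependency_coefficient([90.0, 60.0], [90.0, 60.0])  # 1.0
```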

  5. Electroencephalographic Data Analysis With Visibility Graph Technique for Quantitative Assessment of Brain Dysfunction.

    Science.gov (United States)

    Bhaduri, Susmita; Ghosh, Dipak

    2015-07-01

    Usual techniques for electroencephalographic (EEG) data analysis lack some of the important properties essential for quantitative assessment of the progress of human brain dysfunction. EEG data are essentially nonlinear, and this nonlinear time series has been identified as multi-fractal in nature. Rigorous techniques are needed for such analysis. In this article, we present the visibility graph as the latest rigorous technique that can assess the degree of multifractality accurately and reliably. Moreover, this technique has been found to give reliable results with test data of comparatively short length. In this work, the visibility graph algorithm has been used to map a time series (EEG signals) to a graph, in order to study the complexity and fractality of the time series. The power of scale-freeness of the visibility graph has been used as an effective method for measuring fractality in the EEG signal. The scale-freeness of the visibility graph has also been observed after averaging statistically independent samples of the signal. Scale-freeness of the visibility graph has been calculated for 5 sets of EEG data patterns varying from normal eyes-closed to epileptic. The change in the values is analyzed further, and it has been observed that the value decreases uniformly from normal eyes-closed to epileptic. © EEG and Clinical Neuroscience Society (ECNS) 2014.
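
    The mapping from a time series to a graph can be made concrete with the standard natural visibility graph construction (Lacasa et al.), on which this line of work builds: two samples are connected whenever every intermediate sample lies strictly below the straight sight line joining them. A minimal Python sketch (illustrative, not the authors' code):

```python
def visibility_graph(series):
    """Natural visibility graph of a time series: nodes are sample
    indices; edge (a, b) exists if no intermediate sample blocks the
    straight line between (a, series[a]) and (b, series[b])."""
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = True
            for c in range(a + 1, b):
                # height of the a-b sight line at position c
                line = series[b] + (series[a] - series[b]) * (b - c) / (b - a)
                if series[c] >= line:
                    visible = False
                    break
            if visible:
                edges.add((a, b))
    return edges

# Node degrees, from which the degree distribution (and hence the
# scale-freeness measure used in the paper) is derived:
degrees = {}
for a, b in visibility_graph([3.0, 1.0, 2.0, 4.0]):
    degrees[a] = degrees.get(a, 0) + 1
    degrees[b] = degrees.get(b, 0) + 1
```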

  6. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers, which can be removed without damaging the existing structure, were used on the surface; this is essential for structures such as historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used on any type of structure but is particularly suitable when the surface of the structure must not be damaged. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the elaborated procedure by comparing its results with those derived from traditional measuring techniques. PMID:28773129

  7. Analysis techniques for two-dimensional infrared data

    Science.gov (United States)

    Winter, E. M.; Smith, M. C.

    1978-01-01

    In order to evaluate infrared detection and remote sensing systems, it is necessary to know the characteristics of the observational environment. For both scanning and staring sensors, the spatial characteristics of the background may limit the performance of a remote sensor more than system noise does. This limitation is the so-called spatial clutter limit and may be important for the systems design of many earth-application and surveillance sensors. The data used in this study are two-dimensional radiometric data obtained as part of the continuing NASA remote sensing programs. Typical data sources are the Landsat multi-spectral scanner (1.1 micrometers), the airborne heat capacity mapping radiometer (10.5 - 12.5 micrometers) and various infrared data sets acquired by low-altitude aircraft. Techniques used for the statistical analysis of one-dimensional infrared data, such as power spectral density (PSD), exceedance statistics, etc., are investigated for two-dimensional applicability. Also treated are two-dimensional extensions of these techniques (2D PSD, etc.), and special techniques developed for the analysis of 2D data.
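
    As a toy illustration of the 2D PSD extension mentioned above (a direct 2-D DFT, usable only for small grids; real imagery would use an FFT library):

```python
import cmath

def psd2d(image):
    """Two-dimensional power spectral density via a direct 2-D DFT.
    image: list of rows of float pixel values (assumed rectangular)."""
    rows, cols = len(image), len(image[0])
    psd = [[0.0] * cols for _ in range(rows)]
    for u in range(rows):
        for v in range(cols):
            s = 0j
            for m in range(rows):
                for n in range(cols):
                    s += image[m][n] * cmath.exp(
                        -2j * cmath.pi * (u * m / rows + v * n / cols)
                    )
            psd[u][v] = abs(s) ** 2 / (rows * cols)
    return psd

flat_field = [[1.0, 1.0], [1.0, 1.0]]
p = psd2d(flat_field)  # a constant image puts all power in the DC bin p[0][0]
```

    In clutter studies, the 2D PSD would then be inspected for the spatial frequencies at which background power dominates sensor noise.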

  8. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present here a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. Biomarkers of fatty acids, hydrocarbons and sterols were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, a sufficient quantity of each target compound (>50 μgC) must be separated and recovered. Yields of target compounds in the C14 to C30 n-alkane range were high, and approximately 80% for higher-molecular-weight compounds beyond C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, like compound-specific stable isotope analysis, provides valuable information on origins and carbon cycling in the marine system. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific, which showed its potential as a chronology tool for estimating sediment age from organic matter in paleoceanographic studies, in areas where sufficient amounts of planktonic foraminifera for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  9. Impact during equine locomotion: techniques for measurement and analysis.

    Science.gov (United States)

    Burn, J F; Wilson, A; Nason, G P

    1997-05-01

    Impact is implicated in the development of several types of musculoskeletal injury in the horse. Characterisation of the impact experienced during strenuous exercise is an important first step towards understanding the mechanism of injury. Measurement and analysis of large, short-duration impacts is difficult: the measurement system must be able to record transient peaks and high frequencies accurately, and the analysis technique must be able to characterise the impact signal in both time and frequency. This paper presents a measurement system and analysis technique for the characterisation of large impacts. A piezo-electric accelerometer was securely mounted on the dorsal surface of the horse's hoof. Saddle-mounted charge amplifiers and a 20 m coaxial cable transferred the data to a PC-based logging system. Data were downloaded onto a UNIX workstation and analysed using a proprietary statistics package. The values of parameters calculated from the time series data were comparable to those of other authors. A wavelet decomposition showed that the frequency profile of the signal changed with time. While most spectral energy was seen at impact, a significant amount of energy was contained in the signal immediately following impact. Over 99% of this energy was contained in frequencies below 1250 Hz. The sampling rate and the frequency response of a measurement system for recording impact should be chosen carefully to prevent loss or corruption of data. Time-scale analysis using a wavelet decomposition is a powerful technique for characterising impact data, and the use of contour plots provides a highly visual representation of the time and frequency localisation of power during impact.
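
    The idea of localising impact energy in both time and frequency via a wavelet decomposition can be sketched with the simplest orthonormal wavelet, the Haar basis (the paper's actual wavelet choice is not stated in this abstract):

```python
def haar_step(signal):
    """One level of an orthonormal Haar decomposition (even-length input):
    returns the smooth approximation and the detail (high-frequency)
    coefficients, each half the input length."""
    approx = [(signal[i] + signal[i + 1]) / 2 ** 0.5 for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 ** 0.5 for i in range(0, len(signal), 2)]
    return approx, detail

def wavelet_energy(signal, levels):
    """Energy carried by the detail coefficients at each level; because the
    Haar basis is orthonormal, these energies partition the signal energy."""
    energies = []
    current = signal
    for _ in range(levels):
        current, detail = haar_step(current)
        energies.append(sum(d * d for d in detail))
    return energies

band_energies = wavelet_energy([4.0, 2.0, 1.0, 3.0], 2)  # finest band first
```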

  10. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post-processor was examined in detail, and simulation techniques for conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single-error-source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As part of this study, LEA results were verified by: (A) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the LEA-processor-computed data; and (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
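
    The RSS combination described above is straightforward to sketch; the covariance construction below is a generic single-error-source formulation for illustration, not the LEA processor's actual code:

```python
def rss_combine(deviations):
    """Root-sum-square of individual 3-sigma deviations from
    single-error-source dispersion runs."""
    return sum(d * d for d in deviations) ** 0.5

def covariance_from_sources(source_deviations):
    """Covariance matrix built from per-source deviation vectors:
    C = sum_k d_k d_k^T, where each d_k holds the state deviations
    produced by one 3-sigma error source."""
    n = len(source_deviations[0])
    cov = [[0.0] * n for _ in range(n)]
    for d in source_deviations:
        for i in range(n):
            for j in range(n):
                cov[i][j] += d[i] * d[j]
    return cov

total = rss_combine([3.0, 4.0])  # 5.0
```

    The diagonal of the covariance matrix then equals the squared RSS of each state component, which is exactly the hand-check (A) described in the abstract.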

  11. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased-mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to ensure that the fault tree technique is not used beyond its valid range of application. To this end, a critical review of the mathematical foundations of reliability fault tree analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of an integrated software suite (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems
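
    At its probabilistic core, fault tree evaluation for independent basic events reduces to combining AND/OR gates; a minimal sketch (independence is the assumption, and it is precisely where the validity limits discussed above begin to matter):

```python
def gate_or(probs):
    """Failure probability of an OR gate with independent inputs:
    1 - product of the survival probabilities."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def gate_and(probs):
    """Failure probability of an AND gate with independent inputs:
    product of the input failure probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical top event: (pump A fails AND pump B fails) OR (power supply fails)
top = gate_or([gate_and([0.01, 0.01]), 0.001])
```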

  12. Analysis of diatomaceous earth by x-ray fluorescence techniques

    International Nuclear Information System (INIS)

    Parker, J.

    1985-01-01

    The use of diatomaceous earth in industry as filtering aids, mineral fillers, catalyst carriers, chromatographic supports, and paint additives is well documented. The diatomite matrix is well suited to x-ray analysis, but this application has not been cited in the literature. In our laboratory, x-ray fluorescence spectrometry has been used to support the analytical needs of diatomite product development. Lithium borate fusion and pressed powder techniques have been used to determine major, minor, and trace elements in diatomite and synthetic silicate samples. Conventional matrix correction models and fundamental parameters have been used to reduce x-ray measurements to accurate chemical analyses. Described are sample and standard preparation techniques, data reduction methods, applications, and results

  13. Some problems of calibration technique in charged particle activation analysis

    International Nuclear Information System (INIS)

    Krasnov, N.N.; Zatolokin, B.V.; Konstantinov, I.O.

    1977-01-01

    It is shown that three different approaches to calibration, based on the use of the average cross-section, the equivalent target thickness and the thick target yield, are adequate. Using the concept of thick target yield, a convenient charged particle activation equation is obtained. The possibility of simultaneous determination of two impurities from which the same isotope is formed is pointed out. The concept of thick target yield facilitates the derivation of a simple formula for both absolute and comparative methods of analysis. The methodical error does not exceed 10%. Calibration and determination of the expected sensitivity based on the thick target yield concept are also very convenient, because experimental determination of thick target yield values is a much simpler procedure than obtaining an activation curve or excitation function. (T.G.)

  14. 1985. Annual progress report

    International Nuclear Information System (INIS)

    1986-01-01

    This annual progress report of the CEA Protection and Nuclear Safety Institute outlines the progress made in each section of the Institute. Research activities of the different departments include: reactor safety analysis; fuel cycle facilities analysis and associated safety research programs (criticality, sites, transport ...); radioecology and environmental radioprotection techniques; data acquisition on radioactive waste storage sites; radiation effects on man and studies on radioprotection techniques; nuclear material security, including security of facilities, security of nuclear material transport, and monitoring of nuclear material management; nuclear facility decommissioning; and finally, public information [fr

  15. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of Cultural Heritage materials has to take several requirements into account. The complexity of these historical materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, because artworks are precious, it is necessary to use non-destructive methods that respect the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of Cultural Heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportions of the two phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author) [fr

  16. Single Particle Tracking: Analysis Techniques for Live Cell Nanoscopy

    Science.gov (United States)

    Relich, Peter Kristopher, II

    Single molecule experiments are a set of experiments designed specifically to study the properties of individual molecules. It has only been in the last three decades that single molecule experiments have been applied to the life sciences, where they have been successfully implemented in systems biology for probing the behaviors of sub-cellular mechanisms. The advent and growth of super-resolution techniques in single molecule experiments has made the fundamental behavior of light and the associated nano-probes a necessary concern for life scientists wishing to advance the state of human knowledge in biology. This dissertation disseminates some of the practices learned in experimental live cell microscopy. The topic of single particle tracking is addressed here in a format designed for the physicist who embarks upon single molecule studies. Specifically, the focus is on the procedures needed to develop single particle tracking analysis techniques that can be implemented to answer biological questions. These analysis techniques range from designing and testing a particle tracking algorithm to inferring model parameters once an image has been processed. The intellectual contributions of the author include techniques in diffusion estimation, localization filtering, and trajectory association for tracking, which are all discussed in detail in later chapters. The author of this thesis has also contributed to the software development of automated gain calibration, live cell particle simulations, and various single particle tracking packages. Future work includes further evaluation of this laboratory's single particle tracking software, entropy-based approaches towards hypothesis validation, and the uncertainty quantification of gain calibration.
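
    A standard way to infer model parameters from a processed track is mean-squared-displacement (MSD) diffusion estimation. A minimal 2-D sketch (one of many estimators in the tracking literature, not necessarily the author's method):

```python
def msd(track, lag):
    """Mean squared displacement of a 2-D trajectory at a given frame lag.
    track: list of (x, y) positions."""
    disps = [
        (track[i + lag][0] - track[i][0]) ** 2
        + (track[i + lag][1] - track[i][1]) ** 2
        for i in range(len(track) - lag)
    ]
    return sum(disps) / len(disps)

def diffusion_coefficient(track, dt):
    """Estimate D from the lag-1 MSD of a 2-D track, using MSD(dt) = 4*D*dt
    for free Brownian motion (localization error neglected here)."""
    return msd(track, 1) / (4.0 * dt)

track = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
d_est = diffusion_coefficient(track, dt=0.5)
```

    In practice one fits several lags and accounts for static and dynamic localization errors, which bias the naive lag-1 estimate.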

  17. Analysis of Lipoaspirate Following Centrifugation: Wet Versus Dry Harvesting Technique.

    Science.gov (United States)

    Agostini, Tommaso; Spinelli, Giuseppe; Perello, Raffella; Bani, Daniele; Boccalini, Giulia

    2016-09-01

    The success of lipotransfer strongly depends on the harvesting, processing, and placement of the lipoaspirated samples. This study was designed to assess the histomorphometric characteristics and viability of fat harvested using different techniques (wet and dry) following centrifugation, as described by Coleman. The study enrolled 85 consecutive, nonrandomized, healthy patients from March 2010 to December 2014 (45 males and 40 females). The mean age was 40 years (range, 18-59 years), and the mean body mass index was 25.8 (range, 24-32). The authors performed a histological analysis (hematoxylin/eosin), morphometry (ImageJ 1.33 free-share image analysis software), and a viability assessment (Trypan Blue exclusion test; Sigma-Aldrich, Milan, Italy) of the lipoaspirated samples. The hematoxylin and eosin-stained sections exhibited similar features; in particular, clear-cut morphological signs of adipocyte disruption, apoptosis, or necrosis were not detected in the examined samples. Morphometry confirmed the visual findings, and the values of the mean surface area of the adipocyte vacuoles were not significantly different. Additionally, adipocyte viability was not significantly different between the analyzed fat tissue samples. The results from this study showed, for the first time, that there is no reduction in the viability of fat grafts harvested with the dry or wet technique following centrifugation according to the Coleman technique. Both methods of fat harvesting collect viable cells, which are not influenced by standard centrifugation. The fat grafts harvested and processed by this technique could be used in clinical settings without increasing the reabsorption rate. V.

  18. Comparison of chromosome analysis using cell culture by coverslip technique with flask technique.

    Science.gov (United States)

    Sajapala, Suraphan; Buranawut, Kitti; NiwatArunyakasemsuk, Md

    2014-02-01

    To determine the accuracy rate of chromosome study from amniotic cell culture by the coverslip technique compared with the flask technique, and to compare the culture time, the amount of amniotic cell culture media, and the cost of amniotic cell culture. Cross-sectional study. Department of Obstetrics and Gynecology, Phramongkutklao Hospital. Subjects: 70 pregnant women who underwent amniocentesis at Phramongkutklao Hospital between November 1, 2007 and February 29, 2008. Amniotic cell culture by the flask technique and the coverslip technique. Accuracy of amniotic cell culture for chromosome study by the coverslip technique compared with the flask technique. In total, 70 pregnant women underwent amniocentesis, and the amniotic fluid of each was divided for cell culture by both the flask technique and the coverslip technique. 69 samples gave concordant results with both techniques; the one remaining sample failed to culture by both methods due to blood contamination. The accuracy of the coverslip technique was 100% compared with the flask technique, and the coverslip technique required less culture time, less amniotic cell culture media and lower cost than the flask technique. The chromosome results of the coverslip technique thus agreed with those of the flask technique while requiring less time, media and cost.

  19. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

    Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium, using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. With certain modifications, this technique is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab
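
    The colorimetric determination rests on a linear calibration between absorbance and concentration. A generic least-squares sketch (illustrative only, not the instrument's actual software):

```python
def fit_line(x, y):
    """Least-squares calibration line: absorbance = slope*conc + intercept."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    return slope, my - slope * mx

def concentration(absorbance, slope, intercept):
    """Invert the calibration to read an unknown sample."""
    return (absorbance - intercept) / slope

# Hypothetical standards: concentrations vs. measured absorbances
slope, intercept = fit_line([0.0, 1.0, 2.0, 3.0], [0.1, 0.3, 0.5, 0.7])
unknown = concentration(0.5, slope, intercept)
```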

  20. Acceleration of multivariate analysis techniques in TMVA using GPUs

    CERN Document Server

    Hoecker, A; Therhaag, J; Washbrook, A

    2012-01-01

    A feasibility study into the acceleration of multivariate analysis techniques using Graphics Processing Units (GPUs) will be presented. The MLP-based Artificial Neural Network method contained in the TMVA framework has been chosen as a focus for investigation. It was found that the network training time on a GPU was lower than for CPU execution as the complexity of the network was increased. In addition, multiple neural networks can be trained simultaneously on a GPU within the same time taken for single network training on a CPU. This could be potentially leveraged to provide a qualitative performance gain in data classification.

  1. Reduction and analysis techniques for infrared imaging data

    Science.gov (United States)

    Mccaughrean, Mark

    1989-01-01

    Infrared detector arrays are becoming increasingly available to the astronomy community, with a number of array cameras already in use at national observatories, and others under development at many institutions. As the detector technology and imaging instruments grow more sophisticated, more attention is focussed on the business of turning raw data into scientifically significant information. Turning pictures into papers, or equivalently, astronomy into astrophysics, both accurately and efficiently, is discussed. Also discussed are some of the factors that can be considered at each of three major stages: acquisition, reduction, and analysis, concentrating in particular on several of the questions most relevant to the techniques currently applied to near-infrared imaging.

  2. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  3. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction processes of nuclear and X-ray radiation with a sample, leading to their absorption and backscattering, to the ionization of gases or excitation of fluorescent X-ray by radiation, but not to the activation of determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence by photon or charged particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles and backscattering of beta radiation. The significant advantage of these methods is that they are nondestructive. (author)

  4. Does smoking reduce the progression of osteoarthritis? Meta-analysis of observational studies.

    Science.gov (United States)

    Pearce, Fiona; Hui, Michelle; Ding, Changhai; Doherty, Michael; Zhang, Weiya

    2013-07-01

    To determine whether smoking reduces the progression of osteoarthritis (OA). Observational studies examining smoking and progression of OA were systematically searched through Medline (1948-), EMBase (1980-), Web of Science, PubMed, and Google and relevant references. The search was last updated in May 2012. Odds ratios (ORs) and 95% confidence intervals (95% CIs) were directly retrieved or calculated. Current standards for reporting meta-analyses of observational studies (Meta-Analysis of Observational Studies in Epidemiology) were followed. Quality-related aspects such as study design, setting, sample selection, definition of progression, and confounding bias were recorded. Stratified and meta-regression analyses were undertaken to examine the covariates. Sixteen studies (976,564 participants) were identified from the literature. Overall, there was no significant association between smoking and progression of OA (OR 0.92; 95% CI 0.83, 1.02). There was moderate heterogeneity of results (I² = 57.3%, P = 0.0024). Subgroup analyses showed some associations of marginal significance; however, meta-regression did not confirm any significant results. There is no compelling evidence that smoking has a protective effect on the progression of OA. The results concur with a previous meta-analysis published by this group that showed no association between smoking and incidence of OA. Taken together, smoking does not appear to reduce either the incidence or progression of OA. Copyright © 2013 by the American College of Rheumatology.
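
    The pooled odds ratio and heterogeneity statistics reported above can be computed with a standard DerSimonian-Laird random-effects model. The sketch below is illustrative only: it recovers each study's standard error from its reported 95% CI, and the study values in the example are invented, not taken from this meta-analysis.

```python
import math

def pooled_or_random_effects(ors, cis):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    ors: list of study odds ratios; cis: list of (lower, upper) 95% CIs.
    Returns (pooled OR, I^2 heterogeneity statistic in percent).
    """
    # log-OR and its standard error recovered from the 95% CI width
    y = [math.log(o) for o in ors]
    se = [(math.log(hi) - math.log(lo)) / (2 * 1.96) for lo, hi in cis]
    w = [1 / s**2 for s in se]                              # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))    # Cochran's Q
    k = len(y)
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    # between-study variance (tau^2) and random-effects weights
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    wr = [1 / (s**2 + tau2) for s in se]
    pooled = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
    return math.exp(pooled), i2

# Hypothetical studies: identical ORs pool to the common value with zero heterogeneity.
por, i2 = pooled_or_random_effects([1.0, 1.0, 1.0], [(0.5, 2.0)] * 3)
```

    When studies disagree, Q exceeds its degrees of freedom, I² rises, and the estimated τ² shifts weight toward smaller studies, which is the usual rationale for the random-effects choice in meta-analyses like this one.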

  5. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods; thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions, so the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques: neutron activation analysis (NAA) and atomic absorption spectrometry (AAS). The comparison of methods was performed to cross-check the analysis results and to overcome the limitations of the three methods. Analysis results showed that Ca found in food using EDXRF and AAS was not significantly different, with a p-value of 0.9687, whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined: the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore, the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
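
    Pearson correlations like those quoted above (0.9871 for Ca, 0.9558 for K) are computed from paired measurements of the same samples by two methods. A generic sketch follows; any example data are hypothetical, not the study's measurements.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length lists
    of paired measurements (e.g., EDXRF vs. AAS values per sample)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)
```

    A correlation near 1 indicates the two methods rank and scale the samples consistently, which is what supports using EDXRF as an alternative here.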

  6. Envelopment technique and topographic overlays in bite mark analysis.

    Science.gov (United States)

    Djeapragassam, Parimala; Daniel, Mariappan Jonathan; Srinivasan, Subramanian Vasudevan; Ramadoss, Koliyan; Jimsha, Vannathan Kumaran

    2015-01-01

    The aims and objectives of our study were to compare four sequential overlays generated using the envelopment technique and to evaluate inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Dental stone models were prepared from impressions made from healthy individuals; photographs were taken and computer-assisted overlays were generated. The models were then enveloped in a different-color dental stone. After this, four sequential cuts were made at a thickness of 1 mm each. Each sectional cut was photographed and overlays were generated. Thus, 125 overlays were generated and compared. The scoring was done based on matching accuracy and the data were analyzed. The Kruskal-Wallis one-way analysis of variance (ANOVA) test was used to compare the four sequential overlays, and Spearman's rank correlation tests were used to evaluate the inter- and intraoperator reliability of the overlays obtained by the envelopment technique. Through our study, we conclude that the third and fourth cuts were the best among the four cuts, and that inter- and intraoperator reliability were statistically significant at the 5% level, that is, at the 95% confidence interval (P < 0.05).
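
    Spearman's rank correlation, used here for inter- and intraoperator reliability, is simply the Pearson correlation computed on ranks. A minimal pure-Python sketch with average ranks for ties follows; the inputs in any example are invented scores, not the study's data.

```python
def _ranks(values):
    """Average ranks (1-based), with tied values sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rank correlation: Pearson correlation of the ranks."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

    Rank-based correlation is appropriate for ordinal matching-accuracy scores like those assigned to the overlays here, since it does not assume the scores are on an interval scale.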

  7. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques of quantifying caloric-induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems, the relevance being that if there were a change in slow phase velocity between ENG and VNG testing when measuring caloric response, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation which comment on the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways. Each caloric test was analyzed by our old ENG system, our new VNG system, an inexperienced assessor, and the computer algorithm, and the data were compared. All four systems made similar measurements, but our inexperienced assessor failed to recognize responses as sporadic or scant; we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients. Assessment of complex VNGs should be left to an experienced assessor.

  8. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

    Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation of personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization could support interaction. We have analyzed service interaction cases in the context of a technology-mediated car rental service. With the analysis technique we propose, inspired by the Interaction Analysis method, we were able to capture and model situational service interaction. Our contribution regarding technology-mediated service interaction design is twofold: first, with the increased understanding of the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces…

  9. Mechanisms of subsidence for induced damage and techniques for analysis

    International Nuclear Information System (INIS)

    Drumm, E.C.; Bennett, R.M.; Kane, W.F.

    1988-01-01

    Structural damage due to mining induced subsidence is a function of the nature of the structure and its position on the subsidence profile. A point on the profile may be in the tensile zone, the compressive zone, or the no-deformation zone at the bottom of the profile. Damage to structures in the tension zone is primarily due to a reduction of support during vertical displacement of the ground surface, and to shear stresses between the soil and structure resulting from horizontal displacements. The damage mechanisms due to tension can be investigated effectively using a two-dimensional plane stress analysis. Structures in the compression zone are subjected to positive moments in the footing and large compressive horizontal stresses in the foundation walls. A plane strain analysis of the foundation wall is utilized to examine compression zone damage mechanisms. The structural aspects affecting each mechanism are identified and potential mitigation techniques are summarized

  10. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
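
    A variance-based (Sobol-type) first-order index, one of the global methods alluded to above, measures how much of an output's variance is explained by each input alone, including through nonlinear response. The sketch below estimates first-order indices on a deterministic grid for a two-input toy model; it illustrates the idea and is not the method or model used in the report.

```python
def first_order_indices(model, n=200):
    """Brute-force first-order Sobol indices for a 2-input model on [0,1]^2.

    Uses a deterministic midpoint grid instead of Monte Carlo sampling;
    adequate for smooth toy models, not for production use.
    """
    grid = [(i + 0.5) / n for i in range(n)]
    ys = [model(a, b) for a in grid for b in grid]
    mean = sum(ys) / len(ys)
    var = sum((y - mean) ** 2 for y in ys) / len(ys)
    # conditional means E[Y | X1] and E[Y | X2] on the grid
    cond1 = [sum(model(a, b) for b in grid) / n for a in grid]
    cond2 = [sum(model(a, b) for a in grid) / n for b in grid]
    # first-order index: Var(E[Y | Xi]) / Var(Y)
    s1 = (sum((c - mean) ** 2 for c in cond1) / n) / var
    s2 = (sum((c - mean) ** 2 for c in cond2) / n) / var
    return s1, s2
```

    For the linear toy model y = 2*x1 + x2 with independent uniform inputs, the exact indices are 0.8 and 0.2, which the grid estimate reproduces; indices that do not sum to 1 signal the input interactions that global methods are designed to expose.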

  11. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10(8) Pa using particle diameters of 1.7 mu m. This increases the efficiency, the resolution and the speed of the separation. Four...... aqueous selenium standards were separated within 1.2 min on a 1.00 id x 50 mm reversed phase column in an ion-pair chromatographic system using a flow rate of 200 mu L min(-1). Hence, analysis times could be reduced to 1/10 compared with ordinary HPLC for aqueous standards. The precision and detection...... the use of short columns. Hence, analysis times could be halved without loss of separation efficiency in this biological sample...

  12. Evaluation of Progressive Failure Analysis and Modeling of Impact Damage in Composite Pressure Vessels

    Science.gov (United States)

    Sanchez, Christopher M.

    2011-01-01

    NASA White Sands Test Facility (WSTF) is leading an evaluation effort in advanced destructive and nondestructive testing of composite pressure vessels and structures. WSTF is using progressive finite element analysis methods for test design and for confirmation of composite pressure vessel performance. Using composite finite element analysis models and failure theories tested in the World-Wide Failure Exercise, WSTF is able to estimate the static strength of composite pressure vessels. Additionally, testing and evaluation of impact-damaged composites are in progress so that models can be developed to estimate damage tolerance and the degradation in static strength.

  13. Performance of confocal scanning laser tomograph Topographic Change Analysis (TCA) for assessing glaucomatous progression.

    Science.gov (United States)

    Bowd, Christopher; Balasubramanian, Madhusudhanan; Weinreb, Robert N; Vizzeri, Gianmarco; Alencar, Luciana M; O'Leary, Neil; Sample, Pamela A; Zangwill, Linda M

    2009-02-01

    To determine the sensitivity and specificity of confocal scanning laser ophthalmoscope Topographic Change Analysis (TCA; Heidelberg Retina Tomograph [HRT]; Heidelberg Engineering, Heidelberg, Germany) parameters for discriminating between progressing glaucomatous and stable healthy eyes. The 0.90, 0.95, and 0.99 specificity cutoffs for various (n=70) TCA parameters were developed by using 1000 permuted topographic series derived from HRT images of 18 healthy eyes from Moorfields Eye Hospital, imaged at least four times. The cutoffs were then applied to topographic series from 36 eyes with known glaucomatous progression (by optic disc stereophotograph assessment and/or standard automated perimetry guided progression analysis [GPA]) and 21 healthy eyes from the University of California, San Diego (UCSD) Diagnostic Innovations in Glaucoma Study (DIGS), all imaged at least four times, to determine TCA sensitivity and specificity. Cutoffs were also applied to 210 DIGS patients' eyes imaged at least four times with no evidence of progression (nonprogressed) by stereophotography or GPA. The TCA parameter providing the best sensitivity/specificity tradeoff using the 0.90, 0.95, and 0.99 cutoffs was the largest clustered superpixel area within the optic disc margin (CAREAdisc, in mm²). Sensitivities/specificities for classifying progressing (by stereophotography and/or GPA) and healthy eyes were 0.778/0.809, 0.639/0.857, and 0.611/1.00, respectively. In nonprogressing eyes, specificities were 0.464, 0.570, and 0.647 (i.e., lower than in the healthy eyes). In addition, TCA parameter measurements of nonprogressing eyes were similar to those of progressing eyes. TCA parameters can discriminate between progressing and longitudinally observed healthy eyes. Low specificity in apparently nonprogressing patients' eyes suggests early progression detection using TCA.
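
    The sensitivity/specificity pairs above follow the usual definitions: the true-positive rate among progressing eyes and the true-negative rate among stable eyes. A minimal sketch, with invented 0/1 labels in the example rather than the study's classifications:

```python
def sens_spec(predicted, actual):
    """Sensitivity and specificity from parallel lists of 0/1 labels
    (1 = progressing, 0 = stable)."""
    tp = sum(1 for p, a in zip(predicted, actual) if p == 1 and a == 1)
    fn = sum(1 for p, a in zip(predicted, actual) if p == 0 and a == 1)
    tn = sum(1 for p, a in zip(predicted, actual) if p == 0 and a == 0)
    fp = sum(1 for p, a in zip(predicted, actual) if p == 1 and a == 0)
    return tp / (tp + fn), tn / (tn + fp)
```

    Raising the specificity cutoff (0.90 to 0.99 here) trades sensitivity for specificity, which is exactly the pattern in the reported 0.778/0.809 through 0.611/1.00 figures.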

  14. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic loss, but how to predict grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitat is expounded, and developments in monitoring grasshopper populations and the common algorithms of the spectral analysis technique are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing, and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be derived from the remote sensing data and used in grasshopper forecasting. Hyper-spectral remote sensing, which can monitor grasshoppers more exactly, has advantages in measuring the damage degree and classifying damage areas, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between the characteristic parameters of hyper-spectra and leaf area index (LAI), and to indicate the intensity of grasshopper damage. Near infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences, and monitoring hatching places by measuring the humidity and nutrients of soil, and can be used to investigate and observe grasshoppers in sample research. According to this paper, the spectral analysis technique could be used as a quick and exact tool in monitoring and forecasting grasshopper infestations, and will become an important means in such research for its advantages in determining spatial orientation and in information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring…
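
    The NDVI mentioned above is computed per pixel from near-infrared (NIR) and red reflectance as (NIR - red) / (NIR + red); vegetated pixels score high because chlorophyll absorbs red light while leaf structure reflects NIR. A minimal sketch, with invented reflectance values:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

# Hypothetical per-pixel reflectances: healthy vegetation scores near +0.7,
# bare or damaged ground near 0.
pixels = [(0.5, 0.1), (0.3, 0.3), (0.45, 0.08)]
ndvi_map = [ndvi(nir, red) for nir, red in pixels]
```

    In a forecasting workflow, a drop in NDVI over rangeland between image dates is one indicator of grasshopper damage intensity.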

  15. Analysis of minor phase with neutron diffraction technique

    International Nuclear Information System (INIS)

    Engkir Sukirman; Herry Mugirahardjo

    2014-01-01

    The presence of minor phases in a sample has been analyzed with the neutron diffraction technique. In this research, a sample of Fe nanoparticles (FNP) was selected as the object of a case study. The first step was to prepare the FNP sample with the ball milling technique; the milled sample was referred to as FIC2. The phases formed in FIC2 were analyzed qualitatively and quantitatively using the high resolution neutron diffraction (HRPD) and X-ray diffraction (XRD) techniques. The diffraction data were analyzed by means of the Rietveld method using the FullProf computer code, with reference to supporting data, namely the particle size and magnetic properties of the materials, obtained from PSA (Particle Size Analyzer) and VSM (Vibrating Sample Magnetometer) measurements, respectively. The analysis shows that the quality of fitting for the neutron diffraction pattern is better than that for the X-ray diffraction pattern. The HRPD data revealed that FIC2 consists of Fe, γ-Fe2O3, and Fe3O4 phases in amounts of 78.62, 21.37, and 0.01%, respectively. The XRD data indicated that FIC2 consists of Fe and γ-Fe2O3 phases in amounts of 99.96 and 0.04%, respectively; the presence of the Fe3O4 phase was not observed. With the neutron diffraction technique, the presence of a minor phase can be determined accurately. (author)

  16. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Data Mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information that may be essential to an organization; the extraction of new information is predicted using existing datasets. Many approaches to analysis and prediction have been developed in data mining, but few efforts have been made in the field of criminology, and fewer still have compared the information these approaches produce. Police stations and other criminal justice agencies hold many large databases of information that can be used to predict or analyze criminal movements and criminal activity involvement in society. Criminals can also be predicted based on the crime data. The main aim of this work is to perform a survey on the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.
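
    As an example of the unsupervised side of such a survey, crime incidents are often clustered into geographic hot spots with k-means. The sketch below is a plain, deterministic k-means (initial centroids are the first k points), not any specific method from the surveyed papers; the coordinates in the example are invented incident locations.

```python
import math

def kmeans(points, k, iters=100):
    """Plain k-means: returns (centroids, labels). Initial centroids are the
    first k points, so results are deterministic for a fixed input order."""
    centroids = [list(p) for p in points[:k]]
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest centroid by Euclidean distance
        for i, p in enumerate(points):
            labels[i] = min(range(k), key=lambda c: math.dist(p, centroids[c]))
        # update step: centroid = mean of its assigned points
        new = []
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                new.append([sum(col) / len(members) for col in zip(*members)])
            else:
                new.append(centroids[c])
        if new == centroids:
            break
        centroids = new
    return centroids, labels

# Hypothetical incident coordinates forming two clear hot spots.
incidents = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
centers, labels = kmeans(incidents, 2)
```

    Supervised techniques in the same survey would instead learn from labeled historical cases; clustering like this needs no labels, which is why it is a common first step on raw incident databases.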

  17. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have proceeded in the past using mathematical analysis that was capable of phenomenological prediction but was not well grounded in physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained using established equations of transport phenomena. Then, microscopic- and molecular-level modeling techniques are described using porous media theory and chemical kinetic theory and applied to cerebrospinal fluid (CSF) dynamics. Using techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.

  18. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS 18 and DEAP 2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale, and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05), except in 2009. Hospital productivity generally showed an increasing trend; however, the overall average productivity decreased, and among the components of total productivity, variation in technological efficiency had the greatest impact on the reduction of the total average productivity.
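
    DEA efficiency scores such as those above are normally obtained by solving a linear program per hospital over multiple inputs and outputs, as the DEAP software does. In the special single-input, single-output CCR case the score reduces to each unit's output/input ratio relative to the best unit, which this illustrative sketch (with invented figures, not the Ahvaz data) computes:

```python
def ccr_efficiency(inputs, outputs):
    """Input-oriented CCR efficiency for the single-input, single-output case:
    each unit's output/input ratio divided by the best ratio in the set."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical hospitals: (staff, treated patients). The first defines the frontier.
eff = ccr_efficiency([2, 4, 8], [4, 4, 8])
```

    A score of 1 marks a frontier hospital; a score of 0.5 means the same output could, in principle, be produced with half the input. The Malmquist index then tracks how these scores move between years.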

  19. Maintenance Audit through Value Analysis Technique: A Case Study

    Science.gov (United States)

    Carnero, M. C.; Delgado, S.

    2008-11-01

    The increase in competitiveness, technological changes, and the increase in the requirements of quality and service have forced a change in the design and application of maintenance, as well as the way in which it is considered within the managerial strategy. There are numerous maintenance activities that must be developed in a service company. As a result, the maintenance functions as a whole have to be outsourced. Nevertheless, delegating this subject to specialized personnel does not exempt the company from responsibility, but rather leads to the need for control of each maintenance activity. In order to achieve this control and to evaluate the efficiency and effectiveness of the company, it is essential to carry out an audit that diagnoses the problems that could develop. In this paper a maintenance audit applied to a service company is developed. The methodology applied is based on expert systems. The expert system uses rules together with the SMART weighting technique and value analysis to obtain the weighting between the decision functions and between the alternatives. The expert system applies numerous rules and relations between different variables associated with the specific maintenance functions to obtain the maintenance state by sections and the general maintenance state of the enterprise. The contributions of this paper are related to the development of a maintenance audit in a service enterprise, in which maintenance is not generally considered a strategic subject, and to the integration of decision-making tools such as the SMART weighting technique with value analysis techniques, typical in the design of new products, in the area of rule-based expert systems.
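
    At its core, the SMART technique referenced above amounts to normalizing the elicited importance weights and taking a weighted sum of each alternative's ratings. A minimal sketch with hypothetical weights and alternatives (the paper's actual criteria and scores are not reproduced here):

```python
def smart_score(weights, ratings):
    """SMART: normalize importance weights to sum to 1, then take the
    weighted sum of each alternative's ratings.

    weights: raw importance weights, one per criterion.
    ratings: dict mapping alternative name -> list of ratings per criterion.
    """
    total = sum(weights)
    w = [wi / total for wi in weights]
    return {alt: sum(wi * ri for wi, ri in zip(w, rs))
            for alt, rs in ratings.items()}

# Hypothetical audit: two criteria weighted 3:1, two maintenance sections rated 0-10.
scores = smart_score([3, 1], {"A": [10, 0], "B": [0, 10]})
```

    Ranking the resulting scores gives the maintenance state by section; the rule base then aggregates sections into the general maintenance state of the enterprise.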

  20. Progress towards an unassisted element identification from Laser Induced Breakdown Spectra with automatic ranking techniques inspired by text retrieval

    International Nuclear Information System (INIS)

    Amato, G.; Cristoforetti, G.; Legnaioli, S.; Lorenzetti, G.; Palleschi, V.; Sorrentino, F.; Tognoni, E.

    2010-01-01

    In this communication, we illustrate an algorithm for automatic element identification in LIBS spectra which takes inspiration from the vector space model applied to text retrieval techniques. The vector space model prescribes that text documents and text queries are represented as vectors of weighted terms (words). Document ranking, with respect to relevance to a query, is obtained by comparing the vectors representing the documents with the vector representing the query. In our case, we represent elements and samples as vectors of weighted peaks, obtained from their spectra. The likelihood of the presence of an element in a sample is computed by comparing the corresponding vectors of weighted peaks. The weight of a peak is proportional to its intensity and to the inverse of the number of peaks, in the database, in its wavelength neighborhood. We suppose we have a database containing the peaks of all elements we want to recognize, where each peak is represented by a wavelength and is associated with its expected relative intensity and the corresponding element. Detection of elements in a sample is obtained by ranking the elements according to the distance of the associated vectors from the vector representing the sample. The application of this approach to element identification using LIBS spectra obtained from several kinds of metallic alloys will also be illustrated. The possible extension of this technique towards an algorithm for fully automated LIBS analysis will be discussed.
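
    The scheme described above maps directly onto a small program: each element (and the sample) becomes a sparse vector of weighted peaks, a peak's weight being its intensity divided by the number of database peaks in its wavelength neighborhood (the IDF analogue), and candidates are ranked by cosine similarity. The sketch below is an illustration of the idea, not the authors' code; the two-element database, tolerance, and sample peaks are invented.

```python
import math

def rank_elements(sample_peaks, database, tol=0.5):
    """Rank candidate elements by cosine similarity between the sample's
    peak vector and each element's reference peak vector.

    Peaks are (wavelength_nm, intensity) pairs. A peak's weight is its
    intensity divided by the number of database peaks within `tol` nm,
    so lines shared by many elements count for less (an IDF analogue).
    """
    all_peaks = [wl for peaks in database.values() for wl, _ in peaks]

    def weight(wl, inten):
        crowd = sum(1 for w in all_peaks if abs(w - wl) <= tol)
        return inten / max(crowd, 1)

    def vectorize(peaks):
        # bin wavelengths to width `tol` so nearby lines share a coordinate
        return {round(wl / tol): weight(wl, inten) for wl, inten in peaks}

    def cosine(u, v):
        dot = sum(u[k] * v[k] for k in u if k in v)
        nu = math.sqrt(sum(x * x for x in u.values()))
        nv = math.sqrt(sum(x * x for x in v.values()))
        return dot / (nu * nv) if nu and nv else 0.0

    sv = vectorize(sample_peaks)
    scores = {el: cosine(sv, vectorize(peaks)) for el, peaks in database.items()}
    return sorted(scores.items(), key=lambda kv: -kv[1])
```

    Ranking by cosine similarity rather than raw peak counts is what lets a weak but distinctive line outvote a strong line shared by many elements, exactly as in text retrieval.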

  1. Limited vs extended face-lift techniques: objective analysis of intraoperative results.

    Science.gov (United States)

    Litner, Jason A; Adamson, Peter A

    2006-01-01

    To compare the intraoperative outcomes of superficial musculoaponeurotic system plication, imbrication, and deep-plane rhytidectomy techniques. Thirty-two patients undergoing primary deep-plane rhytidectomy participated. Each hemiface in all patients was submitted sequentially to 3 progressively more extensive lifts, while other variables were standardized. Four major outcome measures were studied, including the extent of skin redundancy and the repositioning of soft tissues along the malar, mandibular, and cervical vectors of lift. The amount of skin excess was measured without tension from the free edge to a point over the intertragal incisure, along a plane overlying the jawline. Using a soft tissue caliper, repositioning was examined by measurement of preintervention and immediate postintervention distances from dependent points to fixed anthropometric reference points. The mean skin excesses were 10.4, 12.8, and 19.4 mm for the plication, imbrication, and deep-plane lifts, respectively. The greatest absolute soft tissue repositioning was noted along the jawline, with the least in the midface. Analysis revealed significant differences from baseline and between lift types for each of the studied techniques in each of the variables tested. These data support the use of the deep-plane rhytidectomy technique to achieve a superior intraoperative lift relative to comparator techniques.

  2. Progression Analysis and Stage Discovery in Continuous Physiological Processes Using Image Computing

    Directory of Open Access Journals (Sweden)

    Ferrucci Luigi

    2010-01-01

    We propose an image computing-based method for quantitative analysis of continuous physiological processes that can be sensed by medical imaging, and demonstrate its application to the analysis of morphological alterations of the bone structure that correlate with the progression of osteoarthritis (OA). The purpose of the analysis is to quantitatively estimate OA progression in a fashion that can assist in understanding the pathophysiology of the disease. Ultimately, the texture analysis will be able to provide an alternative OA scoring method, which can potentially reflect the progression of the disease more directly than the existing clinically utilized classification schemes based on radiology. This method can be useful not just for studying the nature of OA, but also for developing and testing the effect of drugs and treatments. While in this paper we demonstrate the application of the method to osteoarthritis, its generality makes it suitable for the analysis of other progressive clinical conditions that can be diagnosed and prognosed by using medical imaging.

  3. Evaluation of energy system analysis techniques for identifying underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    VanKuiken, J.C.; Kavicky, J.A.; Portante, E.C.; and others

    1996-03-01

    This report describes the results of a study to determine the feasibility and potential usefulness of applying energy system analysis techniques to help detect and characterize underground facilities that could be used for clandestine activities. Four off-the-shelf energy system modeling tools were considered: (1) ENPEP (Energy and Power Evaluation Program) - a total energy system supply/demand model, (2) ICARUS (Investigation of Costs and Reliability in Utility Systems) - an electric utility system dispatching (or production cost and reliability) model, (3) SMN (Spot Market Network) - an aggregate electric power transmission network model, and (4) PECO/LF (Philadelphia Electric Company/Load Flow) - a detailed electricity load flow model. For the purposes of most of this work, underground facilities were assumed to consume about 500 kW to 3 MW of electricity. For some of the work, facilities as large as 10-20 MW were considered. The analysis of each model was conducted in three stages: data evaluation, base-case analysis, and comparative case analysis. For ENPEP and ICARUS, open source data from Pakistan were used for the evaluations. For SMN and PECO/LF, the country data were not readily available, so data for the state of Arizona were used to test the general concept.

  4. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.

  5. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Directory of Open Access Journals (Sweden)

    Richard E.A. van Emmerik

    2016-03-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
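
    The maximal Lyapunov exponent mentioned above quantifies how quickly nearby trajectories diverge; positive values indicate local instability. For a one-dimensional map with a known derivative it can be estimated directly as the long-run average of log|f'(x)|. The sketch below uses the logistic map as a stand-in; movement data, having no known equations, require trajectory-based estimators (such as Rosenstein's algorithm) instead.

```python
import math

def lyapunov_logistic(r, x0=0.3, n=100000, discard=1000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the long-run average of log|f'(x)| = log|r*(1-2x)|."""
    x = x0
    for _ in range(discard):          # let transients die out
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n):
        # guard against the measure-zero case x = 0.5 (zero derivative)
        total += math.log(max(abs(r * (1 - 2 * x)), 1e-300))
        x = r * x * (1 - x)
    return total / n
```

    At r = 4 the exact exponent is ln 2 ≈ 0.693 (chaotic); at r = 2.5 the orbit settles onto a stable fixed point and the exponent is negative, illustrating the stability/instability distinction the paper draws.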

  6. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and computations have become routine work. With respect to problems in which the time-dependent behavior is significant, it is desirable to incorporate a procedure that works with the mechanical model formulation as well as with the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperatures over long periods of time. Special considerations are crucial if the analysis is to be extended to the large strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  7. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    Fassbender, L.; Gregory, R.; Winterfeldt, D. von; John, R.

    1992-01-01

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments
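Multi-attribute decision analysis of the kind described is often reduced to a weighted additive utility model. The sketch below is a generic illustration of that idea; the alternative names, attributes, and weights are hypothetical, not actual HIPP elicitation data:

```python
def rank_alternatives(alternatives, weights):
    """Rank alternatives by a simple additive multi-attribute utility:
    U(a) = sum_i w_i * u_i(a), with weights summing to 1 and each
    single-attribute utility u_i already scaled to [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    scored = {
        name: sum(weights[attr] * util for attr, util in attrs.items())
        for name, attrs in alternatives.items()
    }
    return sorted(scored.items(), key=lambda kv: kv[1], reverse=True)
```

In practice the weights and single-attribute utilities would be elicited from stakeholders and experts, with uncertainty handled by probability-weighted consequences rather than point values.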

  8. Techniques of DNA methylation analysis with nutritional applications.

    Science.gov (United States)

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from single-nucleotide to global measurement depending on the study goal and scope. In addition, it highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments in epigenetic technologies are showing promising results, resolving DNA methylation levels at single-base resolution and providing the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation, such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profiling by microarray or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the method best fitted to their nutritional research interests. Copyright © 2013 S. Karger AG, Basel.

  9. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection, and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general above that size uncertainties in rate constant and thermodynamic data do not allow a priori prediction of products from mixed hydrocarbon pyrolyses using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  10. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Full Text Available Due to the complexity of the motion, shot put technique is described in phases for easier analysis, easier learning of the technique, and error correction. The technique is continuous, so that in its execution the transition from phase to phase is not noticeable. In the aforementioned and described phases of the O'Brian spinal shot put technique, a large gap and disconnection appear between the initial position phase and the phase of overtaking the device, which in training methods and technique instruction in primary and secondary education, as well as for students and beginner athletes in the shot put, represents a major problem regarding connecting, training and technique advancement. Therefore, this work aims to facilitate the teaching of shot put technique by extending the analysis from four to six phases, which have been described and include the complete O'Brian technique.

  11. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C [University of Washington, Seattle; Abgrall, N. [Lawrence Berkeley National Laboratory (LBNL); Arnquist, I. J. [Pacific Northwest National Laboratory (PNNL); Avignone, III, F. T. [University of South Carolina/Oak Ridge National Laboratory (ORNL); Baldenegro-Barrera, C. X. [Oak Ridge National Laboratory (ORNL); Barabash, A.S. [Institute of Theoretical & Experimental Physics (ITEP), Moscow, Russia; Bertrand, F. E. [Oak Ridge National Laboratory (ORNL); Bradley, A. W. [Lawrence Berkeley National Laboratory (LBNL); Brudanin, V. [Joint Institute for Nuclear Research, Dubna, Russia; Busch, M. [Duke University/TUNL; Buuck, M. [University of Washington, Seattle; Byram, D. [University of South Dakota; Caldwell, A. S. [South Dakota School of Mines and Technology; Chan, Y-D [Lawrence Berkeley National Laboratory (LBNL); Christofferson, C. D. [South Dakota School of Mines and Technology; Detwiler, J. A. [University of Washington, Seattle; Efremenko, Yu. [University of Tennessee, Knoxville (UTK); Ejiri, H. [Osaka University, Japan; Elliott, S. R. [Los Alamos National Laboratory (LANL); Galindo-Uribarri, A. [Oak Ridge National Laboratory (ORNL); Gilliss, T. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Giovanetti, G. K. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Goett, J [Los Alamos National Laboratory (LANL); Green, M. P. [Oak Ridge National Laboratory (ORNL); Gruszko, J [University of Washington, Seattle; Guinn, I S [University of Washington, Seattle; Guiseppe, V E [University of South Carolina, Columbia; Henning, R. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Hoppe, E.W. [Pacific Northwest National Laboratory (PNNL); Howard, S. [South Dakota School of Mines and Technology; Howe, M. A. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Jasinski, B R [University of South Dakota; Keeter, K.J. 
[Black Hills State University, Spearfish, South Dakota; Kidd, M. F. [Tennessee Technological University (TTU); Konovalov, S.I. [Institute of Theoretical & Experimental Physics (ITEP), Moscow, Russia; Kouzes, R. T. [Pacific Northwest National Laboratory (PNNL); LaFerriere, B. D. [Pacific Northwest National Laboratory (PNNL); Leon, J. [University of Washington, Seattle; MacMullin, J. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Martin, R. D. [University of South Dakota; Meijer, S. J. [University of North Carolina / Triangle Universities Nuclear Lababoratory, Durham; Mertens, S. [Lawrence Berkeley National Laboratory (LBNL); Orrell, J. L. [Pacific Northwest National Laboratory (PNNL); O' Shaughnessy, C. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Poon, A.W.P. [Lawrence Berkeley National Laboratory (LBNL); Radford, D. C. [Oak Ridge National Laboratory (ORNL); Rager, J. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Rielage, K. [Los Alamos National Laboratory (LANL); Robertson, R.G.H. [University of Washington, Seattle; Romero-Romero, E. [University of Tennessee, Knoxville, (UTK)/Oak Ridge National Lab (ORNL); Shanks, B. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Shirchenko, M. [Joint Institute for Nuclear Research, Dubna, Russia; Snyder, N [University of South Dakota; Suriano, A. M. [South Dakota School of Mines and Technology; Tedeschi, D [University of South Carolina, Columbia; Trimble, J. E. [Univ. North Carolina-Chapel Hill/Triangle Univ. Nucl. Lab., Durham, NC; Varner, R. L. [Oak Ridge National Laboratory (ORNL); Vasilyev, S. [Joint Institute for Nuclear Research, Dubna, Russia; Vetter, K. [University of California/Lawrence Berkeley National Laboratory (LBNL); et al.

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in Ge-76. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  12. Evaluation of tritium analysis techniques for a continuous tritium monitor

    International Nuclear Information System (INIS)

    Fernandez, S.J.; Girton, R.C.

    1978-04-01

    Present methods for tritium monitoring are evaluated and a program is proposed to modify the existing methods or develop new instrumentation to establish a state-of-the-art monitoring capability for nuclear fuel reprocessing plants. The capabilities, advantages, and disadvantages of the most popular counting and separation techniques are described. The following criteria were used to evaluate present methods: specificity, selectivity, precision, insensitivity to gamma radiation, and economy. A novel approach is explored to continuously separate the tritium from a complex mixture of stack gases. This approach, based on the different permeabilities of the stack gas constituents, is integrated into a complete monitoring system. This monitoring system is designed to perform real time tritium analysis. A schedule is presented for development and demonstration of the completed system

  13. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  14. Advances in zymography techniques and patents regarding protease analysis.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2012-08-01

    Detection of enzymatic activity on gel electrophoresis, namely zymography, is a technique that has received increasing attention in the last 10 years, according to the number of articles published. A growing number of enzymes, mainly proteases, are now routinely detected by zymography. Detailed analytical studies are beginning to be published, and new patents have been developed. This article updates the information covered in our last review, condensing the recent publications dealing with the identification of proteolytic enzymes in electrophoretic gel supports and its variations. The new advances of this method are basically focused on two-dimensional zymography and transfer zymography. Though comparatively fewer patents have been published, they basically coincide in the study of matrix metalloproteases. The tendency is foreseen to be very productive in the area of zymoproteomics, combining electrophoresis and mass spectrometry for the analysis of proteases.

  15. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases the security issue. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze existing threats and security weaknesses. Then we identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis will help us to identify the existing loopholes and will give strategic direction to make the Android operating system more secure.

  16. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  17. A meta-analysis on progressive atrophy in intractable temporal lobe epilepsy: Time is brain?

    Science.gov (United States)

    Caciagli, Lorenzo; Bernasconi, Andrea; Wiebe, Samuel; Koepp, Matthias J; Bernasconi, Neda; Bernhardt, Boris C

    2017-08-01

    It remains unclear whether drug-resistant temporal lobe epilepsy (TLE) is associated with cumulative brain damage, with no expert consensus and no quantitative syntheses of the available evidence. We conducted a systematic review and meta-analysis of MRI studies on progressive atrophy, searching PubMed and Ovid MEDLINE databases for cross-sectional and longitudinal quantitative MRI studies on drug-resistant TLE. We screened 2,976 records and assessed eligibility of 248 full-text articles. Forty-two articles met the inclusion criteria for quantitative evaluation. We observed a predominance of cross-sectional studies, use of different clinical indices of progression, and high heterogeneity in age-control procedures. Meta-analysis of 18 cross-sectional and 1 longitudinal studies on hippocampal atrophy (n = 979 patients) yielded a pooled effect size of r = -0.42 for ipsilateral atrophy related to epilepsy duration (95% confidence interval [CI] -0.51 to -0.32). Analysis of studies addressing atrophy beyond the hippocampus (n = 1,504 patients) indicated that >80% of articles reported duration-related progression in extratemporal cortical and subcortical regions. Detailed analysis of study design features yielded low to moderate levels of evidence for progressive atrophy across studies, mainly due to dominance of cross-sectional over longitudinal investigations, use of diverse measures of seizure estimates, and absence of consistent age-control procedures. While the neuroimaging literature is overall suggestive of progressive atrophy in drug-resistant TLE, published studies have employed rather weak designs to directly demonstrate it. Longitudinal multicohort studies are needed to unequivocally differentiate aging from disease progression. © 2017 American Academy of Neurology.
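Pooling correlation coefficients across studies, as with the reported r = -0.42, is commonly done through Fisher's z transform. The following fixed-effect sketch is illustrative only, with made-up inputs, and is not the study's actual computation or data:

```python
import math

def pool_correlations(rs, ns):
    """Fixed-effect pooling of Pearson correlations via Fisher's z:
    z = atanh(r), weighted by (n - 3); the pooled z and its 95% CI
    are back-transformed to the r scale with tanh."""
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]                      # inverse-variance weights
    z_bar = sum(w * z for w, z in zip(ws, zs)) / sum(ws)
    se = 1.0 / math.sqrt(sum(ws))
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
    return math.tanh(z_bar), (math.tanh(lo), math.tanh(hi))
```

A random-effects model would additionally estimate between-study heterogeneity before weighting; the fixed-effect version above is the minimal case.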

  18. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. 
If inferences are to be made concerning food texture from acoustical measures of mastication

  19. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    2010-11-01

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  20. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining-lifetime management from the very beginning of plant operation. The methodology used in plant remaining-lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs, and calculations such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized in order to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during the inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the greatest possible accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period

  1. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    Full Text Available A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3 and 6 months) and the long-term (12 and 24 months) SPI were estimated, and then possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
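The essence of the ITA can be sketched in a few lines: split the series into two equal halves, sort each, and compare the ranked values; a positive difference at a given rank means the later half exceeds the earlier half for that part of the distribution (a schematic sketch of the method, not the authors' code):

```python
def innovative_trend_analysis(series):
    """ITA sketch: sort the two halves of a series and compare them
    rank by rank. diffs[k] > 0 means the later half exceeds the
    earlier half at rank k (increasing trend for those values)."""
    half = len(series) // 2
    first = sorted(series[:half])
    second = sorted(series[half:2 * half])
    diffs = [b - a for a, b in zip(first, second)]
    return sum(diffs) / half, diffs
```

In the graphical version the sorted pairs are scattered against the 1:1 line, which is what lets the low, medium, and high values of the series be inspected separately.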

  2. Crystallographic texture analysis of archaeological metals: interpretation of manufacturing techniques

    International Nuclear Information System (INIS)

    Artioli, G.

    2007-01-01

    Neutron probes and high energy X-rays are sources of primary importance for the non-invasive characterization of materials related to cultural heritage. Their employment in the characterization of archaeological metal objects, combined with the recent instrumental and computational developments in the field of crystallographic texture analysis (CTA) from diffraction data proves to be a powerful tool for the interpretation of ancient metal working techniques. Diffraction based CTA, when performed using penetrating probes and adequate detector coverage of reciprocal space, for example using large detector arrays and/or ToF mode, allows simultaneous identification and quantification of crystalline phases, besides the microstructural and textural characterization of the object, and it can be effectively used as a totally non-invasive tool for metallographic analysis. Furthermore, the chemical composition of the object may also be obtained by the simultaneous detection of prompt gamma rays induced by neutron activation, or by the fluorescence signal from high energy X-rays, in order to obtain a large amount of complementary information in a single experiment. The specific application of neutron CTA to the characterization of the manufacturing processes of prehistoric copper axes is discussed in detail. (orig.)

  3. Crystallographic texture analysis of archaeological metals: interpretation of manufacturing techniques

    Science.gov (United States)

    Artioli, G.

    2007-12-01

    Neutron probes and high energy X-rays are sources of primary importance for the non-invasive characterization of materials related to cultural heritage. Their employment in the characterization of archaeological metal objects, combined with the recent instrumental and computational developments in the field of crystallographic texture analysis (CTA) from diffraction data proves to be a powerful tool for the interpretation of ancient metal working techniques. Diffraction based CTA, when performed using penetrating probes and adequate detector coverage of reciprocal space, for example using large detector arrays and/or ToF mode, allows simultaneous identification and quantification of crystalline phases, besides the microstructural and textural characterization of the object, and it can be effectively used as a totally non-invasive tool for metallographic analysis. Furthermore, the chemical composition of the object may also be obtained by the simultaneous detection of prompt gamma rays induced by neutron activation, or by the fluorescence signal from high energy X-rays, in order to obtain a large amount of complementary information in a single experiment. The specific application of neutron CTA to the characterization of the manufacturing processes of prehistoric copper axes is discussed in detail.

  4. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique usable for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, cokriging, which is an extension of kriging, is proposed to calculate the structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
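A minimal simple-kriging interpolator conveys the basic idea; cokriging extends the same linear system with gradient (secondary) information. The Gaussian covariance, zero-mean assumption, and parameter values below are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

def simple_kriging(x_train, y_train, x_query, length_scale=1.0, nugget=1e-10):
    """Minimal 1-D simple-kriging interpolator with Gaussian covariance
    k(a, b) = exp(-(a-b)^2 / (2*l^2)) and a small nugget for numerical
    stability. Predictions are weighted sums of the training responses."""
    x_train = np.asarray(x_train, float)
    y_train = np.asarray(y_train, float)

    def k(a, b):
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length_scale**2)

    K = k(x_train, x_train) + nugget * np.eye(len(x_train))
    weights = np.linalg.solve(K, k(x_train, np.asarray(x_query, float)))
    return weights.T @ y_train
```

Because the model interpolates, predictions at the sample points reproduce the observed responses, which is why an explicit surrogate like this can stand in for an implicit limit state function.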

  5. Analysis of Down syndrome with molecular techniques for future diagnoses

    Directory of Open Access Journals (Sweden)

    May Salem Al-Nbaheen

    2018-03-01

    Full Text Available Down syndrome (DS) is a genetic disorder that arises from trisomy of chromosome 21, in the G group of the acrocentric region. DS is also described as showing non-Mendelian inheritance, since it does not follow Mendel's laws. The disorder in children is identified through clinical symptoms and chromosomal analysis, and to date there are no biochemical or molecular analyses. Presently, whole exome sequencing (WES) has contributed greatly to identifying new disease-causing genes and represents a significant breakthrough in the field of human genetics; this technique uses high-throughput sequencing technologies to determine the arrangement of the DNA base pairs specifying the protein-coding regions of an individual's genome. Apart from this, next-generation sequencing and whole genome sequencing also contribute to identifying disease markers. From this review, the suggestion is to perform WES in DS children to identify the marker region. Keywords: Down syndrome, Exome sequencing, Chromosomal analysis, Genes, Genetics

  6. A review of signal processing techniques for heart sound analysis in clinical diagnosis.

    Science.gov (United States)

    Emmanuel, Babatunde S

    2012-08-01

    This paper presents an overview of approaches to the analysis of heart sound signals. It reviews the milestones in the development of phonocardiogram (PCG) signal analysis, describes the various stages involved in the analysis of heart sounds, and discusses the discrete wavelet transform as a preferred method for bio-signal processing. In addition, the gaps that still exist between contemporary methods of heart sound signal analysis and their application to clinical diagnosis are reviewed. A lot of progress has been made, but crucial gaps remain. The findings of this review are as follows: there is a lack of consensus in research outputs; inter-patient adaptability of signal processing algorithms is still problematic; the process of clinical validation of analysis techniques was not sufficiently rigorous in much of the reviewed literature; and data integrity and measurement are therefore still in doubt, which often leads to inaccurate interpretation of results. In addition, the existing diagnostic systems are too complex and expensive. The paper concludes that the ability to correctly acquire, analyse and interpret heart sound signals for improved clinical diagnostic processes has become a priority.
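
    The discrete wavelet transform step discussed above can be sketched with the Haar wavelet, the simplest orthonormal case (the paper does not prescribe a specific mother wavelet, and the "heart sound" below is a synthetic stand-in):

```python
import numpy as np

def haar_dwt(signal, levels):
    """Multi-level Haar discrete wavelet decomposition (orthonormal)."""
    approx = np.asarray(signal, dtype=float)
    details = []
    for _ in range(levels):
        a = (approx[0::2] + approx[1::2]) / np.sqrt(2.0)  # low-pass half-band
        d = (approx[0::2] - approx[1::2]) / np.sqrt(2.0)  # high-pass half-band
        details.append(d)
        approx = a
    return approx, details

# Synthetic "heart sound": a low-frequency component plus one sharp click
fs = 1024
t = np.arange(fs) / fs
x = np.sin(2 * np.pi * 30 * t)
x[500] += 2.0  # single-sample transient

approx, details = haar_dwt(x, levels=3)
# The transient stands out in the finest-scale detail coefficients,
# near index 500 // 2 = 250, while the slow oscillation stays in `approx`.
```

Because the transform is orthonormal, the signal energy is split exactly between the approximation and the detail bands, which is what makes band-wise feature extraction from PCG signals well posed.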

  7. Flexible Multibody Dynamics Finite Element Formulation Applied to Structural Progressive Collapse Analysis

    Directory of Open Access Journals (Sweden)

    Rodolfo André Kuche Sanches

    Full Text Available Abstract This paper presents a two-dimensional frame finite element methodology for flexible multibody dynamic systems and applies it to building progressive collapse analysis. The proposed methodology employs a frame element with Timoshenko kinematics, and the dynamic governing equation is solved based on the stationary potential energy theorem, written in terms of nodal positions and generalized vector components instead of displacements and rotations. The bodies are discretized by loose finite elements, which are assembled with Lagrange multipliers in order to make dynamical detachment possible. Due to the absence of rotations, the time integration is carried out by the classical Newmark algorithm, which proves to be stable for the position-based formulation. The accuracy of the proposed formulation is verified with simple examples, and its capabilities for progressive collapse analysis are demonstrated in a more complete building analysis.
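
    The classical Newmark time integration mentioned above can be sketched for a single degree of freedom. The average-acceleration parameters (beta = 1/4, gamma = 1/2) give the unconditionally stable member of the family; this toy example is the standard displacement form, not the paper's position-based frame formulation.

```python
import numpy as np

def newmark(m, c, k, f, dt, x0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Newmark time integration for m*x'' + c*x' + k*x = f(t), single DOF.
    beta=1/4, gamma=1/2 is the unconditionally stable average-acceleration rule."""
    n = len(f)
    x, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    x[0], v[0] = x0, v0
    a[0] = (f[0] - c * v0 - k * x0) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt ** 2)
    for i in range(n - 1):
        # Effective load from the current state
        peff = (f[i + 1]
                + m * (x[i] / (beta * dt ** 2) + v[i] / (beta * dt)
                       + (0.5 / beta - 1.0) * a[i])
                + c * (gamma * x[i] / (beta * dt) + (gamma / beta - 1.0) * v[i]
                       + dt * (0.5 * gamma / beta - 1.0) * a[i]))
        x[i + 1] = peff / keff
        a[i + 1] = ((x[i + 1] - x[i]) / (beta * dt ** 2)
                    - v[i] / (beta * dt) - (0.5 / beta - 1.0) * a[i])
        v[i + 1] = v[i] + dt * ((1.0 - gamma) * a[i] + gamma * a[i + 1])
    return x, v, a

# Free vibration of an undamped oscillator with natural period 1 s
w = 2.0 * np.pi
x, v, a = newmark(m=1.0, c=0.0, k=w * w, f=np.zeros(1001), dt=0.001, x0=1.0)
print(x[-1])  # after one full period the response returns close to 1.0
```

For the undamped free vibration the average-acceleration rule preserves amplitude, showing only a tiny period elongation, which is why it is the default choice for structural dynamics.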

  8. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. 
The results showed that

  9. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    Full Text Available The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks process each hemodynamic parameter individually and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from the APW. With this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used for data collection from 50 subjects; (2) the acquired position and amplitude of the onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of several morphological attributes; (3) pre-processing work on the datasets was performed in order to reduce the number of input features and increase model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) the trained models were evaluated, using a majority-voting system, against the respective calculated Augmentation Index (AIx). The classification algorithms proved efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the curve (AUC) of the Receiver Operating Characteristic (ROC) curve (0.961). Finally, during validation tests, a correlation between high-risk labels retrieved from the multi-parametric approach and positive AIx values was verified. This approach allows the design of new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving arterial pulse understanding, especially when compared to traditional single-parameter analysis, where failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation.
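
    Step (5), fusing the four classifiers by majority voting, can be sketched with invented label vectors (the predictions below are hypothetical examples, not data from the study):

```python
from collections import Counter

def majority_vote(predictions):
    """Fuse per-classifier label lists by majority vote. Ties resolve to the
    label seen first among the classifiers (Counter insertion order)."""
    fused = []
    for votes in zip(*predictions):
        fused.append(Counter(votes).most_common(1)[0][0])
    return fused

# Hypothetical outputs of the four classifiers for five APW samples
rf     = ["low",  "high", "high", "low",  "high"]
bayes  = ["low",  "low",  "high", "low",  "high"]
j48    = ["high", "high", "low",  "low",  "high"]
ripper = ["low",  "high", "high", "high", "high"]

print(majority_vote([rf, bayes, j48, ripper]))  # ['low', 'high', 'high', 'low', 'high']
```

With an even number of voters a tie-breaking rule is needed; a weighted vote (e.g. by per-classifier AUC) is a common refinement.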

  10. A dynamic mechanical analysis technique for porous media.

    Science.gov (United States)

    Pattison, Adam Jeffry; McGarry, Matthew; Weaver, John B; Paulsen, Keith D

    2015-02-01

    Dynamic mechanical analysis (DMA) is a common way to measure the mechanical properties of materials as functions of frequency. Traditionally, a viscoelastic mechanical model is applied, and current DMA techniques fit an analytical approximation to measured dynamic motion data by neglecting inertial forces and adding empirical correction factors to account for transverse boundary displacements. Here, a finite-element (FE) approach to processing DMA data was developed to estimate poroelastic material properties. Frequency-dependent inertial forces, which are significant in soft media and often neglected in DMA, were included in the FE model. The technique applies a constitutive relation to the DMA measurements and exploits a nonlinear inversion to estimate the material properties in the model that best fit the model response to the DMA data. A viscoelastic version of this approach was developed to validate the method by comparing complex modulus estimates to the direct DMA results. Both analytical and FE poroelastic models were also developed to explore their behavior in the DMA testing environment. All of the models were applied to tofu as a representative soft poroelastic material that is a common phantom in elastography imaging studies. Five samples of three different stiffnesses were tested from 1 to 14 Hz, with rough platens placed on the top and bottom surfaces of the material specimen under test to restrict transverse displacements and promote fluid-solid interaction. The viscoelastic models were identical in the static case and nearly the same at frequency, with inertial forces accounting for some of the discrepancy. The poroelastic analytical method was not sufficient when the relevant physical boundary constraints were applied, whereas the poroelastic FE approach produced high-quality estimates of shear modulus and hydraulic conductivity. These results illustrated appropriate shear modulus contrast between tofu samples and yielded a consistent contrast in
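
    The frequency-domain step common to these DMA models, estimating storage and loss moduli from steady-state stress and strain records, can be sketched as follows. The signals and numbers are synthetic and illustrative, not the paper's FE inversion:

```python
import numpy as np

def complex_modulus(strain, stress, freq, fs):
    """Storage (E') and loss (E'') moduli from steady-state sinusoidal DMA
    records, via projection onto the drive frequency. The record should span
    an integer number of cycles for the projection to be exact."""
    n = len(strain)
    t = np.arange(n) / fs
    ref = np.exp(-2j * np.pi * freq * t)
    eps = 2.0 / n * np.sum(strain * ref)  # complex strain amplitude
    sig = 2.0 / n * np.sum(stress * ref)  # complex stress amplitude
    E = sig / eps                          # complex modulus E* = E' + i E''
    return E.real, E.imag

fs, f = 1000, 5.0
t = np.arange(2000) / fs  # 2 s record = 10 full cycles at 5 Hz
strain = 0.01 * np.sin(2 * np.pi * f * t)
stress = 500.0 * np.sin(2 * np.pi * f * t + 0.1)  # 0.1 rad loss angle

E_store, E_loss = complex_modulus(strain, stress, f, fs)
print(E_store, E_loss)  # |E*| = 50000 (arbitrary units), loss angle 0.1 rad
```

The loss tangent recovered as E''/E' equals tan(0.1) here; the paper's point is that inertia and boundary effects make this simple ratio insufficient for soft porous media, motivating the FE inversion.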

  11. Data analysis techniques: a tool for cumulative exposure assessment.

    Science.gov (United States)

    Lalloué, Benoît; Monnez, Jean-Marie; Padilla, Cindy; Kihal, Wahida; Zmirou-Navier, Denis; Deguen, Séverine

    2015-01-01

    Everyone is subject to environmental exposures from various sources, with negative health impacts (air, water and soil contamination, noise, etc.) or with positive effects (e.g. green space). Studies considering such complex environmental settings in a global manner are rare. We propose to use statistical factor and cluster analyses to create a composite exposure index with a data-driven approach, in order to assess the environmental burden experienced by populations. We illustrate this approach in a large French metropolitan area. The study was carried out in the Great Lyon area (France, 1.2 M inhabitants) at the census Block Group (BG) scale. As environmental indicators we used ambient air NO2 annual concentrations, noise levels, and proximity to green spaces, industrial plants, polluted sites and road traffic. They were synthesized using Multiple Factor Analysis (MFA), a data-driven technique without a priori modeling, followed by Hierarchical Clustering to create BG classes. The first components of the MFA explained, respectively, 30, 14, 11 and 9% of the total variance. Clustering into five classes grouped: (1) a particular type of large BGs without population; (2) BGs of green residential areas, with fewer negative exposures than average; (3) BGs of residential areas near midtown; (4) BGs close to industries; and (5) midtown urban BGs, with more negative exposures than average and fewer green spaces. Other numbers of classes were tested in order to assess a variety of clusterings. We present an approach using statistical factor and cluster analysis techniques, which seem to be overlooked for assessing cumulative exposure in complex environmental settings. Although it cannot be applied directly for risk or health effect assessment, the resulting index can help to identify hot spots of cumulative exposure, to prioritize urban policies, or to compare the environmental burden across study areas in an epidemiological framework.
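
    A single-table stand-in for the factor-analysis step can be sketched with plain PCA (MFA generalizes PCA to grouped variables, so this is a simplification). The three indicators and all numbers below are synthetic, invented for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical block-group indicators: NO2, noise, distance to green space
n = 200
no2   = rng.normal(40, 8, n)
noise = 0.8 * no2 + rng.normal(0, 5, n)  # noise correlates with NO2 (traffic)
green = rng.normal(300, 90, n)           # roughly independent of the others

X = np.column_stack([no2, noise, green])
Z = (X - X.mean(0)) / X.std(0)                   # standardize each indicator
evals, evecs = np.linalg.eigh(np.corrcoef(Z.T))  # PCA of the correlation matrix
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

index = Z @ evecs[:, 0]         # composite exposure index (first component)
explained = evals / evals.sum() # share of variance per component
```

The correlated NO2/noise pair dominates the first component, so the resulting index tracks the shared traffic-related burden; in the real pipeline, hierarchical clustering on the factor scores then yields the BG classes.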

  12. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs. time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (~0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion as a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF-related targets (polycrystalline diamond, Be), and in Si and Al.
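
    The core reduction from fringe phase to velocity, where one fringe (2*pi of phase) corresponds to one velocity-per-fringe (VPF) unit, can be sketched on a synthetic smooth phase map. The VPF value and the bump shape below are invented for illustration; real 2d-VISAR analysis must first extract the wrapped phase from the recorded fringe pattern.

```python
import numpy as np

def velocity_map(wrapped_phase, vpf):
    """Convert a wrapped 2-D fringe-phase map to velocity: one fringe
    (2*pi rad) corresponds to vpf (velocity per fringe, m/s)."""
    phase = np.unwrap(wrapped_phase, axis=1)  # undo 2*pi wraps along each row
    return vpf * phase / (2 * np.pi)

# Synthetic smooth "shock" phase bump peaking at 1.5 fringes
y, x = np.mgrid[0:64, 0:64]
phase = 3 * np.pi * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / (2 * 8.0 ** 2))
wrapped = np.angle(np.exp(1j * phase))  # the detector sees phase only modulo 2*pi

vpf = 400.0  # hypothetical velocity-per-fringe constant, m/s
v = velocity_map(wrapped, vpf)
print(v.max())  # peak velocity = 1.5 fringes * 400 m/s = 600 m/s
```

Row-wise unwrapping only succeeds while the per-pixel phase step stays below pi, which is why steep shock fronts require finer imaging or more robust 2-D unwrapping.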

  13. Analysis of Muscle Fatigue Progression using Cyclostationary Property of Surface Electromyography Signals.

    Science.gov (United States)

    Karthick, P A; Venugopal, G; Ramakrishnan, S

    2016-01-01

    Analysis of neuromuscular fatigue finds various applications ranging from clinical studies to biomechanics. Surface electromyography (sEMG) signals are widely used for these studies due to their non-invasiveness. During cyclic dynamic contractions, these signals are nonstationary and cyclostationary. In recent years, several nonstationary methods have been employed for muscle fatigue analysis; however, cyclostationarity-based approaches are not well established for the assessment of muscle fatigue. In this work, the cyclostationarity associated with biceps brachii muscle fatigue progression is analyzed using sEMG signals and Spectral Correlation Density (SCD) functions. Signals were recorded from fifty healthy adult volunteers during dynamic contractions under a prescribed protocol. These signals are preprocessed and divided into three segments, namely non-fatigue, first muscle discomfort and fatigue zones. The SCD is then estimated using the fast Fourier transform accumulation method, and the Cyclic Frequency Spectral Density (CFSD) is calculated from the SCD spectrum. Two features, namely cyclic frequency spectral area (CFSA) and cyclic frequency spectral entropy (CFSE), are proposed to study the progression of muscle fatigue. Additionally, the degree of cyclostationarity (DCS) is computed to quantify the amount of cyclostationarity present in the signals. Results show a progressive increase in cyclostationarity during the progression of muscle fatigue. CFSA shows an increasing trend during fatiguing contraction, whereas CFSE shows a decreasing trend. It is observed that when the muscle progresses from the non-fatigue to the fatigue condition, the mean DCS of the fifty subjects increases from 0.016 to 0.99. All the extracted features were found to be distinct and statistically significant across the three zones of muscle contraction (p < 0.05). It appears that these SCD features could be useful in the automated analysis of sEMG signals for different neuromuscular conditions.
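
    The cyclic-frequency idea can be illustrated with the cyclic autocorrelation at lag zero: for amplitude-modulated noise (a crude stand-in for sEMG under cyclic contraction), it peaks at the modulation (cyclic) frequency. This is a toy version of the paper's SCD/CFSD pipeline, with invented parameters:

```python
import numpy as np

def cyclic_strength(x, fs, alpha):
    """|Cyclic autocorrelation| of x at lag zero and cyclic frequency alpha (Hz)."""
    t = np.arange(len(x)) / fs
    return np.abs(np.mean(x * x * np.exp(-2j * np.pi * alpha * t)))

fs = 1000
t = np.arange(4000) / fs  # 4 s record
noise = np.random.default_rng(2).standard_normal(len(t))
x = (1.0 + 0.8 * np.cos(2 * np.pi * 2.0 * t)) * noise  # 2 Hz cyclic modulation

s_mod = cyclic_strength(x, fs, 2.0)        # at the true cyclic frequency
s_off = cyclic_strength(x, fs, 7.3)        # at an arbitrary off frequency
dcs = s_mod / cyclic_strength(x, fs, 0.0)  # crude degree of cyclostationarity
```

A stationary signal would give near-zero strength at every nonzero cyclic frequency; the growth of such ratios with fatigue is exactly what the DCS feature quantifies.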

  14. Assessing Progress towards Public Health, Human Rights, and International Development Goals Using Frontier Analysis.

    Science.gov (United States)

    Luh, Jeanne; Cronk, Ryan; Bartram, Jamie

    2016-01-01

    Indicators to measure progress towards achieving public health, human rights, and international development targets, such as 100% access to improved drinking water or zero maternal mortality ratio, generally focus on status (i.e., level of attainment or coverage) or trends in status (i.e., rates of change). However, these indicators do not account for the different levels of development that countries experience, making it difficult to compare progress between countries. We describe a recently developed application of frontier analysis and apply this method to calculate country performance indices in three areas: maternal mortality ratio, poverty headcount ratio, and primary school completion rate. Frontier analysis is used to identify the maximum achievable rates of change, defined by the historically best-performing countries, as a function of coverage level. Performance indices are calculated by comparing a country's rate of change against the maximum achievable rate at the same coverage level. A country's performance can be positive or negative, corresponding to progression or regression, respectively. The calculated performance indices allow countries to be compared against each other regardless of whether they have only begun to make progress or have almost achieved the target. This paper is the first to use frontier analysis to determine the maximum achievable rates as a function of coverage level and to calculate performance indices for public health, human rights, and international development indicators. The method can be applied to multiple fields and settings, for example health targets such as smoking cessation or specific vaccine immunizations, and offers both a new approach to analyzing existing data and a new data source for consideration when assessing progress achieved.
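
    The performance-index construction, a country's observed rate of change divided by the frontier's maximum achievable rate at the same coverage level, can be sketched with an invented frontier function (the real frontier is estimated from the historically best-performing countries):

```python
def performance_index(coverage, rate, frontier):
    """Observed rate of change divided by the maximum achievable rate
    (the frontier) at the same coverage level."""
    return rate / frontier(coverage)

# Invented frontier: historically best progress slows as coverage nears 100%
def frontier(coverage):
    return 4.0 * (1.0 - coverage / 100.0) + 0.2  # percentage points per year

print(performance_index(50.0, 1.1, frontier))   # positive: the country is progressing
print(performance_index(90.0, -0.2, frontier))  # negative: the country is regressing
```

Normalizing by the coverage-dependent frontier is what makes a country at 90% coverage comparable to one at 50%, despite the latter having far more room to improve.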

  15. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry should be valuable for achieving Space Situational Awareness (SSA), these new consumers of satellite telemetry data will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure that processes satellite telemetry into higher-abstraction-level symbolic space situational awareness, and to populate that infrastructure initially with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of which control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design that has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach combining the symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current machine learning approaches. BTNs are used to represent the process and associated formulas for checking telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that describe an investigative process to be applied to the telemetry in certain circumstances.

  16. User-defined Material Model for Thermo-mechanical Progressive Failure Analysis

    Science.gov (United States)

    Knight, Norman F., Jr.

    2008-01-01

    Previously, a user-defined material model for orthotropic bimodulus materials was developed for linear and nonlinear stress analysis of composite structures using either shell or solid finite elements within a nonlinear finite element analysis tool. Extensions of this user-defined material model to thermo-mechanical progressive failure analysis are described, and the required input data are documented. The extensions include providing for temperature-dependent material properties, archival of the elastic strains, and a thermal strain calculation for materials exhibiting a stress-free temperature.
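
    Per material direction, the thermal strain extension described above reduces to eps_th = alpha(T) * (T - T_sf), with the coefficient of thermal expansion interpolated from a temperature-dependent property table. A sketch with hypothetical table values (not the report's actual material data):

```python
def interp(table, T):
    """Piecewise-linear interpolation in a (temperature, value) property
    table, clamped at the table ends."""
    (T0, v0), *rest = table
    if T <= T0:
        return v0
    for T1, v1 in rest:
        if T <= T1:
            return v0 + (v1 - v0) * (T - T0) / (T1 - T0)
        T0, v0 = T1, v1
    return v0

# Hypothetical temperature-dependent CTE for one material direction (1/degC)
alpha_table = [(20.0, 2.0e-6), (120.0, 2.6e-6), (220.0, 3.4e-6)]

def thermal_strain(T, T_stress_free=180.0):
    """Thermal strain relative to the stress-free (cure) temperature."""
    return interp(alpha_table, T) * (T - T_stress_free)

print(thermal_strain(70.0))  # negative: cooling below the stress-free temperature
```

In a user-defined material routine this evaluation would run at every integration point, with a separate table per orthotropic direction.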

  17. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT): Semi-Annual Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, D N

    2012-02-29

    This report summarizes work carried out by the Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Team for the period of July 1, 2011 through December 31, 2011. It discusses highlights, overall progress, period goals, and collaborations and lists papers and presentations. The UV-CDAT team is positioned to address the following high-level visualization requirements: (1) Alternative parallel streaming statistics and analysis pipelines - Data parallelism, Task parallelism, Visualization parallelism; (2) Optimized parallel input/output (I/O); (3) Remote interactive execution; (4) Advanced intercomparison visualization; (5) Data provenance processing and capture; and (6) Interfaces for scientists - Workflow data analysis and visualization construction tools, Visualization interfaces.

  18. Tracking progress towards global drinking water and sanitation targets: A within and among country analysis.

    Science.gov (United States)

    Fuller, James A; Goldstick, Jason; Bartram, Jamie; Eisenberg, Joseph N S

    2016-01-15

    Global access to safe drinking water and sanitation has improved dramatically during the Millennium Development Goal (MDG) period. However, there is substantial heterogeneity in progress between countries and inequality within countries. We assessed countries' temporal patterns in access to drinking water and sanitation using publicly available data. We then classified countries, using non-linear modeling techniques, as having one of the following trajectories: 100% coverage, linear growth, linear decline, no change, saturation, acceleration, deceleration, negative acceleration, or negative deceleration. We further assessed the degree to which temporal profiles follow a sigmoidal pattern and how these patterns might vary within a given country between rural and urban settings. Among countries with more than 10 data points, between 15% and 38% showed a non-linear trajectory, depending on the indicator. Overall, countries' progress followed a sigmoidal trend, but some countries are making better progress and some worse progress than would be expected. We highlight several countries that are not on track to meet the MDG for water or sanitation, but whose access is accelerating, suggesting better performance during the coming years. Conversely, we also highlight several countries that have made sufficient progress to meet the MDG target, but in which access is decelerating. Patterns were heterogeneous and non-linearity was common. Characterization of these heterogeneous patterns will help policy makers allocate resources more effectively. For example, policy makers can identify countries that could make use of additional resources or might need additional institutional capacity development to properly manage resources; this will be essential to meet the forthcoming Sustainable Development Goals.
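
    The trajectory labelling can be caricatured with a quadratic least-squares fit: the linear term separates growth from decline, and the quadratic term separates acceleration from deceleration. This covers only a subset of the paper's categories, and the threshold value is invented:

```python
import numpy as np

def classify_trajectory(years, coverage, tol=0.05):
    """Crude trajectory label from a quadratic least-squares fit.
    tol is a flatness threshold in percentage points per year (invented)."""
    curv, slope, _ = np.polyfit(years - years.mean(), coverage, 2)
    if abs(slope) < tol and abs(curv) < tol:
        return "no change"
    if abs(curv) < tol:
        return "linear growth" if slope > 0 else "linear decline"
    if slope > 0:
        return "acceleration" if curv > 0 else "deceleration"
    return "negative acceleration" if curv < 0 else "negative deceleration"

years = np.arange(2000, 2013, dtype=float)
print(classify_trajectory(years, 40 + 2.0 * (years - 2000)))       # linear growth
print(classify_trajectory(years, 40 + 0.3 * (years - 2000) ** 2))  # acceleration
```

The saturation and 100%-coverage categories need the sigmoidal fit the paper actually uses, since a quadratic cannot flatten out at an asymptote.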

  19. Electromechanical actuators affected by multiple failures: Prognostic method based on spectral analysis techniques

    Science.gov (United States)

    Belmonte, D.; Vedova, M. D. L. Dalla; Ferro, C.; Maggiore, P.

    2017-06-01

    The proposal of prognostic algorithms able to identify precursors of incipient failures of primary flight command electromechanical actuators (EMAs) is beneficial for anticipating the incoming failure: an early and correct interpretation of the failure degradation pattern can trigger an early alert to the maintenance crew, who can properly schedule the servomechanism replacement. An innovative prognostic model-based approach, able to recognize progressive EMA degradation before its anomalous behavior becomes critical, is proposed: Fault Detection and Identification (FDI) of the considered incipient failures is performed by analyzing proper system operational parameters, able to reveal the corresponding degradation path, by means of a numerical algorithm based on spectral analysis techniques. Subsequently, these operational parameters are correlated with the actual EMA health condition by means of failure maps created by a reference model-based monitoring algorithm. In this work, the proposed method has been tested on an EMA affected by combined progressive failures: in particular, a partial stator single-phase turn-to-turn short circuit and rotor static eccentricity are considered. In order to evaluate the prognostic method, a numerical test bench has been conceived. Results show that the method exhibits adequate robustness and a high degree of confidence in its ability to identify a possible malfunction early, minimizing the risk of false alarms or unannounced failures.
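
    The spectral-analysis step, tracking the amplitude of a fault-related harmonic relative to the fundamental, can be sketched with synthetic signals. The 100 Hz fault signature and its size are invented for illustration; the paper derives its own failure maps from the EMA model:

```python
import numpy as np

def harmonic_amplitude(signal, fs, f0):
    """Amplitude of the spectral line nearest f0, from an FFT of the record."""
    n = len(signal)
    spec = np.abs(np.fft.rfft(signal)) * 2.0 / n
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    return spec[np.argmin(np.abs(freqs - f0))]

fs = 2000
t = np.arange(fs) / fs  # 1 s record, 1 Hz frequency resolution
healthy = np.sin(2 * np.pi * 50 * t)  # fundamental only
# Hypothetical degradation signature: a growing 100 Hz component, standing in
# for the effect of a partial turn-to-turn short circuit
faulty = healthy + 0.15 * np.sin(2 * np.pi * 100 * t)

ratio_h = harmonic_amplitude(healthy, fs, 100.0) / harmonic_amplitude(healthy, fs, 50.0)
ratio_f = harmonic_amplitude(faulty, fs, 100.0) / harmonic_amplitude(faulty, fs, 50.0)
print(ratio_h, ratio_f)  # the ratio jumps once the fault component appears
```

Trending such a harmonic ratio over successive records is a simple way to expose a monotone degradation path without raising alarms on the healthy baseline.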

  20. Small Renal Masses Progressing to Metastases under Active Surveillance: A Systematic Review and Pooled Analysis

    Science.gov (United States)

    Smaldone, Marc C.; Kutikov, Alexander; Egleston, Brian L.; Canter, Daniel J.; Viterbo, Rosalia; Chen, David Y.T.; Jewett, Michael A.; Greenberg, Richard E.; Uzzo, Robert G.

    2012-01-01

    Purpose We conducted a systematic review and pooled analysis of small renal masses under active surveillance to identify progression risk and characteristics associated with metastases. Materials and Methods A MEDLINE search was performed to identify all clinical series reporting surveillance of localized renal masses. For studies reporting individual level data, clinical and radiographic characteristics of tumors without progression were compared to those progressing to metastases. Results 18 series (880 patients, 936 masses) met screening criteria from which 18 patients progressing to metastasis were identified (mean 40.2 months). Six studies (259 patients, 284 masses) provided individual level data for pooled analysis. With a mean follow up of 33.5±22.6 months, mean initial tumor diameter was 2.3±1.3 cm and mean linear growth rate was 0.31±0.38 cm/year. 65 masses (23%) exhibited zero net growth under surveillance; of which none progressed to metastasis. Pooled analysis revealed increased age (75.1±9.1 vs. 66.6±12.3 years, p=0.03), initial tumor diameter (4.1±2.1 vs. 2.3±1.3 cm, p<0.0001), initial estimated tumor volume (66.3±100.0 vs. 15.1±60.3 cm3, p<0.0001), linear growth rate (0.8±0.65 vs. 0.3±0.4 cm/yr, p=0.0001), and volumetric growth rate (27.1±24.9 vs. 6.2±27.5 cm3/yr, p<0.0001) in the progression cohort. Conclusions A substantial proportion of small renal masses remain radiographically static following an initial period of active surveillance. Progression to metastases occurs in a small percentage of patients and is generally a late event. These results indicate that in patients with competing health risks, radiographic surveillance may be an acceptable initial approach with delayed intervention reserved for those exhibiting significant linear or volumetric growth. PMID:21766302
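
    The cohort comparisons reported above (e.g. linear growth rate in progressing vs. non-progressing masses) are two-sample mean comparisons; a Welch t-statistic on invented growth-rate data sketches the calculation (the numbers below are illustrative, not the pooled study data):

```python
import math

def welch_t(a, b):
    """Welch's t statistic for two independent samples with unequal variances."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)  # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Invented linear growth rates (cm/yr): progressing vs. non-progressing masses
progressing = [0.9, 0.7, 1.1, 0.6, 0.8]
stable = [0.2, 0.4, 0.1, 0.3, 0.5, 0.2, 0.4]

print(welch_t(progressing, stable))  # large t: the group means differ clearly
```

The p-values in the abstract come from the tail probability of such a statistic under the appropriate degrees of freedom (Welch-Satterthwaite), which a statistics library would supply.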

  1. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the origin of the constituting material, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of funeral artifacts in 18 graves among a total of 127 excavated. Even though the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged, respectively, to a child (one individual, medium level of preservation, aged 9 +/- 3 months) and to an adult (one individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument, offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  2. Trends in grazing emission x-ray analysis techniques

    International Nuclear Information System (INIS)

    Grieken, R. van; Tsuji, K.; Injuk, J.

    2000-01-01

    then, the detection limits imposed by the semiconductor industry roadmap can probably not be obtained by tube-excited GE-XRF. The perspectives for tube-excited GE-XRF are thus rather poor. Future developments imply the combination of GE-XRF with synchrotron radiation excitation. Grazing-emission particle-induced X-ray emission (GE-PIXE) suffers from similar quantification problems for material deposited on a carrier, but it makes PIXE a surface-sensitive technique, whereas normally the protons penetrate some tens of μm into the sample. Similarly, grazing-emission electron probe micro-analysis (GE-EPMA) allows particles on a flat carrier to be analyzed selectively, achieves surface sensitivity in the nm rather than the μm range, and yields, in principle, a spatial resolution for chemical analysis similar to the size of the impinging electron beam, rather than that of the electron-excited volume. Both GE-PIXE and GE-EPMA need to be explored more fully in the near future. (author)

  3. Rates of progression in diabetic retinopathy during different time periods: a systematic review and meta-analysis

    DEFF Research Database (Denmark)

    Wong, Tien Y; Mwamburi, Mkaya; Klein, Ronald

    2009-01-01

    This meta-analysis reviews rates of progression of diabetic retinopathy to proliferative diabetic retinopathy (PDR) and/or severe visual loss (SVL) and temporal trends.

  4. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    Directory of Open Access Journals (Sweden)

    Ricardo E. Izzo

    2011-06-01

    Full Text Available Basketball is a complex sport which these days has become increasingly linked to its psychophysical aspects rather than to the purely technical ones. It is therefore important to make a thorough study of passing techniques from the point of view of the type of pass and its biomechanics. In terms of the types of passes used, the most frequent is the two-handed chest pass, with a frequency of 39.9%. This is followed, in terms of frequency, by the one-handed baseball pass (20.9%), the two-handed over-the-head pass (18.2%), and finally one- or two-handed indirect (bounce) passes (11.2% and 9.8%). Considering the most used pass in basketball from the biomechanical point of view, the muscles involved in the correct movement include all the muscles of the upper extremity, together with the shoulder muscles and the body fixators (abdominals, hip flexors, knee extensors, and dorsal flexors of the foot). The technical and conditional analysis considers the throwing speed, the throw height and the air resistance. In conclusion, the aim of this study is to give some guidelines to improve the mechanical execution of the movements in training, without neglecting the importance of the harmony of the movements themselves.

  5. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 x 10^11 n/cm^2/s in the reactor reflector, where the thermal-to-epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 x 10^12 n/cm^2/s and a thermal-to-epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab

  6. Analysis of Program Obfuscation Schemes with Variable Encoding Technique

    Science.gov (United States)

    Fukushima, Kazuhide; Kiyomoto, Shinsaku; Tanaka, Toshiaki; Sakurai, Kouichi

    Program analysis techniques have improved steadily over the past several decades, and software obfuscation schemes have come to be used in many commercial programs. A software obfuscation scheme transforms an original program or a binary file into an obfuscated program that is more complicated and difficult to analyze, while preserving its functionality. However, the security of obfuscation schemes has not been properly evaluated. In this paper, we analyze obfuscation schemes in order to clarify the advantages of our scheme, the XOR-encoding scheme. First, we more clearly define five types of attack models that we defined previously, and define quantitative resistance to these attacks. Then, we compare the security, functionality and efficiency of three obfuscation schemes with encoding variables: (1) Sato et al.'s scheme with linear transformation, (2) our previous scheme with affine transformation, and (3) the XOR-encoding scheme. We show that the XOR-encoding scheme is superior with regard to the following two points: (1) the XOR-encoding scheme is more secure against a data-dependency attack and a brute force attack than our previous scheme, and is as secure against an information-collecting attack and an inverse transformation attack as our previous scheme, (2) the XOR-encoding scheme does not restrict the calculable ranges of programs and the loss of efficiency is less than in our previous scheme.
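    The core idea behind an XOR variable-encoding scheme can be illustrated with a minimal sketch (this is a hypothetical illustration of the general technique, not the authors' actual scheme): each program variable is stored XOR-ed with a key, and values are decoded only transiently at the points where they are used, so the stored form never directly reveals the plain value.

    ```python
    # Minimal sketch of XOR variable encoding (illustrative only):
    # values are stored XOR-ed with a key and decoded only at use sites.
    KEY = 0x5A5A

    def enc(v: int) -> int:
        return v ^ KEY

    def dec(v: int) -> int:
        return v ^ KEY  # XOR with the same key is its own inverse

    x = enc(7)            # the stored form never equals the plain value 7
    y = enc(dec(x) + 35)  # compute on the decoded value, re-encode the result
    print(dec(y))         # prints 42
    ```

    Because XOR is its own inverse, encoding and decoding use the same operation, which keeps the runtime overhead of such a scheme low.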

  7. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed

  8. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from a smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: (1) under normal printing conditions and (2) on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency also in the thickness direction. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency were a function of manufacturing and torque levels. This study attempts to provide a new metric for the failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.

  9. Pregnancy and HIV disease progression: a systematic review and meta-analysis.

    Science.gov (United States)

    Calvert, Clara; Ronsmans, Carine

    2015-02-01

    To assess whether pregnancy accelerates HIV disease progression. Studies comparing progression to HIV-related illness, low CD4 count, AIDS-defining illness, HIV-related death, or any death in HIV-infected pregnant and non-pregnant women were included. Relative risks (RR) for each outcome were combined using random-effects meta-analysis and were stratified by antiretroviral therapy (ART) availability. 15 studies met the inclusion criteria. Pregnancy was not associated with progression to HIV-related illness [summary RR: 1.32, 95% confidence interval (CI): 0.66-2.61], AIDS-defining illness (summary RR: 0.97, 95% CI: 0.74-1.25) or mortality (summary RR: 0.97, 95% CI: 0.62-1.53), but there was an association with low CD4 counts (summary RR: 1.41, 95% CI: 0.99-2.02) and HIV-related death (summary RR: 1.65, 95% CI: 1.06-2.57). In settings where ART was available, there was no evidence that pregnancy accelerated progression to HIV/AIDS-defining illnesses, death or a drop in CD4 count. In settings without ART availability, effect estimates were consistent with pregnancy increasing the risk of progression to HIV/AIDS-defining illnesses and HIV-related or all-cause mortality, but there were too few studies to draw meaningful conclusions. In the absence of ART, pregnancy is associated with small but appreciable increases in the risk of several negative HIV outcomes, but the evidence is too weak to draw firm conclusions. When ART is available, the effects of pregnancy on HIV disease progression are attenuated and there is little reason to discourage healthy HIV-infected women who desire to become pregnant from doing so. © 2014 John Wiley & Sons Ltd.
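    The random-effects pooling of relative risks used in such meta-analyses can be sketched in a few lines. This is a generic DerSimonian-Laird implementation on the log-RR scale with made-up study values, not the study's actual data:

    ```python
    import math

    def pool_random_effects(rr, se):
        """DerSimonian-Laird random-effects pooling of relative risks.
        `rr`: per-study RRs; `se`: standard errors of log(RR)."""
        y = [math.log(r) for r in rr]
        w = [1 / s**2 for s in se]                    # fixed-effect (inverse-variance) weights
        sw = sum(w)
        ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
        q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))  # Cochran's Q heterogeneity
        k = len(y)
        tau2 = max(0.0, (q - (k - 1)) / (sw - sum(wi**2 for wi in w) / sw))
        wr = [1 / (s**2 + tau2) for s in se]          # random-effects weights
        est = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
        se_est = 1 / math.sqrt(sum(wr))
        lo, hi = est - 1.96 * se_est, est + 1.96 * se_est
        return math.exp(est), math.exp(lo), math.exp(hi)

    # Hypothetical studies: RRs with log-scale standard errors.
    rr, lo95, hi95 = pool_random_effects([1.2, 1.6, 0.9, 1.4], [0.2, 0.3, 0.25, 0.15])
    print(round(rr, 2), round(lo95, 2), round(hi95, 2))
    ```

    The between-study variance tau-squared widens the pooled confidence interval when the studies disagree more than their within-study errors can explain, which is why the paper stratifies by ART availability before pooling.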

  10. Nonlinear analysis of the progressive collapse of reinforced concrete plane frames using a multilayered beam formulation

    Directory of Open Access Journals (Sweden)

    C. E. M. Oliveira

    Full Text Available This work investigates the response of two reinforced concrete (RC) plane frames after the loss of a column and their resistance to progressive collapse. Nonlinear dynamic analysis is performed using a multilayered Euler/Bernoulli beam element, including elasto-viscoplastic effects. The material nonlinearity is represented using one-dimensional constitutive laws in the material layers, while geometrical nonlinearities are incorporated within a corotational beam formulation. The frames were designed in accordance with the minimum requirements proposed by the reinforced concrete design/building codes of Europe (fib [1-2], Eurocode 2 [3]) and Brazil (NBR 6118 [4]). The load combinations considered for the progressive collapse analysis follow the prescriptions of DoD [5]. The work verifies whether the minimum requirements of the considered codes are sufficient for enforcing structural safety and robustness, and also points out the major differences in terms of the progressive collapse potential of the corresponding designed structures.

  11. Plasma Exchange for Renal Vasculitis and Idiopathic Rapidly Progressive Glomerulonephritis: A Meta-analysis

    DEFF Research Database (Denmark)

    Walsh, Michael; Catapano, Fausta; Szpirt, Wladimir

    2010-01-01

    BACKGROUND:: Plasma exchange may be effective adjunctive treatment for renal vasculitis. We performed a systematic review and meta-analysis of randomized controlled trials of plasma exchange for renal vasculitis. STUDY DESIGN:: Systematic review and meta-analysis of articles identified from...... electronic databases, bibliographies, and studies identified by experts. Data were abstracted in parallel by 2 reviewers. SETTING & POPULATION:: Adults with idiopathic renal vasculitis or rapidly progressive glomerulonephritis. SELECTION CRITERIA FOR STUDIES:: Randomized controlled trials that compared...... standard care with standard care plus adjuvant plasma exchange in adult patients with either renal vasculitis or idiopathic rapidly progressive glomerulonephritis. INTERVENTION:: Adjuvant plasma exchange. OUTCOME:: Composite of end-stage renal disease or death. RESULTS:: We identified 9 trials including...

  12. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, so that a series of calibration standards similar to the unknown can be employed. In this paper, three multivariate analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Component Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
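    The unsupervised part of such an analysis can be sketched with plain NumPy. The spectra below are synthetic stand-ins (two hypothetical "rock types" with emission lines at different channels), not the paper's LIBS data: PCA is computed from the SVD of the mean-centered data matrix, and the class structure appears in the leading component scores.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic stand-in for LIBS spectra: two "rock types" whose emission
    # lines sit at different channels (500-channel spectra, 9 samples each).
    base = rng.normal(0.0, 0.05, size=500)
    class_a = base.copy(); class_a[[50, 120]] = 1.0
    class_b = base.copy(); class_b[[300, 410]] = 1.0
    spectra = np.vstack([class_a + rng.normal(0, 0.05, (9, 500)),
                         class_b + rng.normal(0, 0.05, (9, 500))])

    # PCA via SVD of the mean-centered data matrix.
    centered = spectra - spectra.mean(axis=0)
    U, S, Vt = np.linalg.svd(centered, full_matrices=False)
    scores = centered @ Vt[:2].T        # projection onto the first two PCs

    # The two classes separate along PC1: their score means differ strongly.
    print(scores.shape)
    print(abs(scores[:9, 0].mean() - scores[9:, 0].mean()) > 1.0)
    ```

    PLS would add a supervised step (regressing known concentrations on the spectra), but the dimensionality-reduction idea is the same.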

  13. COMPARISON AND ANALYSIS OF VARIOUS HISTOGRAM EQUALIZATION TECHNIQUES

    OpenAIRE

    MADKI.M.R; RUBINA KHAN

    2012-01-01

    The intensity histogram gives information which can be used for contrast enhancement. Histogram equalization can produce a histogram that is flat over fewer levels than the total number of levels, which can deteriorate the image. This problem can be overcome by various techniques. This paper gives a comparative analysis of the Bi-Histogram Equalization, Recursive Mean Separated Histogram Equalization, Multipeak Histogram Equalization and Brightness Preserving Dynamic Histogram Equalization techniques by using these techniqu...
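    As a concrete illustration of the baseline method these variants build on, global histogram equalization maps each gray level through the image's normalized cumulative histogram. A minimal NumPy sketch with synthetic data (not drawn from the paper):

    ```python
    import numpy as np

    def equalize(img, levels=256):
        """Global histogram equalization: map each gray level through the
        normalized cumulative histogram (CDF) of the image."""
        hist = np.bincount(img.ravel(), minlength=levels)
        cdf = hist.cumsum()
        cdf_min = cdf[cdf > 0][0]       # CDF at the lowest occupied level
        # Classic mapping: stretch the CDF to span the full [0, levels-1] range.
        lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * (levels - 1))
        lut = np.clip(lut, 0, levels - 1)
        return lut.astype(np.uint8)[img]

    # A low-contrast image confined to gray levels 100-120 ...
    rng = np.random.default_rng(1)
    img = rng.integers(100, 121, size=(64, 64), dtype=np.uint8)
    out = equalize(img)
    # ... is stretched toward the full 0-255 range.
    print(img.min(), img.max(), out.min(), out.max())
    ```

    The variants compared in the paper modify this mapping (e.g. equalizing sub-histograms split at the mean) so that the mean brightness of the input is better preserved.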

  14. Ultraviolet-Visible and Fluorescence Spectroscopy Techniques Are Important Diagnostic Tools during the Progression of Atherosclerosis: Diet Zinc Supplementation Retarded or Delayed Atherosclerosis

    Science.gov (United States)

    Abdelhalim, Mohamed Anwar K.; Moussa, Sherif A. Abdelmottaleb; AL-Mohy, Yanallah Hussain

    2013-01-01

    Background. In this study, we examined whether UV-visible and fluorescence spectroscopy techniques can detect the progression of atherosclerosis in the serum of rabbits fed a high-cholesterol diet (HCD) or an HCD supplemented with zinc (HCD + Zn), compared with controls. Methods. The control rabbit group was fed 100 g/day of normal diet. The HCD group was fed Purina Certified Rabbit Chow supplemented with 1.0% cholesterol plus 1.0% olive oil (100 g/day) for the same period. The HCD + Zn group was fed normal Purina Certified Rabbit Chow plus 1.0% cholesterol and 1.0% olive oil supplemented with 470 ppm Zn for the same feeding period. UV-visible and fluorescence spectra and biochemical parameters were measured in rabbit blood serum, and hematology in whole blood. Results. We found that the fluorescence peak of the HCD group shifted toward the UV-visible wavelengths compared with the control, using fluorescence excitation of serum at 192 nm. In addition, the results showed that supplementation of zinc (350 ppm) restored the fluorescence peak close to that of the control. Using the UV-visible spectroscopy approach, we found that the peak absorbance of the HCD group (about 280 nm) was higher than that of the control and that zinc supplementation seemed to decrease the absorbance. Conclusions. This study demonstrates that ultraviolet-visible and fluorescence spectroscopy techniques can be applied as noninvasive techniques to a blood serum sample for diagnosing or detecting the progression of atherosclerosis. Zn supplementation to rabbits fed an HCD delays or retards the progression of atherosclerosis. Inducing anemia in rabbits fed an HCD delays the progression of atherosclerosis. PMID:24350281

  15. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    International Nuclear Information System (INIS)

    Okrent, D.

    1989-01-01

    This final report summarizes the accomplishments of a two-year research project entitled ''Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants''. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed

  16. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two year research project entitled Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence, (AI), reliability and risk analysis and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  17. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled ``Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants''. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  18. Integrating Conceptions of Human Progress

    Directory of Open Access Journals (Sweden)

    Rick Szostak

    2013-06-01

    Full Text Available This paper applies interdisciplinary techniques toward the investigation of the idea of human progress. It argues that progress needs to be considered with respect to an ethical evaluation of a host of different phenomena. Some of these have displayed progress in human history, others regress, and still others neither. It is argued that it is possible to achieve progress on all fronts in the future, but only if we engage constructively with the true complexity of the world we inhabit. Classification is seen as a critical complement to interdisciplinary analysis.

  19. Comparative Analysis of the Dark Ground Buffy Coat Technique (DG ...

    African Journals Online (AJOL)

    The prevalence of trypanosome infection in 65 cattle reared under an extensive system of management was determined using the dark ground buffy coat (DG) technique and the enzyme-linked immunosorbent assay (ELISA). The DG technique showed that there were 18 positive cases (27.69%) of the total number of animals, made ...

  20. Advanced patch-clamp techniques and single-channel analysis

    NARCIS (Netherlands)

    Biskup, B; Elzenga, JTM; Homann, U; Thiel, G; Wissing, F; Maathuis, FJM

    Much of our knowledge of ion-transport mechanisms in plant cell membranes comes from experiments using voltage-clamp. This technique allows the measurement of ionic currents across the membrane, whilst the voltage is held under experimental control. The patch-clamp technique was developed to study

  1. Who pays for healthcare in Bangladesh? An analysis of progressivity in health systems financing.

    Science.gov (United States)

    Molla, Azaher Ali; Chi, Chunhuei

    2017-09-06

    The relationship between payments towards healthcare and the ability to pay is a measure of financial fairness. Analysis of progressivity is important from an equity perspective as well as for macroeconomic and political analysis of healthcare systems. Bangladesh health systems financing is characterized by high out-of-pocket payments (63.3%), and this share is increasing. Hence, we aimed to see who pays what part of this high out-of-pocket expenditure. To our knowledge, this was the first progressivity analysis of health systems financing in Bangladesh. We used data from the Bangladesh Household Income and Expenditure Survey, 2010. This was a cross-sectional, nationally representative sample of 12,240 households consisting of 55,580 individuals. For quantification of progressivity, we adopted the 'ability-to-pay' principle developed by O'Donnell, van Doorslaer, Wagstaff, and Lindelow (2008). We used the Kakwani index to measure the magnitude of progressivity. Health systems financing in Bangladesh is regressive. Inequality increases due to healthcare payments. The differences between the Gini coefficient and the Kakwani index for all sources of finance are negative, which indicates regressivity and that financing is more concentrated among the poor. Income inequality increases due to high out-of-pocket payments. The increase in income inequality caused by out-of-pocket payments is 89% due to the negative vertical effect and 11% due to horizontal inequity. Our findings add substantial evidence of the inequitable burden that health systems financing places on healthcare and income. The heavy reliance on out-of-pocket payments may affect household living standards. If the government and people of Bangladesh are concerned about an equitable financing burden, our study suggests that Bangladesh needs to reform its health systems financing scheme.
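    The Kakwani index used here is the concentration index of health payments (with households ranked by ability to pay) minus the Gini coefficient of income; a negative value indicates regressive financing. A toy sketch with synthetic data (not the survey data used in the study) shows why flat out-of-pocket payments come out regressive:

    ```python
    import numpy as np

    def gini(x, rank_by=None):
        """Concentration index of x when units are ranked by `rank_by`
        (reduces to the Gini coefficient when ranked by x itself).
        Uses the covariance-with-rank formulation."""
        order = np.argsort(x if rank_by is None else rank_by)
        x = np.asarray(x, float)[order]
        n = len(x)
        ranks = np.arange(1, n + 1)
        return 2 * np.cov(x, ranks, bias=True)[0, 1] / (n * x.mean())

    rng = np.random.default_rng(2)
    income = np.sort(rng.lognormal(3, 0.5, 1000))
    # Out-of-pocket payments as a *flat* amount per household: regressive,
    # because poorer households pay a larger share of their income.
    payments = np.full(1000, 5.0)

    kakwani = gini(payments, rank_by=income) - gini(income)
    print(kakwani < 0)  # flat payments => negative Kakwani (regressive)
    ```

    A payment schedule rising faster than income would instead yield a positive Kakwani index (progressive financing).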

  2. A genetic analysis of Adh1 regulation. Progress report, June 1991--February 1992

    Energy Technology Data Exchange (ETDEWEB)

    Freeling, M.

    1992-03-01

    The overall goal of our research proposal is to understand the meaning of the various cis-acting sites responsible for Adh1 expression in the entire maize plant. Progress is reported in the following areas: studies on the TATA box and analysis of revertants of the Adh1-3F1124 allele; screening for more different mutants that affect Adh1 expression differentially; studies on cis-acting sequences required for root-specific Adh1 expression; refinement of the use of the particle gun; and functional analysis of a non-glycolytic anaerobic protein.

  3. ERROR ANALYSIS FOR THE AIRBORNE DIRECT GEOREFERINCING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    A. S. Elsharkawy

    2016-10-01

    Full Text Available Direct georeferencing was shown to be an important alternative to standard indirect image orientation using classical or GPS-supported aerial triangulation. Since direct georeferencing without ground control relies on an extrapolation process only, particular focus has to be laid on the overall system calibration procedure. The accuracy performance of integrated GPS/inertial systems for direct georeferencing in airborne photogrammetric environments has been tested extensively in recent years. In this approach, the limiting factor is a correct overall system calibration including the GPS/inertial component as well as the imaging sensor itself. Therefore, remaining errors in the system calibration will significantly decrease the quality of object point determination. This research paper presents an error analysis for the airborne direct georeferencing technique, where integrated GPS/IMU positioning and navigation systems are used in conjunction with aerial cameras for airborne mapping, compared with GPS/INS-supported aerial triangulation (AT), through the implementation of a certain amount of error on the EOP and boresight parameters and a study of the effect of these errors on the final ground coordinates. The data set is a block of 32 images distributed over six flight lines; the interior orientation parameters (IOP) are known through a careful camera calibration procedure, and 37 ground control points are known through a terrestrial surveying procedure. The exact location of the camera station at the time of exposure, the exterior orientation parameters (EOP), is known through the GPS/INS integration process. The preliminary results show that, firstly, DG and GPS-supported AT have similar accuracy, and compared with the conventional aerial photography method, the two technologies reduce the dependence on ground control (used only for quality control purposes). Secondly, in the DG Correcting overall system calibration including the GPS/inertial component as well as the

  4. Qualitative analysis of Orzooiyeh plain groundwater resources using GIS techniques

    Directory of Open Access Journals (Sweden)

    Mohsen Pourkhosravani

    2016-09-01

    Full Text Available Background: Unsustainable development of human societies, especially in arid and semi-arid areas, is one of the most important environmental hazards, requiring the preservation of groundwater resources and permanent study of qualitative and quantitative changes through sampling. Accordingly, this research attempts to assess and analyze the spatial variation of quantitative and qualitative indicators of the Orzooiyeh groundwater resources in Kerman province by using a geographic information system (GIS). Methods: This study surveys the spatial analysis of these indexes using GIS techniques, in addition to evaluating the groundwater resource quality in the study area. For this purpose, data on quality indicators such as electrical conductivity, pH, sulphate, residual total dissolved solids (TDS), sodium, calcium, magnesium and chlorine from 28 selected wells sampled by the Kerman regional water organization were used. Results: A comparison of the present research results with the national standard of Iran and also that of the World Health Organization (WHO) shows that, among the measured indices, the electrical conductivity and TDS in the chosen samples are higher than the national standard of Iran and the WHO standard, but the other indices are more favourable. Conclusion: Results showed that for the electrical conductivity index, 64.3% of the samples were at an optimal level, 71.4% were within the Iranian national standard limit and only 3.6% of them met the WHO standard. The TDS index, too, did not reach the national standard in any of the samples, and in 82.1% of the samples this index was at the national standard limit. As per this index, only 32.1% of the samples were within the WHO standards.

  5. Progress of laser ionization mass spectrometry for elemental analysis - A review of the past decade

    International Nuclear Information System (INIS)

    Lin Yiming; Yu Quan; Hang Wei; Huang Benli

    2010-01-01

    Mass spectrometry using a laser ionization source has played a significant role in elemental analysis. Three types of techniques are widely used: high irradiance laser ionization mass spectrometry is capable of rapid determination of elements in solids; single particle mass spectrometry is a powerful tool for single particle characterization; and resonance ionization mass spectrometry is applied for isotope ratio measurements with high sensitivity and selectivity. In this review, the main features of the laser ablation process and plasma characterization by mass spectrometry are summarized. Applications of these three techniques for elemental analysis are discussed.

  6. Current landscape of protein glycosylation analysis and recent progress toward a novel paradigm of glycoscience research.

    Science.gov (United States)

    Yamamoto, Sachio; Kinoshita, Mitsuhiro; Suzuki, Shigeo

    2016-10-25

    This review covers the basics and some applications of methodologies for the analysis of glycoprotein glycans. Analytical techniques used for glycoprotein glycans, including liquid chromatography (LC), capillary electrophoresis (CE), mass spectrometry (MS), and high-throughput analytical methods based on microfluidics, are described, covering the essentials for biopharmaceutical and biomarker glycoproteins. We also describe the MS analysis of glycoproteins and glycopeptides, the chemical and enzymatic methods for releasing glycans from glycoproteins, and the chemical reactions used for the derivatization of glycans. We hope these techniques accommodate most of the needs of glycoproteomics researchers. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Agreement among graders on Heidelberg retina tomograph (HRT) topographic change analysis (TCA) glaucoma progression interpretation.

    Science.gov (United States)

    Iester, Michele M; Wollstein, Gadi; Bilonick, Richard A; Xu, Juan; Ishikawa, Hiroshi; Kagemann, Larry; Schuman, Joel S

    2015-04-01

    To evaluate agreement among experts in the interpretation of Heidelberg retina tomograph (HRT) topographic change analysis (TCA) printouts for glaucoma progression, and to explore methods for improving agreement. 109 eyes of glaucoma, glaucoma-suspect and healthy subjects with ≥5 visits and 2 good-quality HRT scans acquired at each visit were enrolled. TCA printouts were graded as progression or non-progression. Each grader was presented with 2 sets of tests: a randomly selected single test from each visit, and both tests from each visit. Furthermore, the TCA printouts were classified both with each grader's individual criteria and with predefined criteria (reproducible changes within the optic nerve head, disregarding changes along blood vessels or at steep rim locations and signs of image distortion). Agreement among graders was modelled using common latent factor measurement error structural equation models for ordinal data. Assessment of two scans per visit without the predefined criteria reduced overall agreement, as indicated by a reduction in the slope (reflecting the correlation with the common factor) for all graders, with no effect on reducing the range of the intercepts between the graders. Using the predefined criteria improved grader agreement, as indicated by the narrower range of intercepts among the graders compared with assessment using each grader's individual criteria. A simple set of predefined common criteria improves agreement between graders in assessing TCA progression. The inclusion of additional scans from each visit does not improve the agreement. We therefore recommend setting standardised criteria for TCA progression evaluation. Published by the BMJ Publishing Group Limited.

  8. Progress in tear microdesiccate analysis by combining various transmitted-light microscope techniques.

    Science.gov (United States)

    Traipe-Salas, Felipe; Traipe-Castro, Leonidas; Salinas-Toro, Daniela; López, Daniela; Valenzuela, Felipe; Cartes, Christian; Toledo-Araya, Héctor; Pérez, Claudio; López Solís, Remigio

    2016-06-03

    Tear desiccation on a glass surface followed by transmitted-light microscopy has served as a diagnostic test for dry eye. Four distinctive morphological domains (zones I, II, III and a transition band) have recently been recognized in tear microdesiccates. Physicochemical dissimilarities among those domains hamper comprehensive microscopic examination of tear microdesiccates. Optimal observation conditions for entire tear microdesiccates are investigated here. One-μl aliquots of tears collected from individual healthy eyes were dried at ambient conditions on microscope slides. Tear microdesiccates were examined by combining low-magnification objective lenses with transmitted-light microscopy (brightfield, phase contrasts Ph1, Ph2, Ph3 and darkfield). Fern-like structures (zones II and III) were visible with all illumination methods except brightfield. Zone I was the microdesiccate domain displaying the most noticeable illumination-dependent variations, appearing as a transparent band delimited by an outer rim (Ph1, Ph2), a homogeneous compactly built structure (brightfield) or an invisible domain (darkfield, Ph3). Intermediate positions of the condenser (BF/Ph1, Ph1/Ph2) showed a structured, roughly cylindrical zone I. The transition band also varied from invisibility (brightfield) to a well-defined domain comprising interwoven filamentous elements (phase contrasts, darkfield). Imaging of entire tear microdesiccates by transmitted-light microscopy depends upon illumination. A more comprehensive description of tear microdesiccates can be achieved by combining illumination methods.

  9. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are favourable for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse conditions of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries

  10. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques: one philosophical, the other an applied ethical analysis technique. The two techniques are used to analyse an ethically challenging situation involving ICT, raised in a recent media article, to demonstrate their ability to develop the ethical analysis skills of…

  11. Multi-platform genome-wide analysis of melanoma progression to brain metastasis

    Directory of Open Access Journals (Sweden)

    Diego M. Marzese

    2014-12-01

    Melanoma has a high tendency to metastasize to brain tissue. Understanding of the molecular alterations in early-stage melanoma progression to brain metastasis (MBM) is very limited. Identifying MBM-specific genomic and epigenomic alterations is a key initial step in understanding its aggressive nature and identifying specific novel druggable targets. Here, we describe a multi-platform dataset generated with different stages of melanoma progression to MBM. This data includes genome-wide DNA methylation (Illumina HM450K BeadChip), gene expression (Affymetrix HuEx 1.0 ST array), single nucleotide polymorphism (SNP) and copy number variation (CNV; Affymetrix SNP 6.0 array) analyses of melanocyte cells (MNCs), primary melanoma tumors (PRMs), lymph node metastases (LNMs) and MBMs. The analysis of this data has been reported in our recently published study (Marzese et al., 2014).

  12. Short analysis of a progressive distortion problem (tension and cyclic torsion)

    International Nuclear Information System (INIS)

    Roche, Roland.

    1978-06-01

    Tests on ratcheting (or progressive distortion) are in progress at Saclay. A thin tube is subjected to a constant tensile load and to a cyclic twist. The present paper is a short theoretical analysis of that case. A uniform strain and stress field is considered, with a constant tensile stress P (primary stress) and a cyclic shearing strain. The shearing strain is known through the corresponding elastic equivalent stress intensity (Tresca criterion); the cyclic range of the stress intensity is ΔQ (secondary stress range). The shakedown condition and the incremental elongations are examined with different constitutive equations of the material. Special attention is given to perfect plasticity, and bilinear kinematic hardening results are presented, but it is believed that these mathematical material models are simplistic, and special experimental tests are proposed [fr]

  13. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all of the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and suitability.

  14. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

    Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting proper specific judo exercises for a target motor ability, it is necessary first to study the structure of specific judo techniques and the activity of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a starting point for selecting a particular complex of specific exercises to produce the highest effects. In addition to developing particular muscle groups, the means of specific preparation affect the development of those motor abilities judged indispensable for developing the qualities characteristic of judo. This paper analyses the relationship between judo techniques and specific motor abilities.

  15. Techniques for the Statistical Analysis of Observer Data

    National Research Council Canada - National Science Library

    Bennett, John G

    2001-01-01

    .... The two techniques are as follows: (1) fitting logistic curves to the vehicle data, and (2) using the Fisher Exact Test to compare the probability of detection of the two vehicles at each range...
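The Fisher Exact Test comparison mentioned above can be sketched with a small stdlib-only computation of the one-sided hypergeometric tail. The detection counts below are hypothetical, not taken from the report:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact test for a 2x2 table [[a, b], [c, d]].

    Returns P(observing >= a detections for vehicle 1) under the null
    hypothesis that both vehicles share the same detection probability.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c          # vehicle-1 trials, total detections
    denom = comb(n, col1)
    p = 0.0
    # Sum the hypergeometric probabilities of tables at least as extreme
    for k in range(a, min(row1, col1) + 1):
        p += comb(row1, k) * comb(n - row1, col1 - k) / denom
    return p

# Hypothetical counts at one range: vehicle 1 detected 9/10, vehicle 2 detected 3/10
p_value = fisher_exact_one_sided(9, 1, 3, 7)
```

For the table [[9, 1], [3, 7]] this yields a one-sided p-value of about 0.0099, matching the hypergeometric tail that standard statistics packages report.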

  16. Analysis of neutron-reflectometry data by Monte Carlo technique

    CERN Document Server

    Singh, S

    2002-01-01

    Neutron-reflectometry data is collected in momentum space. The real-space information is extracted by fitting a model for the structure of a thin-film sample. We have attempted a Monte Carlo technique to extract the structure of the thin film. In this technique we change the structural parameters of the thin film by simulated annealing based on the Metropolis algorithm. (orig.)
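A minimal sketch of the simulated-annealing fit described above, using the Metropolis acceptance rule. The "reflectivity" model here is a toy damped-fringe function standing in for a real thin-film model, and all parameter names and values are illustrative assumptions:

```python
import math
import random

def chi2(params, q_vals, data, model):
    """Sum of squared residuals between model and measured curve."""
    return sum((model(q, params) - d) ** 2 for q, d in zip(q_vals, data))

def anneal(q_vals, data, model, init, step=0.05, t0=1.0, cooling=0.995,
           n_iter=5000, seed=0):
    """Metropolis simulated annealing over the model parameters."""
    rng = random.Random(seed)
    current = list(init)
    best, e_best = list(init), chi2(init, q_vals, data, model)
    e_cur, t = e_best, t0
    for _ in range(n_iter):
        cand = [p + rng.uniform(-step, step) for p in current]
        e_cand = chi2(cand, q_vals, data, model)
        # Metropolis criterion: always accept improvements,
        # sometimes accept worse moves while the temperature is high
        if e_cand < e_cur or rng.random() < math.exp(-(e_cand - e_cur) / t):
            current, e_cur = cand, e_cand
            if e_cur < e_best:
                best, e_best = list(current), e_cur
        t *= cooling  # geometric cooling schedule
    return best, e_best

# Toy "model": damped oscillation standing in for a film's fringe pattern
def toy_model(q, params):
    thickness, roughness = params
    return math.exp(-roughness * q) * math.cos(thickness * q) ** 2

q_vals = [0.01 * i for i in range(1, 101)]
data = [toy_model(q, [50.0, 2.0]) for q in q_vals]   # synthetic "measurement"
fit, err = anneal(q_vals, data, toy_model, init=[40.0, 1.0])
```

The tracked best solution is guaranteed to be no worse than the starting guess; in a real reflectometry fit the model would be an optical-matrix reflectivity calculation rather than this toy function.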

  17. A Comparative Analysis of Machine Learning Techniques for Credit Scoring

    OpenAIRE

    Nwulu, Nnamdi; Oroja, Shola; İlkan, Mustafa

    2012-01-01

    Credit scoring has become an oft-researched topic in light of the increasing volatility of the global economy and the recent world financial crisis. Among the many methods used for credit scoring, machine learning techniques are becoming increasingly popular due to their efficiency, accuracy and relative simplicity. Furthermore, machine learning techniques minimize the risk of human bias and error and maximize speed, as they are able to perform computation...
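As a generic illustration of the kind of machine learning model used for credit scoring (not the specific methods of this truncated abstract), a logistic-regression scorer can be trained by gradient descent on synthetic applicant data; every feature name and number below is an assumption:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic applicant features: [income (normalized), debt ratio, years employed]
n = 400
X = rng.normal(size=(n, 3))
true_w = np.array([1.5, -2.0, 1.0])
# Labels: 1 = good credit, generated from a linear score through a sigmoid
p = 1.0 / (1.0 + np.exp(-(X @ true_w)))
y = (rng.random(n) < p).astype(float)

def train_logreg(X, y, lr=0.1, epochs=500):
    """Batch gradient descent on the logistic loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        pred = 1.0 / (1.0 + np.exp(-(X @ w + b)))
        w -= lr * (X.T @ (pred - y)) / len(y)
        b -= lr * np.mean(pred - y)
    return w, b

w, b = train_logreg(X, y)
scores = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # credit scores in [0, 1]
accuracy = np.mean((scores > 0.5) == (y == 1))
```

The learned scores can then be thresholded or binned into rating grades; real scorecards would also involve feature engineering, regularization, and out-of-sample validation.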

  18. HIV disease progression by hormonal contraceptive method: secondary analysis of a randomized trial.

    Science.gov (United States)

    Stringer, Elizabeth M; Levy, Jens; Sinkala, Moses; Chi, Benjamin H; Matongo, Inutu; Chintu, Namwinga; Stringer, Jeffrey S A

    2009-07-17

    HIV-infected women need access to safe contraception. We hypothesized that women using depomedroxyprogesterone acetate (DMPA) contraception would have faster HIV disease progression than women using oral contraceptive pills (OCPs) and nonhormonal methods. In a previously reported trial, we randomized 599 HIV-infected women to the intrauterine device (IUD) or hormonal contraception. Women randomized to hormonal contraception chose between OCPs and DMPA. This analysis investigates the relationship between exposure to hormonal contraception and HIV disease progression [defined as death, becoming eligible for antiretroviral therapy (ART), or both]. Of the 595 women not on ART at the time of randomization, 302 were allocated to hormonal contraception, of whom 190 (63%) initiated DMPA and 112 (37%) initiated OCPs. Women starting IUD, OCPs, or DMPA were similar at baseline. Compared with women using the IUD, the adjusted hazard of death was not significantly increased among women using OCPs [1.24; 95% confidence interval (CI) 0.42-3.63] or DMPA (1.83; 95% CI 0.82-4.08). However, women using OCPs (adjusted hazard ratio (AHR) 1.69; 95% CI 1.09-2.64) or DMPA (AHR 1.56; 95% CI 1.08-2.26) trended toward an increased likelihood of becoming eligible for ART. Women exposed to OCPs (AHR 1.67; 95% CI 1.10-2.51) and DMPA (AHR 1.62; 95% CI 1.16-2.28) also had an increased hazard of meeting our composite disease progression outcome (death or becoming ART eligible) than women using the IUD. In this secondary analysis, exposure to OCPs or DMPA was associated with HIV disease progression among women not yet on ART. This finding, if confirmed elsewhere, would have global implications and requires urgent further investigation.

  19. Radiative neutron capture as a counting technique at pulsed spallation neutron sources: a review of current progress

    Science.gov (United States)

    Schooneveld, E. M.; Pietropaolo, A.; Andreani, C.; Perelli Cippo, E.; Rhodes, N. J.; Senesi, R.; Tardocchi, M.; Gorini, G.

    2016-09-01

    Neutron scattering techniques are attracting an increasing interest from scientists in various research fields, ranging from physics and chemistry to biology and archaeometry. The success of these neutron scattering applications is stimulated by the development of higher performance instrumentation. The development of new techniques and concepts, including radiative capture based neutron detection, is therefore a key issue to be addressed. Radiative capture based neutron detectors utilize the emission of prompt gamma rays after neutron absorption in a suitable isotope and the detection of those gammas by a photon counter. They can be used as simple counters in the thermal region and (simultaneously) as energy selectors and counters for neutrons in the eV energy region. Several years of extensive development have made eV neutron spectrometers operating in the so-called resonance detector spectrometer (RDS) configuration outperform their conventional counterparts. In fact, the VESUVIO spectrometer, a flagship instrument at ISIS serving a continuous user programme for eV inelastic neutron spectroscopy measurements, has been operating in the RDS configuration since 2007. In this review, we discuss the physical mechanism underlying the RDS configuration and the development of associated instrumentation. A few successful neutron scattering experiments that utilize the radiative capture counting techniques are presented together with the potential of this technique for thermal neutron diffraction measurements. We also outline possible improvements and future perspectives for radiative capture based neutron detectors in neutron scattering applications at pulsed neutron sources.

  20. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    Science.gov (United States)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. The output of the standard implementations of these visualization techniques in test facilities is often limited to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. These systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
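A common way to quantify unsteady flow frequency content from high-speed imagery is to treat the intensity of a pixel region as a time series sampled at the frame rate and examine its spectrum. A minimal sketch with a synthetic stand-in signal (the frame rate and shock frequency below are hypothetical, not from this paper):

```python
import numpy as np

# Synthetic stand-in for a high-speed schlieren recording: intensity of one
# pixel region sampled at the camera frame rate, carrying an oscillating-shock
# signature plus sensor noise
frame_rate = 10000.0           # frames per second (hypothetical camera setting)
t = np.arange(4096) / frame_rate
shock_freq = 850.0             # Hz, the unsteady-flow frequency to recover
rng = np.random.default_rng(1)
intensity = 0.4 * np.sin(2 * np.pi * shock_freq * t) + 0.05 * rng.normal(size=t.size)

# Remove the mean, apply a Hann window to limit spectral leakage,
# then take the one-sided amplitude spectrum
sig = (intensity - intensity.mean()) * np.hanning(t.size)
spectrum = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(t.size, d=1.0 / frame_rate)
dominant = freqs[np.argmax(spectrum)]   # recovered oscillation frequency, Hz
```

In practice one would average spectra over many pixel regions (or use Welch's method) to stabilize the estimate; the frequency resolution here is frame_rate / 4096 ≈ 2.4 Hz.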

  1. Protein purification and analysis: next generation Western blotting techniques.

    Science.gov (United States)

    Mishra, Manish; Tiwari, Shuchita; Gomes, Aldrin V

    2017-11-01

    Western blotting is one of the most commonly used techniques in molecular biology and proteomics. Since western blotting is a multistep protocol, variations and errors can occur at any step, reducing the reliability and reproducibility of this technique. Recent reports suggest that a few key steps, such as the sample preparation method, the amount and source of primary antibody used, as well as the normalization method utilized, are critical for reproducible western blot results. Areas covered: In this review, improvements in different areas of western blotting, including protein transfer and antibody validation, are summarized. The review discusses the most advanced western blotting techniques available and highlights the relationship between next generation western blotting techniques and their clinical relevance. Expert commentary: Over the last decade, significant improvements have been made in creating more sensitive, automated, and advanced techniques by optimizing various aspects of the western blot protocol. New methods such as single cell-resolution western blot, capillary electrophoresis, DigiWest, automated microfluidic western blotting and microchip electrophoresis have all been developed to reduce potential problems associated with the western blotting technique. Innovative developments in instrumentation and increased sensitivity for western blots offer novel possibilities for increasing the clinical implications of western blot.

  2. APPLICATION OF AEM IN PROGRESSIVE COLLAPSE DYNAMICS ANALYSIS OF R.C. STRUCTURES

    Directory of Open Access Journals (Sweden)

    Osama El-Mahdy

    2018-01-01

    The Finite Element Method (FEM) and other numerical strategies are effectively employed in the linear and non-linear analysis of structures. Recently, a new displacement-based method, the Applied Element Method (AEM), has been developed. It is applicable to static and dynamic, linear and non-linear analysis of framed and continuum structures. In AEM, the structural member is partitioned into virtual elements connected through normal and shear springs representing the stresses and strains of a certain portion of the structure. FEM assumes the material to be continuous and can indicate highly stressed regions of a structure; however, it is difficult to model separation of elements unless the crack location is known. The main advantage of AEM is that it can track the structural collapse behavior through all phases of load application. In the current research, the application of AEM is illustrated through a non-linear dynamic analysis. Progressive collapse simulation is conducted using the Extreme Loading for Structures (ELS) software, which follows the AEM. The experimental and analytical works carried out by Park et al. [17, 28] for 1/5-scale 3- and 5-story reinforced concrete structures are used for verification. Good agreement between the experimental and numerical results has been obtained using ELS; it can therefore be confirmed that ELS is capable of simulating structural behavior up to collapse. Furthermore, a study has been made to investigate the effect of considering the floor slabs on progressive collapse. The results show that considering slabs in the progressive collapse analysis of multistory buildings is important, as neglecting the slabs' contribution leads to incorrect simulation and uneconomic design.

  3. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described, including the reactor as a neutron source; sample activation in the reactor; the methodology of neutron activation analysis; sample transport into the reactor and sample packaging after irradiation; instrumental activation analysis with radiochemical separation; data measurement and evaluation; and sampling and sample preparation. Sources of environmental contamination with trace elements and the sampling and analysis of samples by neutron activation are described, covering the analysis of soils, waters and biological materials. Methods of evaluating neutron activation analysis results and of interpreting them for purposes of environmental control are shown. (J.B.)

  4. Genome sequencing and analysis conferences. Progress report, August 15, 1993--August 15, 1994

    Energy Technology Data Exchange (ETDEWEB)

    Venter, J.C.

    1995-10-01

    The 14 plenary session presentations focused on nematode; yeast; fruit fly; plants; mycobacteria; and man. In addition there were presentations on a variety of technical innovations including database developments and refinements, bioelectronic genesensors, computer-assisted multiplex techniques, and hybridization analysis with DNA chip technology. This document includes only the session schedule.

  5. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    OpenAIRE

    Akshay Amolik; Niketan Jivane; Mahavir Bhandari; Dr.M.Venkatesan

    2015-01-01

    Sentiment analysis is concerned with extracting emotions and opinions from text; it is also referred to as opinion mining. Sentiment analysis finds and justifies the sentiment of a person with respect to a given source of content. Social media contain a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this largely generated data is very useful for gauging the opinion of the masses. Twitter sentiment a...
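A minimal sketch of one classic machine learning approach to tweet sentiment, a multinomial naive Bayes classifier over word counts. The tiny training corpus is invented for illustration, and the truncated abstract does not specify which classifiers the authors actually used:

```python
import math
from collections import Counter

# Tiny illustrative corpus (hypothetical movie-review tweets)
train = [
    ("loved the movie great acting", "pos"),
    ("what a fantastic thrilling film", "pos"),
    ("brilliant plot and great cast", "pos"),
    ("terrible boring waste of time", "neg"),
    ("awful film bad acting", "neg"),
    ("boring plot and bad pacing", "neg"),
]

counts = {"pos": Counter(), "neg": Counter()}   # per-class word counts
doc_counts = Counter()                          # per-class document counts
for text, label in train:
    doc_counts[label] += 1
    counts[label].update(text.split())

vocab = set(w for c in counts.values() for w in c)

def classify(text):
    """Multinomial naive Bayes with add-one (Laplace) smoothing."""
    best_label, best_score = None, -math.inf
    for label in counts:
        # log prior + sum of log likelihoods of the known words
        score = math.log(doc_counts[label] / sum(doc_counts.values()))
        total = sum(counts[label].values())
        for w in text.split():
            if w in vocab:
                score += math.log((counts[label][w] + 1) / (total + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label
```

A real system would train on thousands of labeled tweets and add preprocessing (tokenization of hashtags and emoticons, negation handling), but the scoring logic is the same.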

  6. Nucleic acid hybridization and radioimmunoassay techniques for studying the interrelationships among the progressive pneumonia viruses of sheep

    International Nuclear Information System (INIS)

    Weiss, M.J.

    1976-01-01

    In Section I of this thesis, experiments were performed to determine whether three representative "slow" viruses of sheep (VV, MV and PPV) replicate by way of a DNA "provirus" in a manner similar to the RNA tumor viruses. The approach used was to determine whether unique virus-specific DNA sequences not present in normal cells could be detected in the DNA of infected cell cultures. The results presented demonstrate that infection by VV, MV and PPV results in the synthesis of proviral DNA. Sections II and III examine the similarities among VV, MV and PPV. In Section II, the RNA genomes of these viruses were compared by nucleic acid hybridization. The homology among these viral RNAs was determined from the extensive competition of homologous viral RNA-cDNA hybrids by heterologous RNA and from the thermal stability of homologous and heterologous RNA-cDNA hybrids. The 70S RNAs of visna and maedi virus were indistinguishable but only partially homologous to that of progressive pneumonia virus. Section III describes the purification of the major internal protein component of VV, p27; the development of a radioimmunoassay to study its antigenic relatedness to the corresponding proteins of PPV and MV; and its use in the detection of cross-reacting proteins in progressive pneumonia virus infected sheep lung. The ability to detect unique virus-related DNA sequences and viral antigens in infected sheep tissues makes it now feasible to search for slow-virus-related DNA sequences and/or antigens in human diseases which bear resemblance to the slow diseases of sheep.

  7. Evaluation of explicit finite-difference techniques for LMFBR safety analysis

    International Nuclear Information System (INIS)

    Bernstein, D.; Golden, R.D.; Gross, M.B.; Hofmann, R.

    1976-01-01

    In the past few years, the use of explicit finite-difference (EFD) and finite-element computer programs for reactor safety calculations has steadily increased. One of the major areas of application has been for the analysis of hypothetical core disruptive accidents in liquid metal fast breeder reactors. Most of these EFD codes were derived to varying degrees from the same roots, but the codes are large and have progressed rapidly, so there may be substantial differences among them in spite of a common ancestry. When this fact is coupled with the complexity of HCDA calculations, it is not possible to assure that independent calculations of an HCDA will produce substantially the same results. Given the extreme importance of nuclear safety, it is essential to be sure that HCDA analyses are correct, and additional code validation is therefore desirable. A comparative evaluation of HCDA computational techniques is being performed under an ERDA-sponsored program called APRICOT (Analysis of PRImary COntainment Transients). The philosophy, calculations, and preliminary results from this program are described in this paper

  8. Application of ultrasonic pulse velocity technique and image analysis in monitoring of the sintering process

    Directory of Open Access Journals (Sweden)

    Terzić A.

    2011-01-01

    Concrete which undergoes thermal treatment before and during its service life can be applied in plants operating at high temperature and as thermal insulation; under such conditions, sintering occurs within the concrete structure. The progression of the sintering process can be monitored through changes in porosity parameters determined with a nondestructive test method (ultrasonic pulse velocity) and a computer program for image analysis. The experiment was performed on samples of corundum and bauxite concrete composites. The apparent porosity of the samples thermally treated at 110, 800, 1000, 1300 and 1500°C was first investigated with a standard laboratory procedure. Sintering parameters were calculated from creep testing. Loss of strength and material degradation occurred in the concrete when it was subjected to increased temperature and a compressive load. Mechanical properties indicate and track changes within the microstructure. The level of surface deterioration after the thermal treatment was determined using the Image Pro Plus program, and mechanical strength was estimated using ultrasonic pulse velocity testing. Nondestructive ultrasonic measurement served as a qualitative description of the porosity change in the specimens resulting from the sintering process. The ultrasonic pulse velocity technique and image analysis proved to be reliable methods for monitoring microstructural change during the thermal treatment and service life of refractory concrete.
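Ultrasonic pulse velocity is commonly related to stiffness through the longitudinal-wave relation for a homogeneous, isotropic elastic medium, E = ρ·v²·(1+ν)(1−2ν)/(1−ν). A small sketch; the velocities, densities and Poisson's ratio below are hypothetical values for a refractory castable, not measurements from this study:

```python
def dynamic_modulus(velocity_m_s, density_kg_m3, poisson=0.2):
    """Dynamic elastic modulus (Pa) from longitudinal ultrasonic pulse
    velocity, assuming a homogeneous, isotropic elastic medium."""
    v, rho, nu = velocity_m_s, density_kg_m3, poisson
    return rho * v ** 2 * (1 + nu) * (1 - 2 * nu) / (1 - nu)

# Hypothetical readings for a refractory castable before and after firing:
# sintering densifies the matrix, raising pulse velocity and hence stiffness
e_dried = dynamic_modulus(3800.0, 2900.0)    # dried at 110 degrees C
e_fired = dynamic_modulus(4300.0, 2850.0)    # sintered at 1500 degrees C
```

With these illustrative inputs the dried sample comes out near 38 GPa and the sintered one near 47 GPa, showing how a velocity increase maps to a stiffness (and, indirectly, porosity) change.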

  9. Performance analysis of two-way DF relay selection techniques

    Directory of Open Access Journals (Sweden)

    Samer Alabed

    2016-09-01

    This work proposes novel bi-directional dual-relay selection techniques based on Alamouti space-time block coding (STBC) with the decode-and-forward (DF) protocol, and analyzes their performance. In the proposed techniques, two- and three-phase relaying schemes are used to perform bi-directional communication between the communicating terminals via two selected single-antenna relays that employ the Alamouti STBC in a distributed fashion to achieve diversity and orthogonalization of the channels, thereby improving the reliability of the system and enabling the use of a symbol-wise detector. Furthermore, the network coding strategy applied at the relays does not waste power broadcasting data already known at either terminal, resulting in improved overall performance at the terminals. Our simulations confirm the analytical results and show a substantially improved bit error rate (BER) performance for the proposed techniques compared with the current state of the art.
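A minimal numerical sketch of the distributed Alamouti building block underlying such schemes: BPSK over flat Rayleigh fading from two single-antenna transmitters (standing in for the two selected relays) to one receiver, with perfect channel knowledge. Relay selection, the DF decoding step and the network coding are omitted, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

def alamouti_ber(snr_db, n_pairs=20000):
    """Simulated BER of a 2x1 Alamouti scheme with BPSK over Rayleigh fading."""
    snr = 10 ** (snr_db / 10)
    bits = rng.integers(0, 2, size=(n_pairs, 2))
    s = 2.0 * bits - 1.0                       # BPSK mapping 0/1 -> -1/+1
    h = (rng.normal(size=(n_pairs, 2)) + 1j * rng.normal(size=(n_pairs, 2))) / np.sqrt(2)
    noise_std = np.sqrt(1.0 / (2.0 * snr))     # per real dimension
    n1 = noise_std * (rng.normal(size=n_pairs) + 1j * rng.normal(size=n_pairs))
    n2 = noise_std * (rng.normal(size=n_pairs) + 1j * rng.normal(size=n_pairs))
    # Two channel uses per symbol pair; each transmitter at half power
    r1 = (h[:, 0] * s[:, 0] + h[:, 1] * s[:, 1]) / np.sqrt(2) + n1
    r2 = (-h[:, 0] * np.conj(s[:, 1]) + h[:, 1] * np.conj(s[:, 0])) / np.sqrt(2) + n2
    # Linear Alamouti combining decouples the two symbols,
    # enabling symbol-wise detection
    s1_hat = np.conj(h[:, 0]) * r1 + h[:, 1] * np.conj(r2)
    s2_hat = np.conj(h[:, 1]) * r1 - h[:, 0] * np.conj(r2)
    decided = np.stack([np.real(s1_hat) > 0, np.real(s2_hat) > 0], axis=1)
    return np.mean(decided != (bits == 1))

ber_low, ber_high = alamouti_ber(0), alamouti_ber(15)
```

The combining step yields (|h1|² + |h2|²)·s for each symbol plus noise, which is the channel orthogonalization the abstract refers to; the BER drops steeply with SNR thanks to the second-order diversity.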

  10. Radon remedial techniques in buildings - analysis of French actual cases

    International Nuclear Information System (INIS)

    Dupuis, M.

    2004-01-01

    The IRSN has compiled a collection of solutions from data provided by the various decentralised government services in 31 French departments. Contributors were asked to provide a description of the building, as well as details of measured radon levels, the type of reduction technique adopted and the cost. Illustrative layouts, technical drawings and photographs were also requested, when available. Of the cases recorded, 85% are establishments open to the public (schools (70%), city halls (4%) and combined city halls and school houses (26%)), 11% are houses and 4% industrial buildings. IRSN obtained 27 real cases of remedial techniques used. The data were presented in the form of fact sheets. The primary aim of this exercise was to illustrate each of the radon reduction techniques that can be used in the different building types (with basement, ground bearing slab, crawl space). This investigation not only enabled us to show that combining passive and active techniques reduces the operating cost of the installation, but above all that it considerably improves the efficiency. The passive technique reduces the amount of radon in the building and thus reduces the necessary ventilation rate, which directly affects the cost of operating the installation. For the 27 cases recorded, we noted: (a) the application of 7 passive techniques: sealing of floors and semi-buried walls, together with improved aeration by installing ventilation openings or ventilation strips in the windows. Radon concentrations were reduced on average by a factor of 4.7. No measurement in excess of 400 Bq·m⁻³ (the limit recommended by the French public authorities) was obtained following completion of the works; (b) the application of 15 active techniques: depressurization of the underlying ground, crawl space or basement and/or pressurization of the building. Radon concentrations were reduced on average by a factor of 13.8. Radon concentrations of over 400 Bq·m⁻³ were measured in only 4 cases.

  11. Genetic programming system for building block analysis to enhance data analysis and data mining techniques

    Science.gov (United States)

    Eick, Christoph F.; Sanz, Walter D.; Zhang, Ruijian

    1999-02-01

    Recently, many computerized data mining tools and environments have been proposed for finding interesting patterns in large data collections. These tools employ techniques that originate from research in various areas, such as machine learning, statistical data analysis, and visualization. Each of these techniques makes assumptions concerning the composition of the data collection to be analyzed. If the particular data collection does not meet these assumptions well, the technique usually performs poorly. For example, decision tree tools, such as C4.5, rely on rectangular approximations, which do not perform well if the boundaries between different classes have other shapes, such as a 45 degree line or elliptical shapes. However, if we could find a transformation f of the original attribute space into one in which the class boundaries are more nearly rectangular, better approximations could be obtained. In this paper, we address the problem of finding such transformations f. We describe the features of the tool, WOLS, whose goal is the discovery of ingredients for such transformation functions f, which we call building blocks. The tool employs genetic programming and symbolic regression for this purpose. We also present and discuss the results of case studies, using the building block analysis tool, in the areas of decision tree learning and regression analysis.
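The 45-degree-boundary example can be demonstrated numerically: a single axis-aligned threshold (the split primitive of decision trees) cannot separate a diagonal boundary, but after applying the building block f(x1, x2) = x1 − x2 one threshold separates it perfectly. This is an independent illustration of the idea, not the WOLS tool itself:

```python
import numpy as np

rng = np.random.default_rng(7)
# Two classes separated by a 45-degree line: class 1 where x1 > x2
X = rng.uniform(-1, 1, size=(500, 2))
y = (X[:, 0] > X[:, 1]).astype(int)

def best_stump_accuracy(feature, y):
    """Accuracy of the best single-threshold (axis-aligned) split on one feature."""
    f = np.sort(feature)
    best = 0.0
    for i in range(1, len(f)):
        thr = 0.5 * (f[i - 1] + f[i])
        pred = (feature > thr).astype(int)
        # a stump may predict either class on either side of the threshold
        acc = max(np.mean(pred == y), np.mean(pred != y))
        best = max(best, acc)
    return best

# Best split on either raw attribute: limited by the rectangular bias
acc_raw = max(best_stump_accuracy(X[:, 0], y), best_stump_accuracy(X[:, 1], y))
# Building block f(x1, x2) = x1 - x2 makes the boundary axis-aligned
acc_transformed = best_stump_accuracy(X[:, 0] - X[:, 1], y)
```

On the raw attributes the best single split tops out around 75% accuracy, while the transformed feature is separated exactly; discovering such transformations automatically is what the genetic-programming search is for.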

  12. Geotechnical Analysis of Paleoseismic Shaking Using Liquefaction Features: Part I. Major Updating of Analysis Techniques

    Science.gov (United States)

    Olson, Scott M.; Green, Russell A.; Obermeier, Stephen F.

    2003-01-01

    A new methodology is proposed for the geotechnical analysis of strength of paleoseismic shaking using liquefaction effects. The proposed method provides recommendations for selection of both individual and regionally located test sites, techniques for validation of field data for use in back-analysis, and use of a recently developed energy-based solution to back-calculate paleoearthquake magnitude and strength of shaking. The proposed method allows investigators to assess the influence of post-earthquake density change and aging. The proposed method also describes how the back-calculations from individual sites should be integrated into a regional assessment of paleoseismic parameters.

  13. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    1988-06-01

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987 in order to assess the present technical status of nuclear borehole logging techniques, to find out the well established applications and the development trends. It contains a summary report giving a comprehensive overview of the techniques and applications and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  14. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting in 1989, a technique known as the wavelet transform (WT) has been applied successfully to the analysis of different types of spectra. WT offers certain advantages over the Fourier transform for the analysis of signals. A review of the use of this technique across different fields of elemental analysis is presented
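One advantage of the WT over the Fourier transform is that its coefficients are localized, so a sharp spectral peak shows up at a specific position in the detail coefficients. A minimal sketch of the Haar discrete wavelet transform (the simplest wavelet; the synthetic "spectrum" is illustrative):

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficients; the input length must be even.
    """
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)   # pairwise averages
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)   # pairwise differences
    return approx, detail

def haar_decompose(signal, levels):
    """Multi-level decomposition: [approx_n, detail_n, ..., detail_1]."""
    coeffs = []
    current = np.asarray(signal, dtype=float)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        coeffs.append(detail)
    coeffs.append(current)
    return coeffs[::-1]

# A flat baseline with one sharp peak: the detail coefficients localize it
spectrum = np.ones(64)
spectrum[30] += 5.0
coeffs = haar_decompose(spectrum, 3)
```

Because the Haar basis is orthonormal, the total energy of the coefficients equals that of the signal, and the first-level detail coefficients are zero everywhere except at the pair containing the peak, which is what makes wavelets useful for peak detection and denoising in spectra.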

  15. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…
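One widely used criterion for the number of factors is Horn's parallel analysis: retain factors whose correlation-matrix eigenvalues exceed those obtained from random data of the same shape. A sketch on synthetic data with two latent factors (the abstract is truncated, so this is a generic illustration, not necessarily one of the criteria the study evaluates):

```python
import numpy as np

def parallel_analysis(data, n_sims=200, percentile=95, seed=0):
    """Horn's parallel analysis: count leading eigenvalues of the observed
    correlation matrix that exceed the percentile of random-data eigenvalues."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    obs = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
    rand = np.empty((n_sims, p))
    for i in range(n_sims):
        r = rng.normal(size=(n, p))
        rand[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
    threshold = np.percentile(rand, percentile, axis=0)
    n_factors = 0
    for o, t in zip(obs, threshold):   # stop at the first non-exceeding eigenvalue
        if o > t:
            n_factors += 1
        else:
            break
    return n_factors

# Synthetic data: two latent factors driving six observed variables
rng = np.random.default_rng(1)
f = rng.normal(size=(300, 2))
loadings = np.array([[0.9, 0.0], [0.8, 0.1], [0.7, 0.0],
                     [0.0, 0.9], [0.1, 0.8], [0.0, 0.7]])
data = f @ loadings.T + 0.4 * rng.normal(size=(300, 6))
```

For P-technique data the same computation would be applied to one subject's multivariate time series, where serial dependence complicates the random benchmark, which is part of what makes the within-subjects case an open question.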

  16. A quantitative analysis of rotary, ultrasonic and manual techniques to treat proximally flattened root canals

    Directory of Open Access Journals (Sweden)

    Fabiana Soares Grecca

    2007-04-01

    OBJECTIVE: The efficiency of rotary, manual and ultrasonic root canal instrumentation techniques was investigated in proximally flattened root canals. MATERIAL AND METHODS: Forty human mandibular left and right central incisors, lateral incisors and premolars were used. The pulp tissue was removed and the root canals were filled with red dye. Teeth were instrumented using three techniques: (i) the K3 and ProTaper rotary systems; (ii) an ultrasonic crown-down technique; and (iii) a progressive manual technique. Roots were bisected longitudinally in a buccolingual direction. The instrumented canal walls were digitally captured and the images obtained were analyzed using the Sigma Scan software. Canal walls were evaluated for total canal wall area versus the non-instrumented area on which dye remained. RESULTS: No statistically significant difference was found between the instrumentation techniques studied (p<0.05). CONCLUSION: The findings of this study showed that no instrumentation technique was 100% efficient in removing the dye.

  17. Progress on 129I analysis and its application in environmental and geological researches

    DEFF Research Database (Denmark)

    Fan, Yukun; Hou, Xiaolin; Zhou, Weijian

    2013-01-01

    the interferences, as well as preparation of suitable target for AMS measurement. The major applications in environmental and geological researches are reviewed, which mainly focus on the new progress and potential development in the future. The application of 129I in the investigation of radioactive contamination...... in environmental level. Based on its source terms, chemical properties and environmental behaviors, 129Ican be applied for geological dating in a range of 2–80Ma, investigation of formation and migration of hydrocarbon, circulation of ocean water, atmospheric process of iodine, as well as reconstruction...... of dispersion and migration of short-lived radioisotopes of iodine released from nuclear accidents. This article aims to summarize and critically compare the analytical techniques used for 129I measurement and chemical methods for separation of iodine from various sample matrices, purification from...

  18. Progression In The Concepts Of Cognitive Sense Wireless Networks - An Analysis Report

    Science.gov (United States)

    Ajay, V. P.; Nesasudha, M.

    2017-10-01

    This paper traces the conception of these networks, their primary goals (from day one to the present), the changes they had to undergo to reach their present form, and the developments that are in progress and in store for further standardization. The analysis gives particular attention to the specifics of Cognitive Radio Networks, which make use of dynamic spectrum access procedures framed for better utilization of the available spectrum resources. The main conceptual difficulties and current research trends are also discussed in terms of real-time implementation.

  19. Experimental Analysis of Temperature Differences During Implant Site Preparation: Continuous Drilling Technique Versus Intermittent Drilling Technique.

    Science.gov (United States)

    Di Fiore, Adolfo; Sivolella, Stefano; Stocco, Elena; Favero, Vittorio; Stellini, Edoardo

    2018-02-01

    Implant site preparation through drilling procedures may cause bone thermonecrosis. The aim of this in vitro study was to evaluate, using a thermal probe, overheating at implant sites during osteotomies through 2 different drilling methods (continuous drilling technique versus intermittent drilling technique) using irrigation at different temperatures. Five implant sites 13 mm in length were performed on 16 blocks (fresh bovine ribs), for a total of 80 implant sites. The PT-100 thermal probe was positioned 5 mm from each site. Two physiological refrigerant solutions were used: one at 23.7°C and one at 6.0°C. Four experimental groups were considered: group A (continuous drilling with physiological solution at 23.7°C), group B (intermittent drilling with physiological solution at 23.7°C), group C (continuous drilling with physiological solution at 6.0°C), and group D (intermittent drilling with physiological solution at 6.0°C). The Wilcoxon rank-sum test (2-tailed) was used to compare groups. While there was no difference between group A and group B (W = 86; P = .45), statistically significant differences were observed between experimental groups A and C (W = 0; P = .0001), B and D (W = 45; P = .0005), and C and D (W = 41; P = .003). Implant site preparation did not affect the overheating of the bone. Statistically significant differences were found with the refrigerant solutions. Using both irrigating solutions, bone temperature did not exceed 47°C.
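    The Wilcoxon rank-sum statistic W reported above is just the sum of the ranks of one sample within the pooled, jointly ranked data. A minimal sketch, using small hypothetical peak-temperature readings rather than the study's actual measurements:

```python
import numpy as np

def rank_sum_w(a, b):
    """Wilcoxon rank-sum statistic: sum of the ranks of sample `a`
    within the pooled data (ties receive their average rank)."""
    pooled = np.concatenate([a, b])
    order = pooled.argsort()
    ranks = np.empty(len(pooled), dtype=float)
    ranks[order] = np.arange(1, len(pooled) + 1)
    for v in np.unique(pooled):          # average ranks over tied values
        mask = pooled == v
        ranks[mask] = ranks[mask].mean()
    return ranks[: len(a)].sum()

# Hypothetical peak temperatures (°C) near the implant site, two irrigation groups
room_temp_irrigation = np.array([36.1, 35.4, 36.8, 35.9, 36.3])
cooled_irrigation    = np.array([31.2, 30.8, 31.9, 30.5, 31.4])

W = rank_sum_w(cooled_irrigation, room_temp_irrigation)
```

    Here every cooled-irrigation reading is below every room-temperature reading, so the cooled sample takes ranks 1 through 5 and W reaches its minimum of 15; in practice W is compared against its null distribution (or a normal approximation) to obtain the P values quoted in the abstract.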

  20. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    The paper deals with the infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper provides an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique and the IR Contrast feature imaging application, which are based on the models provided in this paper.

  1. An analysis of batting backlift techniques among coached and ...

    African Journals Online (AJOL)

    One of the first principles of cricket batsmanship for batting coaches is to teach junior cricketers to play using a straight bat. This requires the bat to be lifted directly towards the stumps with the bat face facing downwards. No study has yet examined whether there are differences in the batting backlift techniques (BTT) of ...

  2. Protease analysis by zymography: a review on techniques and patents.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2009-01-01

    Zymography, the detection of enzymatic activity on gel electrophoresis, is a technique that has been described in the literature for at least the past 50 years. Although a diverse range of enzymes, especially proteases, can be detected, advances and improvements have been slower than in other molecular biology, biotechnology and chromatography techniques. Most of the reviews and patents published focus on the technique as an element of enzymatic testing, but detailed analytical studies are scarce. Patents referring to zymography per se are few, and the technique itself is hardly an important issue in the titles or keywords of many scientific publications. This review offers a brief condensation of the works published so far dealing with the identification of proteolytic enzymes in electrophoretic gel supports, and of its variations such as 2-D zymography, real-time zymography, and in-situ zymography. Moreover, the new tendencies of this method are surveyed with regard to the substrates used and activity visualization. What to expect from zymography in the near future is also addressed.

  3. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at pressures up to 10^8 Pa using particle diameters of 1.7 µm. This increases the efficiency, the resolution and the speed of the separation. Four aque...

  4. Sixth Australian conference on nuclear techniques of analysis: proceedings

    International Nuclear Information System (INIS)

    1989-01-01

    These proceedings contain the abstracts of 77 lectures. The topics focus on instrumentation, nuclear techniques and their applications for material science, surfaces, archaeometry, art, geological, environmental and biomedical studies. An outline of the Australian facilities available for research purposes is also provided. Separate abstracts were prepared for the individual papers in this volume

  5. Tape Stripping Technique for Stratum Corneum Protein Analysis

    DEFF Research Database (Denmark)

    Clausen, Maja-Lisa; Slotved, H.-C.; Krogfelt, Karen Angeliki

    2016-01-01

    The aim of this study was to investigate the amount of protein in stratum corneum in atopic dermatitis (AD) patients and healthy controls, using tape stripping technique. Furthermore, to compare two different methods for protein assessment. Tape stripping was performed in AD patients and healthy ...

  6. Alternative Colposcopy Techniques: A Systematic Review and Meta-analysis

    NARCIS (Netherlands)

    Hermens, M.; Ebisch, R.M.F.; Galaal, K.; Bekkers, R.L.M.

    2016-01-01

    OBJECTIVE: To assess the diagnostic value of alternative (digital) colposcopy techniques for detection of cervical intraepithelial neoplasia (CIN) 2 or worse in a colposcopy population. DATA SOURCES: MEDLINE, EMBASE, ClinicalTrials.gov, and the Cochrane Library were searched from inception up to

  7. VLBI FOR GRAVITY PROBE B. IV. A NEW ASTROMETRIC ANALYSIS TECHNIQUE AND A COMPARISON WITH RESULTS FROM OTHER TECHNIQUES

    International Nuclear Information System (INIS)

    Lebach, D. E.; Ratner, M. I.; Shapiro, I. I.; Bartel, N.; Bietenholz, M. F.; Lederman, J. I.; Ransom, R. R.; Campbell, R. M.; Gordon, D.; Lestrade, J.-F.

    2012-01-01

    When very long baseline interferometry (VLBI) observations are used to determine the position or motion of a radio source relative to reference sources nearby on the sky, the astrometric information is usually obtained via (1) phase-referenced maps or (2) parametric model fits to measured fringe phases or multiband delays. In this paper, we describe a 'merged' analysis technique which combines some of the most important advantages of these other two approaches. In particular, our merged technique combines the superior model-correction capabilities of parametric model fits with the ability of phase-referenced maps to yield astrometric measurements of sources that are too weak to be used in parametric model fits. We compare the results from this merged technique with the results from phase-referenced maps and from parametric model fits in the analysis of astrometric VLBI observations of the radio-bright star IM Pegasi (HR 8703) and the radio source B2252+172 nearby on the sky. In these studies we use central-core components of radio sources 3C 454.3 and B2250+194 as our positional references. We obtain astrometric results for IM Peg with our merged technique even when the source is too weak to be used in parametric model fits, and we find that our merged technique yields astrometric results superior to the phase-referenced mapping technique. We used our merged technique to estimate the proper motion and other astrometric parameters of IM Peg in support of the NASA/Stanford Gravity Probe B mission.

  8. Glaucoma progression detection by retinal nerve fiber layer measurement using scanning laser polarimetry: event and trend analysis.

    Science.gov (United States)

    Moon, Byung Gil; Sung, Kyung Rim; Cho, Jung Woo; Kang, Sung Yong; Yun, Sung-Cheol; Na, Jung Hwa; Lee, Youngrok; Kook, Michael S

    2012-06-01

    To evaluate the use of scanning laser polarimetry (SLP, GDx VCC) to measure the retinal nerve fiber layer (RNFL) thickness in order to evaluate the progression of glaucoma. Test-retest measurement variability was determined in 47 glaucomatous eyes. One eye each from 152 glaucomatous patients with at least 4 years of follow-up was enrolled. Visual field (VF) loss progression was determined by both event analysis (EA, Humphrey guided progression analysis) and trend analysis (TA, linear regression analysis of the visual field index). SLP progression was defined as a reduction of RNFL exceeding the predetermined repeatability coefficient in three consecutive exams, as compared to the baseline measure (EA). The slope of RNFL thickness change over time was determined by linear regression analysis (TA). Twenty-two eyes (14.5%) progressed according to the VF EA, 16 (10.5%) by VF TA, 37 (24.3%) by SLP EA and 19 (12.5%) by SLP TA. Agreement between VF and SLP progression was poor in both EA and TA (VF EA vs. SLP EA, k = 0.110; VF TA vs. SLP TA, k = 0.129). The mean (±standard deviation) progression rate of RNFL thickness as measured by SLP TA did not significantly differ between VF EA progressors and non-progressors (-0.224 ± 0.148 µm/yr vs. -0.218 ± 0.151 µm/yr, p = 0.874). SLP TA and EA showed similar levels of sensitivity when VF progression was considered as the reference standard. RNFL thickness as measurement by SLP was shown to be capable of detecting glaucoma progression. Both EA and TA of SLP showed poor agreement with VF outcomes in detecting glaucoma progression.
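    The trend analysis (TA) described above is a linear regression of RNFL thickness against follow-up time, with the fitted slope read in µm/year. A minimal sketch with hypothetical measurements and a hypothetical progression threshold (the study's own criteria are based on test-retest variability, not this fixed cutoff):

```python
import numpy as np

# Hypothetical RNFL thickness (µm) at five yearly visits for one eye
years = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
rnfl  = np.array([55.0, 54.8, 54.5, 54.1, 53.9])

# Trend analysis: least-squares slope of thickness over time, in µm/year
slope, intercept = np.polyfit(years, rnfl, 1)

# Flag progression if thinning is faster than a hypothetical -0.2 µm/year
progressing = slope < -0.2
```

    Event analysis (EA), by contrast, compares each follow-up value against baseline and flags a confirmed drop exceeding the repeatability coefficient, so the two approaches can disagree on the same eye, as the poor agreement statistics in the abstract illustrate.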

  9. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; uncertainty analysis of results
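    The standard mechanics behind such an analysis are combining independent standard uncertainties in quadrature and expanding by a coverage factor. The budget below is entirely hypothetical, a sketch of the arithmetic rather than any values from the presentation:

```python
import math

# Hypothetical uncertainty budget for a PV module power measurement (watts)
measured_power = 250.0
u_components = {
    "irradiance_sensor": 1.2,       # standard uncertainties, W
    "temperature_correction": 0.8,
    "data_acquisition": 0.5,
}

# Combined standard uncertainty: root-sum-of-squares of independent components
u_c = math.sqrt(sum(u ** 2 for u in u_components.values()))

# Expanded uncertainty at roughly 95% coverage (coverage factor k = 2)
U = 2.0 * u_c
interval = (measured_power - U, measured_power + U)
```

    The resulting interval is the statement "we believe the true value lies within measured_power ± U", which is exactly the quantity the abstract argues every reported PV result should carry.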

  10. Models, Web-Based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance

    National Research Council Canada - National Science Library

    Hill, Raymond

    2001-01-01

    ... Laboratory, Logistics Research Division, Logistics Readiness Branch to propose a research agenda entitled, "Models, Web-based Simulations, and Integrated Analysis Techniques for Improved Logistical Performance...

  11. Progress in study of Prespa Lake using nuclear and related techniques (IAEA Regional Project RER/8/008)

    International Nuclear Information System (INIS)

    Anovski, Todor

    2001-09-01

    One of the main objectives of the IAEA Regional Project RER/8/008, entitled Study of Prespa Lake Using Nuclear and Related Techniques, was to provide a scientific basis for sustainable environmental management of Lake Prespa (three lakes, Ohrid, Big Prespa and Small Prespa, lie on the borders between Albania, the Republic of Macedonia and Greece, separated by the Mali i Thate and Galichica mountains, which are mostly karstified), see Fig. 1. To this end, investigations were initiated concerning the hydrogeology, water quality (physico-chemical, biological and radiological characteristics) and water balance, determined from the distribution of environmental isotopes (i.e. H, D, T, O-18, etc.), artificial water tracers and other relevant analytical techniques such as AAS, HPLC, total α- and β-activity, α- and γ-spectrometry, as well as ultrasonic measurements (defining the lake bottom profile). Through regional cooperation, with scientists from Albania, Greece and the Republic of Macedonia participating in the implementation of the project over one hydrological year, valuable results were obtained, a part of which are presented in this report. This cooperation was the only way of providing the data needed for a better understanding of, among other things, the water quality of Lake Prespa and its hydrological relationship to Lake Ohrid, the two together representing a unique regional hydro system in the world. (Author)

  12. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Full Text Available Introduction. The article is devoted to the development of techniques for quantitative analysis of lime flowers, in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g with methylene chloride, considering that in its extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by extracting six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated by the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. The total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, extraction of the flavonoid aglycones with ethyl acetate and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  13. Fault Tree Analysis with Temporal Gates and Model Checking Technique for Qualitative System Safety Analysis

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2010-01-01

    Fault tree analysis (FTA) has been one of the most widely used safety analysis techniques in the nuclear industry, yet it suffers from several drawbacks: it uses only static gates and hence cannot capture dynamic behaviors of a complex system precisely; it lacks rigorous semantics; and its reasoning process, which checks whether basic events really cause top events, is done manually and is therefore very labor-intensive and time-consuming for complex systems. Although several attempts have been made to overcome these problems, they still cannot model absolute (actual) time, because they adopt a relative time concept and can capture only sequential behaviors of the system. In this work, to resolve these problems, FTA and model checking are integrated to provide formal, automated and qualitative assistance to informal and/or quantitative safety analysis. Our approach proposes to build a formal model of the system together with fault trees. We introduce several temporal gates based on timed computation tree logic (TCTL) to capture absolute-time behaviors of the system and to give concrete semantics to fault tree gates, reducing errors during the analysis, and we use the model checking technique to automate the reasoning process of FTA
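    The static-gate core of a fault tree is simply boolean composition of basic events; the temporal gates discussed above add timed semantics on top of this. A minimal sketch with an invented two-gate tree (the event names and structure are hypothetical, not from the paper):

```python
# Static fault-tree gates as boolean combinators
def AND(*inputs):
    return all(inputs)

def OR(*inputs):
    return any(inputs)

def top_event(pump_fails, valve_fails, backup_fails, power_fails):
    """Hypothetical tree: TOP occurs when the primary path is lost
    AND its mitigation is also lost."""
    primary_loss = OR(pump_fails, valve_fails)
    mitigation_loss = OR(backup_fails, power_fails)
    return AND(primary_loss, mitigation_loss)
```

    What model checking adds is automation of exactly this reasoning: instead of hand-evaluating which combinations (cut sets) of basic events raise the top event, a model checker exhaustively searches the formal system model, and the TCTL-based temporal gates further constrain when, in absolute time, each event must occur.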

  14. Educational Progress of Looked-After Children in England: A Study Using Group Trajectory Analysis.

    Science.gov (United States)

    Sutcliffe, Alastair G; Gardiner, Julian; Melhuish, Edward

    2017-09-01

    Looked-after children in local authority care are among the most disadvantaged, and measures of their well-being, including educational outcomes, are poorer than other children's. The study sample consisted of all children in England born in academic years 1993 to 1994 through 1997 to 1998 who were in local authority care at any point during the academic years 2005 to 2006 through 2012 to 2013 and for whom results of national tests in literacy and numeracy were available at ages 7, 11, and 16 (N = 47,500). Group trajectory analysis of children's educational progress identified 5 trajectory groups: low achievement, late improvement, late decline, predominant, and high achievement. Being looked after earlier was associated with a higher probability of following a high achievement trajectory and a lower probability of following a late decline trajectory. For children first looked after between ages 7 and 16, having a longer total time looked after by age 16 was associated with a higher probability of following a high achievement trajectory. For children with poor outcomes at ages 7 and 11, being looked after by age 16 was associated with an increased chance of educational improvement by age 16. This study provides evidence that early entry into care can reduce the risk of poor educational outcomes. It also establishes group trajectory analysis as an effective method for analyzing the educational progress of looked-after children, with the particular strength that it allows factors associated with a late decline or improvement in educational progress to be identified. Copyright © 2017 by the American Academy of Pediatrics.
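    Group trajectory analysis fits a finite mixture of growth curves; a crude stand-in for the idea is labeling each child's score trajectory by its fitted slope. The scores, ages, and threshold below are all hypothetical, and this slope rule is a deliberate simplification of the mixture modeling the study actually uses:

```python
import numpy as np

# Hypothetical test scores at the three national-test ages for three children
ages = np.array([7.0, 11.0, 16.0])
scores = np.array([
    [40.0, 55.0, 70.0],   # steadily improving
    [70.0, 68.0, 50.0],   # late decline
    [60.0, 61.0, 59.0],   # broadly flat
])

def label(trajectory, flat_tol=0.5):
    """Classify a trajectory by the sign of its least-squares slope
    (points per year), with a hypothetical tolerance band for 'stable'."""
    slope = np.polyfit(ages, trajectory, 1)[0]
    if slope > flat_tol:
        return "improvement"
    if slope < -flat_tol:
        return "decline"
    return "stable"

labels = [label(t) for t in scores]
```

    A genuine group-based trajectory model would instead estimate the number of groups and each child's posterior group membership jointly, which is what lets the study attach covariates (age at entry, time in care) to the late-decline and high-achievement groups.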

  15. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
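    Of the techniques listed, CUSUM is the most compact to illustrate: it accumulates evidence of a persistent shift in the inventory-difference series and alarms when the cumulative sum crosses a decision threshold. The series, allowance k, and threshold h below are hypothetical textbook-style values, not figures from the document:

```python
def cusum(inventory_diffs, k=0.5, h=4.0):
    """One-sided CUSUM on standardized inventory differences.
    k: allowance (slack) and h: decision threshold, both in
    standard-deviation units. Returns the index of the first
    alarm, or -1 if the series never crosses the threshold."""
    s = 0.0
    for i, x in enumerate(inventory_diffs):
        s = max(0.0, s + x - k)   # accumulate evidence of a positive shift
        if s > h:
            return i
    return -1

# Hypothetical standardized inventory differences: in control at first,
# then a sustained ~2-sigma shift suggesting a protracted loss
series = [0.1, -0.3, 0.2, -0.1, 2.0, 1.8, 2.2, 1.9, 2.1]
alarm_at = cusum(series)
```

    Because the statistic resets at zero while the process is in control but grows steadily under a sustained shift, CUSUM detects small protracted losses far sooner than inspecting individual inventory differences, which is precisely the loss-detection enhancement the document examines.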

  16. [Research progress and application prospect of near infrared spectroscopy in soil nutrition analysis].

    Science.gov (United States)

    Ding, Hai-quan; Lu, Qi-peng

    2012-01-01

    "Digital agriculture" or "precision agriculture" is an important direction of modern agricultural technology. It combines modern information techniques with traditional agriculture and has become a hot field of international agricultural research in recent years. As a nondestructive, real-time, effective and exact analysis technique by which precision agriculture can be carried out, near infrared spectroscopy has vast prospects in agrology and has gradually gained recognition. The present paper reviews the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should be based on portable NIR spectrographs, in order to acquire qualitative or quantitative information from real-time measurement in the field. In addition, NIRS could be combined with space remote sensing to monitor, at the macro scale, how crops are growing and the nutrition they need, and thus radically change the current state of our country's agriculture.

  17. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent present naturally in groundwater due to certain minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may come from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water using high-tech instruments such as the atomic absorption spectrometer (hydride generation). Because arsenic concentrations as low as 1 ppb cannot be determined easily with the simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to an arsenic concentration of 1 ppb.
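    Quantitation in such a spectrophotometric method rests on a Beer-Lambert calibration line: absorbance of the colored arsenic complex versus standard concentrations, inverted to read unknowns. The standards and absorbances below are hypothetical illustration values, not the paper's calibration data:

```python
import numpy as np

# Hypothetical calibration standards for a colorimetric arsenic method:
# absorbance is assumed linear in concentration (Beer's law)
conc_ppb   = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
absorbance = np.array([0.002, 0.021, 0.040, 0.097, 0.192])

# Least-squares calibration line: A = slope * c + intercept
slope, intercept = np.polyfit(conc_ppb, absorbance, 1)

def ppb_from_absorbance(a):
    """Invert the calibration line to estimate concentration (ppb)."""
    return (a - intercept) / slope
```

    The practical challenge the abstract describes is that near 1 ppb the signal sits close to the blank, so the method modification is aimed at steepening or stabilizing exactly this calibration relationship at its low end.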

  18. New data mining technique for multidimensional aircraft trajectories analysis

    Directory of Open Access Journals (Sweden)

    Solntseva-Chaley Maria

    2016-01-01

    Full Text Available Under conditions of growing airport workload, airspace sectorization is necessary for accident prevention. Sectorization should be performed taking into account the regular traffic of aircraft. A new data mining technique that solves this problem is described. It produces a stable partition of a sample of aircraft intent trajectories into groups (asymptotically converging beams) corresponding to the same runway approaches. The method takes into account special geometric characteristics (curvature, torsion) and multiple intersections of the multidimensional space trajectories of the aircraft.

  19. Application of radioisotope techniques in analysis of environmental pollutants

    International Nuclear Information System (INIS)

    Kyrs, M.; Moravec, A.

    1984-01-01

    A survey is tabulated of the use of radioisotope techniques, giving the detected pollutant and the sensitivity and accuracy of the method. The most frequently used principle is the substoichiometric variant of isotope dilution which may be divided into the method of isotope dilution and the radio-reagent method. Both methods are described and examples are given of the determination of pollutants. (J.P.)

  20. Analysis of photoisomerizable dyes using laser absorption and fluorescence techniques

    International Nuclear Information System (INIS)

    Duchowicz, R.; Di Paolo, R.E.; Scaffardi, L.; Tocho, J.O.

    1992-01-01

    The attention of the present report has been directed mainly to the description of laser-based techniques developed in order to obtain kinetic and spectroscopic properties of polymethine cyanine dyes in solution. Special attention was dedicated to photoisomerizable molecules where the absorption spectra of both isomers are strongly overlapped. As an example, measurements of two different dyes of laser technological interest, DTCI and DODCI were performed. The developed methods provide a complete quantitative description of photophysical processes. (author). 14 refs, 6 figs

  1. An ASIC Low Power Primer Analysis, Techniques and Specification

    CERN Document Server

    Chadha, Rakesh

    2013-01-01

    This book provides an invaluable primer on the techniques utilized in the design of low power digital semiconductor devices.  Readers will benefit from the hands-on approach which starts from the ground up, explaining with basic examples what power is, how it is measured and how it impacts the design process of application-specific integrated circuits (ASICs).  The authors use both the Unified Power Format (UPF) and Common Power Format (CPF) to describe in detail the power intent for an ASIC and then guide readers through a variety of architectural and implementation techniques that will help meet the power intent.  From analyzing system power consumption, to techniques that can be employed in a low power design, to a detailed description of two alternate standards for capturing the power directives at various phases of the design, this book is filled with information that will give ASIC designers a competitive edge in low-power design. Starts from the ground-up and explains what power is, how it is measur...

  2. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: Increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; uncertainty analysis of results.

  4. Development of synchrotron x-ray micro-spectroscopic techniques and application to problems in low temperature geochemistry. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-10-01

    The focus of the technical development effort has been the development of apparatus and techniques for the utilization of X-ray Fluorescence (XRF), Extended X-ray Absorption Fine Structure (EXAFS) and X-ray Absorption Near Edge Structure (XANES) spectroscopies in a microprobe mode. The present XRM uses white synchrotron radiation (3 to 30 keV) from a bending magnet for trace element analyses using the x-ray fluorescence technique. Two significant improvements to this device have recently been implemented. Focusing mirror: an 8:1 ellipsoidal mirror was installed in the X26A beamline to focus the incident synchrotron radiation and thereby increase the flux on the sample by about a factor of 30. Incident beam monochromator: the monochromator has been successfully installed and commissioned in the X26A beamline upstream of the mirror to permit analyses with focused monochromatic radiation. The monochromator consists of a channel-cut silicon (111) crystal driven by a Klinger stepping-motor translator. We have demonstrated that the operating range of this instrument is 4 to 20 keV with 0.01 eV steps, producing a beam with an energy bandwidth of approximately 10^-4. The primary purpose of the monochromator is x-ray absorption spectroscopy (XAS) measurements, but it is also used for selective excitation in trace element microanalysis. To date, we have conducted XANES studies on Ti, Cr, Fe, Ce and U, spanning the entire accessible energy range and including both K and L edge spectra. Practical detection limits for microXANES are 10-100 ppm for 100 µm spots.

  5. New trends and techniques in chromosome aberration analysis

    International Nuclear Information System (INIS)

    Bender, M.A.

    1978-01-01

    The following topics are discussed: automation of chromosome analysis; storage of fixed cells from cultures of lymphocytes obtained routinely during periodic employee medical examinations; analysis of banded chromosomes; identification of first division metaphases; sister chromatid exchange; and patterns of aberration induction

  6. Efficacy of the Principal Components Analysis Techniques Using ...

    African Journals Online (AJOL)

    Second, the paper reports results of principal components analysis after the artificial data were submitted to three commonly used procedures: the scree plot, the Kaiser rule, and modified Horn's parallel analysis, and demonstrates the pedagogical utility of using artificial data in teaching advanced quantitative concepts. The results ...
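    The component-retention rules named above can be illustrated with a short sketch; the observed eigenvalues and the random-data thresholds below are invented for illustration, not taken from the paper:

```python
def kaiser_rule(eigenvalues):
    """Retain components whose eigenvalue exceeds 1 (correlation matrix)."""
    return [i for i, ev in enumerate(eigenvalues) if ev > 1.0]

def parallel_analysis(eigenvalues, random_thresholds):
    """Horn's rule: retain while the observed eigenvalue exceeds the
    corresponding eigenvalue obtained from random data of the same shape."""
    retained = []
    for i, (ev, thr) in enumerate(zip(eigenvalues, random_thresholds)):
        if ev > thr:
            retained.append(i)
        else:
            break  # components are ordered, so stop at the first failure
    return retained

# Hypothetical eigenvalues from a 6-variable correlation matrix, and
# hypothetical 95th-percentile eigenvalues from simulated random data.
observed = [2.8, 1.4, 1.05, 0.45, 0.20, 0.10]
random_95th = [1.6, 1.3, 1.1, 0.9, 0.7, 0.5]

print(kaiser_rule(observed))                      # Kaiser retains 3 components
print(parallel_analysis(observed, random_95th))   # Horn retains only 2
```

    The example shows the typical disagreement the procedures produce: the Kaiser rule over-retains relative to parallel analysis when an eigenvalue is only marginally above 1.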

  7. Improvements in analysis techniques for segmented mirror arrays

    Science.gov (United States)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, which allows surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement-vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking analysis has the advantage of being comparable to the measurements used in assembly of the hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  8. Cross-impact analysis experimentation using two techniques to ...

    African Journals Online (AJOL)

    Cross-impact analysis relies on decision makers to provide marginal probability estimates of interdependent events. Generally, these have to be revised in order to ensure overall system coherency. This paper describes cross-impact analysis experimentation in which a Monte Carlo based approach and a difference equation ...
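    The coherency problem described above can be illustrated with a minimal Monte Carlo sketch (a generic illustration, not the paper's method); all probabilities are hypothetical:

```python
import random

def simulate_marginal_b(p_a, p_b_given_a, p_b_given_not_a,
                        trials=100_000, seed=1):
    """Monte Carlo estimate of the marginal P(B) implied by the
    analyst's conditional (cross-impact) estimates."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        a_occurs = rng.random() < p_a
        p_b = p_b_given_a if a_occurs else p_b_given_not_a
        if rng.random() < p_b:
            hits += 1
    return hits / trials

# Hypothetical analyst inputs for two interdependent events A and B.
p_a, p_b_given_a, p_b_given_not_a = 0.4, 0.7, 0.2
implied = simulate_marginal_b(p_a, p_b_given_a, p_b_given_not_a)

# Coherency check: a directly stated marginal of 0.5 disagrees with the
# implied value 0.4*0.7 + 0.6*0.2 = 0.40, so the estimates need revision.
stated = 0.5
print(f"implied P(B) = {implied:.3f}, stated P(B) = {stated}")
```

    When the simulated marginal and the analyst's stated marginal disagree, one of the two sets of estimates must be revised, which is exactly the revision step the abstract refers to.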

  9. Plasma Exchange for Renal Vasculitis and Idiopathic Rapidly Progressive Glomerulonephritis: A Meta-analysis

    DEFF Research Database (Denmark)

    Walsh, Michael; Catapano, Fausta; Szpirt, Wladimir

    2010-01-01

    BACKGROUND: Plasma exchange may be effective adjunctive treatment for renal vasculitis. We performed a systematic review and meta-analysis of randomized controlled trials of plasma exchange for renal vasculitis. STUDY DESIGN: Systematic review and meta-analysis of articles identified from electronic databases, bibliographies, and studies identified by experts. Data were abstracted in parallel by 2 reviewers. SETTING & POPULATION: Adults with idiopathic renal vasculitis or rapidly progressive glomerulonephritis. SELECTION CRITERIA FOR STUDIES: Randomized controlled trials that compared ... LIMITATIONS: Although the primary result was statistically significant, there is insufficient statistical information to reliably determine whether plasma exchange decreases the composite of end-stage renal disease or death. CONCLUSIONS: Plasma exchange may decrease the composite end point of end...
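    As a generic illustration of the pooling step such a meta-analysis relies on, here is an inverse-variance fixed-effect sketch on the log risk-ratio scale; the trial effects and variances are hypothetical, not the data of the cited review:

```python
import math

def fixed_effect_pool(log_effects, variances):
    """Inverse-variance (fixed-effect) pooling of trial effect estimates,
    here on the log risk-ratio scale."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, log_effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return pooled, se

# Hypothetical log risk ratios and variances from three trials.
log_rr = [math.log(0.8), math.log(0.6), math.log(0.9)]
var = [0.05, 0.10, 0.04]

pooled, se = fixed_effect_pool(log_rr, var)
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"pooled RR = {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(lo):.2f}-{math.exp(hi):.2f})")
```

    Precisely because the pooled standard error shrinks only with the total statistical information, a significant point estimate can still leave a composite outcome undetermined, as the LIMITATIONS section above notes.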

  10. Aquatic Toxic Analysis by Monitoring Fish Behavior Using Computer Vision: A Recent Progress

    Directory of Open Access Journals (Sweden)

    Chunlei Xia

    2018-01-01

    Full Text Available Video-tracking-based biological early warning systems have made great progress with advanced computer vision and machine learning methods. The ability to video-track multiple biological organisms has been largely improved in recent years. Video-based behavioral monitoring has become a common tool for acquiring quantified behavioral data for aquatic risk assessment. Investigation of behavioral responses under chemical and environmental stress has been boosted by rapidly developing machine learning and artificial intelligence. In this paper, we introduce the fundamentals of video tracking and present pioneering work in the precise tracking of groups of individuals in 2D and 3D space. Technical and practical issues encountered in video tracking are explained. Subsequently, toxic analysis based on fish behavioral data is summarized. Frequently used computational methods and machine learning are explained along with their applications in aquatic toxicity detection and abnormal pattern analysis. Finally, the advantages of recently developed deep learning approaches in toxicity prediction are presented.

  11. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques are proven to be useful in terms of software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...

  12. Air pollution studies in Tianjing city using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    Ni Bangfa; Tian Weizhi; Nie Nuiling; Wang Pingsheng

    1999-01-01

    Airborne sampling at two sites, in industrial and residential areas, was carried out in Tianjing city during February and June using a PM-10 sampler, and the samples were analyzed by NAA techniques. A comparison of air pollution between urban and rural areas in Tianjing city was made using neutron activation analysis techniques and other data-analysis techniques. (author)

  13. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As a main result, this technique will provide the design engineer with decision tables for fault handling...

  14. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.

    1999-01-01

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As a main result, this technique will provide the design engineer with decision tables for fault handling...

  15. The Critic and the Computer: A Multiple Technique Analysis of the "ABC Evening News."

    Science.gov (United States)

    Bantz, Charles R.

    1979-01-01

    Analyzes "ABC Evening News" coverage of the 1972 presidential campaign with a rhetorical analytic technique and a computerized text analysis technique. Compares results and considers three possibilities: (1) the role of technique in research results; (2) different aspects of a phenomenon; and (3) the degree of confidence of results. (JMF)

  16. Uranium solution mining cost estimating technique: means for rapid comparative analysis of deposits

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    Twelve graphs provide a technique for determining relative cost ranges for uranium solution mining projects. The use of the technique can provide a consistent framework for rapid comparative analysis of various properties of mining situations. The technique is also useful to determine the sensitivities of cost figures to incremental changes in mining factors or deposit characteristics
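    The sensitivity idea described above can be sketched as a one-at-a-time analysis over a deliberately simplified cost model; the model form, factor names, and all numbers are assumptions for illustration, not the technique's actual graphs:

```python
def unit_cost(ore_grade, recovery, reagent_cost, fixed_cost=5.0):
    """Hypothetical unit-cost model ($/lb U3O8): a fixed component plus a
    reagent component that scales inversely with grade and recovery."""
    return fixed_cost + reagent_cost / (ore_grade * recovery)

base = dict(ore_grade=0.08, recovery=0.7, reagent_cost=0.9)
base_cost = unit_cost(**base)
print(f"base cost = {base_cost:.2f} $/lb")

# One-at-a-time sensitivity: perturb each deposit/mining factor by +10%
# and report the incremental change in unit cost.
for factor in base:
    bumped = dict(base)
    bumped[factor] *= 1.10
    delta = unit_cost(**bumped) - base_cost
    print(f"{factor:<12}: {delta:+.2f} $/lb per +10% change")
```

    Ranking the deltas shows which factor the cost figure is most sensitive to, which is the kind of incremental comparison the abstract describes.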

  17. Meta-analysis of gene expression signatures defining the epithelial to mesenchymal transition during cancer progression.

    Directory of Open Access Journals (Sweden)

    Christian J Gröger

    Full Text Available The epithelial to mesenchymal transition (EMT) represents a crucial event during cancer progression and dissemination. EMT is the conversion of carcinoma cells from an epithelial to a mesenchymal phenotype, which is associated with higher cell motility as well as enhanced chemoresistance and cancer stemness. Notably, EMT has been increasingly recognized as an early event of metastasis. Numerous gene expression studies (GES) have been conducted to obtain transcriptome signatures and marker genes to understand the regulatory mechanisms underlying EMT. Yet, no meta-analysis considering the multitude of GES of EMT has been performed to comprehensively elaborate the core genes in this process. Here we report a meta-analysis of 18 independent and published GES of EMT which focused on different cell types and treatment modalities. Computational analysis revealed clustering of GES according to the type of treatment rather than to cell type. GES of EMT induced via transforming growth factor-β and tumor necrosis factor-α treatment yielded uniformly defined clusters, while GES of models with alternative EMT induction clustered in a more complex fashion. In addition, we identified the up- and downregulated genes that were shared among the multitude of GES. This core gene list includes well-known EMT markers as well as novel genes so far not described in this process. Furthermore, several genes of the EMT core gene list significantly correlated with impaired pathological complete response in breast cancer patients. In conclusion, this meta-analysis provides a comprehensive survey of available EMT expression signatures and fundamental insights into the mechanisms governing carcinoma progression.
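    The core-gene extraction step, finding genes shared across many signatures, can be sketched with set counting; the gene lists below are hypothetical miniatures (the paper itself analyzes 18 signatures):

```python
from collections import Counter

def core_genes(signatures, min_support):
    """Genes appearing in at least `min_support` of the input signatures."""
    counts = Counter(g for sig in signatures for g in set(sig))
    return sorted(g for g, c in counts.items() if c >= min_support)

# Hypothetical upregulated gene lists from three EMT expression studies.
sig1 = ["VIM", "SNAI1", "FN1", "ZEB1"]
sig2 = ["VIM", "FN1", "CDH2", "SNAI2"]
sig3 = ["VIM", "ZEB1", "FN1", "TWIST1"]

print(core_genes([sig1, sig2, sig3], min_support=3))  # shared by all three
```

    Raising or lowering `min_support` trades off stringency against coverage, the same trade-off a real cross-signature meta-analysis must make.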

  18. Analysis of the changes in keratoplasty indications and preferred techniques.

    Directory of Open Access Journals (Sweden)

    Stefan J Lang

    Full Text Available Recently, novel techniques introduced to the field of corneal surgery, e.g. Descemet membrane endothelial keratoplasty (DMEK) and corneal crosslinking, have extended the therapeutic options. Additionally, contact lens fitting has developed new alternatives. We herein investigated whether these techniques have affected the volume and spectrum of indications for keratoplasty in both a center more specialized in treating Fuchs' dystrophy (center 1) and a second center that is more specialized in treating keratoconus (center 2). We retrospectively reviewed the waiting lists for indication, transplantation technique and the patients' travel distances to the hospital at both centers. We reviewed a total of 3778 procedures. Fuchs' dystrophy increased from 17% (42) to 44% (150) at center 1 and from 13% (27) to 23% (62) at center 2. In center 1, DMEK increased from zero percent in 2010 to 51% in 2013. In center 2, DMEK was not performed until 2013. The percentage of patients with keratoconus slightly decreased from 15% (36) in 2009 to 12% (40) in 2013 in center 1. The respective percentages in center 2 were 28% (57) and 19% (51). In both centers, the patients' travel distances increased. The results from center 1 suggest that DMEK might increase the total number of keratoplasties. The increase in travel distance suggests that this cannot be fully attributed to recruiting less advanced patients from the hospital's proximity; the increase is rather due to more referrals from other regions. The decrease in keratoconus patients at both centers is surprising and may be attributed to optimized contact lens fitting or even to the effect of the corneal crosslinking procedure.

  19. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    OpenAIRE

    Ricardo E. Izzo; Luca Russo

    2011-01-01

    Basketball is a complex sport, which these days has become increasingly linked to its psychophysical aspects rather than to the technical ones. Therefore, it is important to make a thorough study of passing techniques from the point of view of the type of pass and its biomechanics. In terms of the type of pass used, the most frequent is the two-handed chest pass, with a frequency of 39.9%. This is followed, in terms of frequency, by one-handed passes – the baseball...

  20. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  1. Risk analysis of geothermal power plants using Failure Modes and Effects Analysis (FMEA) technique

    International Nuclear Information System (INIS)

    Feili, Hamid Reza; Akar, Navid; Lotfizadeh, Hossein; Bairampour, Mohammad; Nasiri, Sina

    2013-01-01

    Highlights: • Using Failure Modes and Effects Analysis (FMEA) to find potential failures in geothermal power plants. • We considered 5 major parts of geothermal power plants for risk analysis. • The Risk Priority Number (RPN) is calculated for all failure modes. • Corrective actions are recommended to eliminate or decrease the risk of failure modes. - Abstract: Renewable energy plays a key role in the transition toward a low carbon economy and the provision of a secure supply of energy. Geothermal energy is a versatile source as a form of renewable energy that meets popular demand. Since Geothermal Power Plants (GPPs) face various failures, a technique that allows engineering teams to eliminate or decrease potential failures is of considerable value. Because no published record of an FMEA applied to GPPs and their common failure modes was found, this paper considers the utilization of Failure Modes and Effects Analysis (FMEA) as a convenient technique for determining, classifying and analyzing common failures in typical GPPs. As a result, an appropriate risk scoring of the occurrence, detection and severity of failure modes, and computation of the Risk Priority Number (RPN) for detecting high-potential failures, is achieved. To expedite the analysis and improve its accuracy, the XFMEA software is utilized. Moreover, 5 major parts of a GPP are studied in order to propose a suitable approach for developing GPPs and increasing reliability by recommending corrective actions for each failure mode.
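    The RPN computation at the heart of FMEA is simple to sketch; the failure modes and severity/occurrence/detection ratings below are illustrative inventions, not taken from the cited study:

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of three 1-10 ratings."""
    return severity * occurrence * detection

# Hypothetical failure modes for geothermal power plant subsystems.
failure_modes = [
    ("turbine blade erosion",  8, 5, 4),
    ("brine pump seal leak",   6, 7, 3),
    ("heat exchanger scaling", 5, 8, 6),
    ("H2S sensor drift",       7, 3, 8),
]

# Rank failure modes by RPN so corrective actions target the worst first.
ranked = sorted(failure_modes, key=lambda fm: rpn(*fm[1:]), reverse=True)
for name, s, o, d in ranked:
    print(f"{name:24} RPN={rpn(s, o, d):3d}")
```

    The ranked table is the decision artifact FMEA produces: corrective actions are recommended starting from the highest-RPN modes.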

  2. Demonstration of innovative techniques for work zone safety data analysis

    Science.gov (United States)

    2009-07-15

    Based upon the results of the simulator data analysis, additional future research can be identified to validate the driving simulator in terms of similarities with Ohio work zones. For instance, the speeds observed in the simulator were greater f...

  3. Noble-TLBO MPPT Technique and its Comparative Analysis with Conventional methods implemented on Solar Photo Voltaic System

    Science.gov (United States)

    Patsariya, Ajay; Rai, Shiwani; Kumar, Yogendra, Dr.; Kirar, Mukesh, Dr.

    2017-08-01

    The energy crisis, particularly in countries with developing GDPs, has brought about a new panorama of sustainable power sources such as solar energy, which has seen enormous growth. Increasingly high penetration levels of photovoltaic (PV) generation are emerging in the smart grid. Solar power is intermittent and variable, as the solar resource at ground level is highly dependent on cloud cover variability, atmospheric aerosol levels, and other weather parameters. The inherent variability of large-scale solar generation introduces significant challenges to smart grid energy management. Accurate forecasting of solar power/irradiance is essential to secure economic operation of the smart grid. In this paper a novel TLBO-based MPPT technique is proposed to address the extraction of solar energy. A comparative analysis is presented between the conventional P&O and IC methods and the proposed MPPT technique. The research was carried out in Matlab Simulink software, version 2013.
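    For orientation, the conventional perturb-and-observe (P&O) logic used as a baseline in such comparisons can be sketched on a toy P-V curve; the curve shape, step size, and starting voltage are assumptions, not the paper's model:

```python
def perturb_and_observe(power_fn, v0=12.0, step=0.2, iters=100):
    """Classic P&O MPPT: perturb the operating voltage and keep moving in
    the direction that increased power; reverse when power falls."""
    v, p = v0, power_fn(v0)
    direction = 1.0
    for _ in range(iters):
        v_new = v + direction * step
        p_new = power_fn(v_new)
        if p_new < p:
            direction = -direction  # power fell: reverse the perturbation
        v, p = v_new, p_new
    return v, p

def pv_power(v):
    """Toy unimodal P-V curve with its maximum at 17 V (illustrative only)."""
    return max(0.0, 100.0 - 0.5 * (v - 17.0) ** 2)

v_mpp, p_mpp = perturb_and_observe(pv_power)
print(f"converged near v = {v_mpp:.1f} V, P = {p_mpp:.1f} W")
```

    The steady-state oscillation of P&O around the maximum power point (here within one step of 17 V) is exactly the behavior that optimization-based trackers such as TLBO aim to improve on.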

  4. Analysis of Self-Excited Combustion Instabilities Using Decomposition Techniques

    Science.gov (United States)

    2016-07-05

    and simulation (see Fig. 2). Fig. 1: LDI computational domain used for decomposition analysis. ...combustors. Since each proper orthogonal decomposition mode comprises multiple frequencies, specific modes of the pressure and heat release are not related ... qualitative and less efficient for identifying physical mechanisms. On the other hand, dynamic mode decomposition analysis generates a global frequency
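    A minimal sketch of the proper orthogonal decomposition step, extracting the dominant mode from snapshot data via power iteration on the snapshot covariance, with small synthetic snapshots standing in for the combustor fields:

```python
import math
import random

def leading_pod_mode(snapshots, iters=200, seed=0):
    """Dominant POD mode of a list of snapshot vectors, via power
    iteration on the snapshot covariance C = X X^T (never formed)."""
    n = len(snapshots[0])
    rng = random.Random(seed)
    v = [rng.random() for _ in range(n)]
    for _ in range(iters):
        # w = X (X^T v), i.e. project onto snapshots, then recombine.
        coeffs = [sum(s[i] * v[i] for i in range(n)) for s in snapshots]
        w = [sum(c * s[i] for c, s in zip(coeffs, snapshots)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Synthetic snapshots: a dominant spatial structure [1, 0, -1] with
# varying amplitude plus small noise (illustrative data only).
base = [1.0, 0.0, -1.0]
rng = random.Random(42)
snaps = [[a * b + 0.01 * rng.gauss(0, 1) for b in base]
         for a in (1.0, -0.8, 0.5, -1.2)]

mode = leading_pod_mode(snaps)
print(mode)  # close to ±[0.707, 0.0, -0.707]
```

    As the abstract notes, a POD mode like this mixes all frequencies present in the amplitudes; dynamic mode decomposition instead isolates structures at single frequencies.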

  5. Methodologies and techniques for analysis of network flow data

    Energy Technology Data Exchange (ETDEWEB)

    Bobyshev, A.; Grigoriev, M.; /Fermilab

    2004-12-01

    Network flow data gathered at the border routers and core switches is used at Fermilab for statistical analysis of traffic patterns, passive network monitoring, and estimation of network performance characteristics. Flow data is also a critical tool in the investigation of computer security incidents. Development and enhancement of flow-based tools is an ongoing effort. This paper describes the most recent developments in flow analysis at Fermilab.
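    One of the simplest statistical analyses of the kind described, a top-talker report over flow records, can be sketched as follows; the record layout is reduced to (src, dst, bytes) and the addresses are documentation-range placeholders:

```python
from collections import defaultdict

# Hypothetical flow records exported by a border router.
flows = [
    ("192.0.2.10", "198.51.100.7", 52_000_000),
    ("192.0.2.10", "203.0.113.9",   8_000_000),
    ("192.0.2.20", "198.51.100.7",  1_200_000),
    ("192.0.2.10", "198.51.100.7",  3_000_000),
]

# Aggregate bytes by source address to find the top talkers.
bytes_by_src = defaultdict(int)
for src, dst, nbytes in flows:
    bytes_by_src[src] += nbytes

top_talkers = sorted(bytes_by_src.items(), key=lambda kv: kv[1], reverse=True)
for src, total in top_talkers:
    print(f"{src:15} {total / 1e6:8.1f} MB")
```

    The same grouping pattern, keyed by destination, port, or protocol instead of source, underlies traffic-pattern statistics and the anomaly hunting used in security-incident investigations.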

  6. Privacy-Preserving Data Analysis Techniques by using different modules

    OpenAIRE

    Payal P. Wasankar; Prof. Arvind S. Kapse

    2013-01-01

    The competing parties who have private data may collaboratively conduct privacy preserving distributed data analysis (PPDA) tasks to learn beneficial data models or analysis results. For example, different credit card companies may try to build better models for credit card fraud detection through PPDA tasks. Similarly, competing companies in the same industry may try to combine their sales data to build models that may predict the future sales. In many of these cases, the competing parties h...

  7. Graphic analysis of resources by numerical evaluation techniques (Garnet)

    Science.gov (United States)

    Olson, A.C.

    1977-01-01

    An interactive computer program for graphical analysis has been developed by the U.S. Geological Survey. The program embodies five goals: (1) economical use of computer resources, (2) simplicity for user applications, (3) interactive on-line use, (4) minimal core requirements, and (5) portability. It is designed to aid (1) the rapid analysis of point-located data, (2) structural mapping, and (3) estimation of area resources. © 1977.

  8. Elementary steps of contraction probed by sinusoidal analysis technique in rabbit psoas fibers.

    Science.gov (United States)

    Kawai, M; Zhao, Y; Halvorson, H R

    1993-01-01

    Elementary steps of contraction were probed by the sinusoidal analysis technique in skinned fibers from the rabbit psoas muscle during maximal Ca2+ activation (pCa 4.55-4.82) at 20 degrees C and 200 mM ionic strength. Our study included the effects of MgATP, MgADP, and Pi concentrations, and an ATP hydrolysis rate measurement. We increased the frequency range up to 350 Hz, and resolved an extra process (D) in addition to the well-defined processes (A), (B), and (C). Based on these studies, we established a cross-bridge scheme consisting of six attached states, one detached state, and transitions between these states. We deduced all kinetic constants to specify the scheme. The scheme uniquely explains our data, and no other scheme with an equal degree of simplicity could explain our data. We correlated process (D) to ATP isomerization, process (C) to cross-bridge detachment, and process (B) to cross-bridge attachment. We deduced the tension per cross-bridge state, which indicates that force is generated on cross-bridge attachment and before Pi release. We also found that the rate constants of elementary steps become progressively slower starting from ATP binding to the myosin head and ending with ADP isomerization, and this stepwise slowing may be the essential and integral part of the energy transduction mechanism in muscle.
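    The core of sinusoidal analysis, extracting the amplitude and phase of the response at the applied frequency by correlation over whole periods, can be sketched with a synthetic signal; the frequency, amplitude, and phase values are illustrative, not the paper's data:

```python
import math

def complex_response(signal, freq, dt):
    """In-phase and quadrature components of a sampled signal at `freq`,
    computed by correlation over an integer number of periods."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i * dt)
             for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i * dt)
             for i, s in enumerate(signal))
    return 2 * re / n, 2 * im / n

# Synthetic "force" response to a 10 Hz length oscillation:
# amplitude 0.5, phase lead of 30 degrees (values illustrative).
freq, dt = 10.0, 0.001
n = 1000  # exactly 10 periods of the 10 Hz signal
phase = math.radians(30)
force = [0.5 * math.cos(2 * math.pi * freq * i * dt + phase) for i in range(n)]

a, b = complex_response(force, freq, dt)
amp = math.hypot(a, b)
ph = math.degrees(math.atan2(-b, a))
print(f"amplitude = {amp:.3f}, phase = {ph:.1f} deg")
```

    Repeating this extraction across many frequencies yields the complex modulus spectrum in which the exponential processes (A) through (D) appear as characteristic frequencies.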

  9. First characterization of the expiratory flow increase technique: method development and results analysis

    International Nuclear Information System (INIS)

    Maréchal, L; Barthod, C; Jeulin, J C

    2009-01-01

    This study provides an important contribution to the definition of the expiratory flow increase technique (EFIT). Until now, no measuring means were suited to assessing the manual EFIT performed on infants. The proposed method aims at objectively defining the EFIT based on the quantification of pertinent cognitive parameters used by physiotherapists when practicing. We designed and realized customized instrumented gloves endowed with pressure and displacement sensors, together with the associated electronics and software. This new system is specific to the manoeuvre and to the user, and innocuous for the patient. Data were collected and analysed on infants with bronchiolitis managed by an expert physiotherapist. The analysis presented was performed on a group of seven subjects (mean age: 6.1 months, SD: 1.1; mean chest circumference: 44.8 cm, SD: 1.9). The results are consistent with the physiotherapist's tactile perception. In spite of inevitable variability due to measurements on infants, repeatable quantitative data could be reported regarding the manoeuvre characteristics: the magnitudes of displacements do not exceed 10 mm on either hand; the movement of the thoracic hand is more vertical than the movement of the abdominal hand; the maximum pressure applied with the thoracic hand is about twice that applied with the abdominal hand; and the thrust of the manual compression lasts (590 ± 62) ms. Inter-operator measurements are in progress in order to generalize these results.

  10. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, including a physical frame, a logical frame, and a cognitive frame, was adopted to perform digital I and C failure event derivation and analysis for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced on the feedwater system, recirculation system, and steam line system. The logical frame was structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the characteristics of the power-core flow; hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived to achieve the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I and C design data for ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) postulated ABWR digital I and C software failure events derived from actual non-ABWR digital I and C software failure events reported to the LER of USNRC or the IRS of IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status were successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  11. Progress report on neutron activation analysis at Dalat Nuclear Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Tuan, Nguyen Ngoc [Nuclear Research Institute, Dalat (Viet Nam)

    2003-03-01

    Neutron Activation Analysis (NAA) is one of the most powerful techniques for simultaneous multi-element analysis. This technique has been studied and applied to analyze major, minor and trace elements in geological, biological and environmental samples at the Dalat Nuclear Research Reactor. At the sixth Workshop, February 8-11, 1999, Yogyakarta, Indonesia, we presented a report on the current status of neutron activation analysis using the Dalat Nuclear Research Reactor. Another report on neutron activation analysis at the Dalat Nuclear Research Reactor was presented at the seventh Workshop in Taejon, Korea, November 20-24, 2000. In this report, we present the results obtained from the application of NAA at NRI over one year, as follows: (1) determination of the concentrations of noble, rare earth, uranium, thorium and other elements in geological samples according to the requirements of clients, particularly geologists who want to find mineral resources; (2) analysis of the concentrations of radionuclides and nutrient elements in foodstuffs for the program on Asian Reference Man; (3) evaluation of the contents of trace elements in crude oil and basement rock samples to determine the original source of the oil; (4) determination of the elemental composition of airborne particles in Ho Chi Minh City for studying air pollution. Analytical data for standard reference materials and for toxic elements and natural radionuclides in seawater are also presented. (author)
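    The relative (comparator) method that underlies such NAA concentration determinations can be sketched as follows; the count rates, masses, half-life, and standard concentration are hypothetical values for illustration:

```python
import math

def decay_correct(count_rate, half_life_s, delay_s):
    """Correct a measured count rate back to the end of irradiation."""
    return count_rate * math.exp(math.log(2) * delay_s / half_life_s)

def concentration_relative(a_sample, a_std, m_sample, m_std, c_std):
    """Relative (comparator) NAA: sample concentration from the ratio of
    decay-corrected specific activities against a co-irradiated standard."""
    return c_std * (a_sample / m_sample) / (a_std / m_std)

# Hypothetical counting data for one gamma line (half-life assumed 2.24 h).
a_sample = decay_correct(1500.0, half_life_s=2.24 * 3600, delay_s=3600)  # cps
a_std = decay_correct(2000.0, half_life_s=2.24 * 3600, delay_s=1800)     # cps
c = concentration_relative(a_sample, a_std, m_sample=0.10, m_std=0.05,
                           c_std=12.0)  # standard contains 12 ug/g
print(f"estimated concentration = {c:.2f} ug/g")
```

    Because sample and standard are irradiated together, flux and cross-section factors cancel in the ratio, which is what makes the comparator method robust for routine multi-element work.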

  12. Progress report on neutron activation analysis at Dalat Nuclear Research Reactor

    International Nuclear Information System (INIS)

    Tuan, Nguyen Ngoc

    2003-01-01

    Neutron Activation Analysis (NAA) is one of the most powerful techniques for simultaneous multi-element analysis. This technique has been studied and applied to analyze major, minor and trace elements in geological, biological and environmental samples at the Dalat Nuclear Research Reactor. At the sixth Workshop, February 8-11, 1999, Yogyakarta, Indonesia, we presented a report on the current status of neutron activation analysis using the Dalat Nuclear Research Reactor. Another report on neutron activation analysis at the Dalat Nuclear Research Reactor was presented at the seventh Workshop in Taejon, Korea, November 20-24, 2000. In this report, we present the results obtained from the application of NAA at NRI over one year, as follows: (1) determination of the concentrations of noble, rare earth, uranium, thorium and other elements in geological samples according to the requirements of clients, particularly geologists who want to find mineral resources; (2) analysis of the concentrations of radionuclides and nutrient elements in foodstuffs for the program on Asian Reference Man; (3) evaluation of the contents of trace elements in crude oil and basement rock samples to determine the original source of the oil; (4) determination of the elemental composition of airborne particles in Ho Chi Minh City for studying air pollution. Analytical data for standard reference materials and for toxic elements and natural radionuclides in seawater are also presented. (author)

  13. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, and thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  14. Progress as Compositional Lock-Freedom

    DEFF Research Database (Denmark)

    Carbone, Marco; Dardha, Ornela; Montesi, Fabrizio

    2014-01-01

    such definition to capture a more intuitive notion of context adequacy for checking progress. Interestingly, our new catalysers lead to a novel characterisation of progress in terms of the standard notion of lock-freedom. Guided by this discovery, we also develop a conservative extension of catalysers that does...... not depend on types, generalising the notion of progress to untyped session-based processes. We combine our results with existing techniques for lock-freedom, obtaining a new methodology for proving progress. Our methodology captures new processes wrt previous progress analysis based on session types....

  15. Assessing Progress and Pitfalls of the Millennium Development Goals in Zimbabwe: A Critical Analysis

    Directory of Open Access Journals (Sweden)

    Shepherd Mutangabende

    2016-12-01

    Full Text Available Zimbabwe adopted the Millennium Development Goals (MDGs) at their inception in 2000 and has tracked its progress toward them in progress reports issued in 2004, 2010, 2012 and 2015. In these reports optimistic trends are found chiefly in MDG2 on universal primary education, which is Zimbabwe’s pride in Africa, MDG3 on gender parity in schools, and MDG6 on HIV and AIDS. The country faced its biggest challenges in attaining MDG1, eliminating extreme poverty and hunger, and MDG5, reducing maternal mortality, and it was doubtful that the targets under these goals would be attained by the cut-off date. Unfortunately, the inception of the MDGs coincided with a deepening socioeconomic, political and environmental crisis in the country, which made it very difficult for Zimbabwe to accomplish all of its MDGs. The focal aim of this study was to examine the progress, policies, programmes and strategies that were in place to promote the attainment of the MDGs from 2000-2015, and the strategies and policies in place to attain the SDGs from 2016-2030. This paper recommends institutionalising the SDGs by aligning them with the Zimbabwe Agenda for Sustainable Socioeconomic Transformation (Zim-Asset) clusters, for instance value addition and beneficiation, food security and nutrition, poverty eradication and social services, and by strengthening partnership with all stakeholders. The research uses intensive secondary data analysis from various sources including government gazettes, journal articles, e-books, government websites, reports, and published and unpublished books.

  16. Application of vibration monitoring and analysis technique in nuclear power station

    International Nuclear Information System (INIS)

    Sun Jian; Liu Daohe; Lu Qunxian

    2000-01-01

    Vibration monitoring and analysis techniques can play an important role in ensuring nuclear safety, resolving failures of important equipment, and improving stable, cost-effective operation and predictive maintenance. Some examples are introduced, highlighting the importance of vibration monitoring and analysis techniques for the safe, reliable and cost-effective operation of the Daya Bay nuclear power station

  17. Modern Theory of Gratings Resonant Scattering: Analysis Techniques and Phenomena

    CERN Document Server

    Sirenko, Yuriy K

    2010-01-01

    Diffraction gratings are one of the most popular objects of analysis in electromagnetic theory. The requirements of applied optics and microwave engineering lead to many new problems and challenges for the theory of diffraction gratings, which force us to search for new methods and tools for their resolution. In Modern Theory of Gratings, the authors present results of the electromagnetic theory of diffraction gratings that will constitute the basis of further development of this theory to meet the challenges posed by the modern requirements of fundamental and applied science. This volume covers: spectral theory of gratings (Chapter 1), giving reliable grounds for physical analysis of space-frequency and space-time transformations of the electromagnetic field in open periodic resonators and waveguides; authentic analytic regularization procedures (Chapter 2) that, in contradistinction to the traditional frequency-domain approaches, fit perfectly for the analysis of resonant wave scattering processes; paramet...

  18. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2011-01-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing dissolving ability. Applications include, but are by no means limited to, estimation of coking onset and solution (e.g., oil) fractionation.

  19. Finite Element Modeling Techniques for Analysis of VIIP

    Science.gov (United States)

    Feola, Andrew J.; Raykin, J.; Gleason, R.; Mulugeta, Lealem; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.; Ethier, C. Ross

    2015-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build a FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP.

  20. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    Metabolic engineering has been defined as the purposeful modification of intermediary metabolism using recombinant DNA techniques. With this definition metabolic engineering includes: (1) inserting new pathways in microorganisms with the aim of producing novel metabolites, e.g., production of polyketides by Streptomyces; (2) production of heterologous peptides, e.g., production of human insulin, erythropoietin, and tPA; and (3) improvement of both new and existing processes, e.g., production of antibiotics and industrial enzymes. Despite the prospect of obtaining major improvements through metabolic engineering, this approach is, however, not expected to completely replace the classical approach to strain improvement: random mutagenesis followed by screening. Identification of the optimal genetic changes for improvement... Metabolic engineering is a multidisciplinary approach, which involves...

  1. Analysis of ultrasonic techniques for monitoring milk coagulation during cheesemaking

    International Nuclear Information System (INIS)

    Budelli, E; Lema, P; Pérez, N; Negreira, C

    2012-01-01

    Experimental determination of time of flight and attenuation has been proposed in the literature as alternatives for monitoring the evolution of milk coagulation during cheese manufacturing. However, only laboratory-scale procedures have been described. In this work, the use of ultrasonic time of flight and attenuation to determine cutting time, and the feasibility of applying them at industrial scale, were analyzed. Limitations to implementing these techniques at industrial scale are shown experimentally. The main limitation of the use of time of flight is its strong dependence on temperature. Attenuation monitoring is affected by a thin layer of milk skin covering the transducer, which modifies the signal in a non-repetitive way. The results of this work can be used to develop alternative ultrasonic systems suitable for application in the dairy industry.

  2. Analysis of ultrasonic techniques for monitoring milk coagulation during cheesemaking

    Science.gov (United States)

    Budelli, E.; Pérez, N.; Lema, P.; Negreira, C.

    2012-12-01

    Experimental determination of time of flight and attenuation has been proposed in the literature as alternatives for monitoring the evolution of milk coagulation during cheese manufacturing. However, only laboratory-scale procedures have been described. In this work, the use of ultrasonic time of flight and attenuation to determine cutting time, and the feasibility of applying them at industrial scale, were analyzed. Limitations to implementing these techniques at industrial scale are shown experimentally. The main limitation of the use of time of flight is its strong dependence on temperature. Attenuation monitoring is affected by a thin layer of milk skin covering the transducer, which modifies the signal in a non-repetitive way. The results of this work can be used to develop alternative ultrasonic systems suitable for application in the dairy industry.
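
    The temperature sensitivity noted above can be made concrete with a small numerical sketch. All values here (path length, reference sound speed, temperature coefficient) are assumed for illustration and are not taken from the paper; the point is only that a modest temperature drift produces the same time-of-flight change as a coagulation-induced shift in sound speed.

```python
# Why time-of-flight (TOF) monitoring is temperature sensitive: the sound
# speed in milk-like media drifts with temperature (the coefficient below
# is an assumed, illustrative value), so an uncompensated temperature
# change masquerades as a coagulation-induced change.
path_mm = 50.0   # transducer-reflector distance, assumed
c_ref = 1480.0   # sound speed at reference temperature, m/s (assumed)
dc_dT = 2.5      # assumed temperature coefficient, m/s per deg C

def time_of_flight_us(temp_offset_c, coag_shift=0.0):
    """Two-way time of flight in microseconds for a given temperature
    offset (deg C) and an additional coagulation-induced speed shift (m/s)."""
    c = c_ref + dc_dT * temp_offset_c + coag_shift
    return 2 * (path_mm / 1000.0) / c * 1e6

baseline = time_of_flight_us(0.0)
drift_only = time_of_flight_us(1.0)       # 1 deg C drift, no coagulation
coag_only = time_of_flight_us(0.0, 2.5)   # equal-size coagulation shift

print(baseline, drift_only, coag_only)
```

    With these assumed numbers, a 1 °C drift is indistinguishable from a 2.5 m/s coagulation-induced speed shift, which illustrates why uncompensated time-of-flight monitoring struggles at industrial scale.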

  3. Alignment measurements uncertainties for large assemblies using probabilistic analysis techniques

    CERN Document Server

    AUTHOR|(CDS)2090816; Almond, Heather

    Big science and ambitious industrial projects continually push technical requirements beyond the grasp of conventional engineering techniques. Examples are the ultra-high precision requirements in the fields of celestial telescopes, particle accelerators and the aerospace industry. Such extreme requirements are limited largely by the capability of the metrology used, namely its uncertainty in relation to the alignment tolerance required. The current work was initiated as part of a Marie Curie European research project held at CERN, Geneva, aiming to answer those challenges as related to future accelerators requiring alignment of 2 m large assemblies to tolerances in the 10 µm range. The thesis has found several gaps in current knowledge limiting such capability. Among those was the lack of application of state-of-the-art uncertainty propagation methods in alignment measurement metrology. Another major limiting factor found was the lack of uncertainty statements in the thermal errors compensatio...

  4. Identifying Indicators of Progress in Thermal Spray Research Using Bibliometrics Analysis

    Science.gov (United States)

    Li, R.-T.; Khor, K. A.; Yu, L.-G.

    2016-12-01

    We investigated research publications on thermal spray in the period 1985-2015 using data from Web of Science, Scopus and SciVal®. Bibliometrics analysis was employed to elucidate the country and institution distribution across thermal spray research areas and to characterize trends in topic change and technology progress. Results show that China, USA, Japan, Germany, India and France were the top countries in thermal spray research, and Xi'an Jiaotong University, Universite de Technologie Belfort-Montbeliard, Shanghai Institute of Ceramics, ETH Zurich, National Research Council of Canada and University of Limoges were among the top institutions with high scholarly research output during 2005-2015. The terms in the titles, keywords and abstracts of the publications were analyzed with the Latent Dirichlet Allocation model and visually mapped using the VOSviewer software to reveal the progress of thermal spray technology. It is found that thermal barrier coatings were consistently the main research area in thermal spray, and that high-velocity oxy-fuel spray and cold spray developed rapidly in the last 10 years.

  5. Network analysis of breast cancer progression and reversal using a tree-evolving network algorithm.

    Directory of Open Access Journals (Sweden)

    Ankur P Parikh

    2014-07-01

    Full Text Available The HMT3522 progression series of human breast cells has been used to discover how tissue architecture, microenvironment and signaling molecules affect breast cell growth and behaviors. However, much remains to be elucidated about the malignant and phenotypic reversion behaviors of the HMT3522-T4-2 cells of this series. We employed a "pan-cell-state" strategy and jointly analyzed microarray profiles obtained from different state-specific cell populations from this progression and reversion model of breast cells using a tree-lineage multi-network inference algorithm, Treegl. We found that different breast cell states contain distinct gene networks. The network specific to non-malignant HMT3522-S1 cells is dominated by genes involved in normal processes, whereas the T4-2-specific network is enriched with cancer-related genes. The networks specific to various conditions of the reverted T4-2 cells are enriched with pathways suggestive of compensatory effects, consistent with clinical data showing patient resistance to anticancer drugs. We validated the findings using an external dataset, and showed that aberrant expression values of certain hubs in the identified networks are associated with poor clinical outcomes. Thus, analysis of various reversion conditions (including non-reverted) of HMT3522 cells using Treegl can be a good model system to study drug effects on breast cancer.

  6. Rate Dependent Multicontinuum Progressive Failure Analysis of Woven Fabric Composite Structures under Dynamic Impact

    Directory of Open Access Journals (Sweden)

    James Lua

    2004-01-01

    Full Text Available Marine composite materials typically exhibit significant rate dependent response characteristics when subjected to extreme dynamic loading conditions. In this work, a strain-rate dependent continuum damage model is incorporated with multicontinuum technology (MCT to predict damage and failure progression for composite material structures. MCT treats the constituents of a woven fabric composite as separate but linked continua, thereby allowing a designer to extract constituent stress/strain information in a structural analysis. The MCT algorithm and material damage model are numerically implemented with the explicit finite element code LS-DYNA3D via a user-defined material model (umat. The effects of the strain-rate hardening model are demonstrated through both simple single element analyses for woven fabric composites and also structural level impact simulations of a composite panel subjected to various impact conditions. Progressive damage at the constituent level is monitored throughout the loading. The results qualitatively illustrate the value of rate dependent material models for marine composite materials under extreme dynamic loading conditions.
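
    The strain-rate hardening idea underlying such a damage model can be illustrated with a simple logarithmic rate-scaling law. This is a generic Johnson-Cook-style form with invented constants, not the constitutive law actually implemented in the LS-DYNA3D umat described above.

```python
# Generic strain-rate hardening sketch: quasi-static strength scaled by a
# logarithmic rate term, so material response stiffens under dynamic impact.
# All constants are assumed, illustrative values.
import math

def rate_scaled_strength(sigma_0, strain_rate, ref_rate=1e-3, C=0.02):
    """Strength (e.g., MPa) amplified at high strain rates (1/s).
    sigma_0 is the quasi-static strength at the reference rate."""
    return sigma_0 * (1.0 + C * math.log(max(strain_rate, ref_rate) / ref_rate))

quasi_static = rate_scaled_strength(500.0, 1e-3)  # at the reference rate
impact = rate_scaled_strength(500.0, 100.0)       # at impact-level rates

print(quasi_static, impact)
```

    In a progressive failure analysis, a rate-scaled strength of this kind shifts the damage-initiation threshold upward under dynamic loading, which is why rate-independent models tend to over-predict damage in impact simulations.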

  7. Proteomic analysis reveals novel proteins associated with progression and differentiation of colorectal carcinoma

    Directory of Open Access Journals (Sweden)

    Yi Gan

    2014-01-01

    Full Text Available Aim: The objective of this study is to characterize differential proteomic expression among well-differentiated and poorly differentiated colorectal carcinoma tissues and normal mucous epithelium. Materials and Methods: The study is based on quantitative 2-dimensional gel electrophoresis analyzed with PDQuest. Results: Excluding redundancies due to proteolysis and posttranslationally modified isoforms among over 600 protein spots, 11 proteins were revealed as regulated, with statistical variance within the 95% confidence level, and were identified by peptide mass fingerprinting in matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. Progression-associated proteins belong to the functional complexes of tumorigenesis, proliferation, differentiation, metabolism, the regulation of major histocompatibility complex processing, and other functions. Partial but significant overlap was revealed with previous proteomics and transcriptomics studies in CRC. Among various differentiation stages of CRC tissues, we identified calreticulin precursor, MHC class I antigen (human leukocyte antigen A), glutathione S-transferase pi 1, keratin 8, heat shock protein 27, tubulin beta chain, triosephosphate, fatty acid-binding protein, hemoglobin (deoxy mutant with Val β1 replaced by Met; HBB), and zinc finger protein 312 (FEZF2). Conclusions: Their functional networks were analyzed by Ingenuity Pathway Analysis, revealing their potential roles as novel biomarkers for progression at various differentiation stages of CRC.

  8. [New progress on three-dimensional movement measurement analysis of human spine].

    Science.gov (United States)

    Qiu, Xiao-wen; He, Xi-jing; Huang, Si-hua; Liang, Bao-bao; Yu, Zi-rui

    2015-05-01

    Spinal biomechanics, especially the range of motion of the spine, is closely connected with spinal surgery. The change in the range of motion (ROM) is an important indicator of diseases and injuries of the spine, and an essential standard for evaluating the effect of surgeries and therapies on the spine. The analysis of ROM dates back to the invention of X-ray imaging, and even earlier. With the development of science and technology, as well as the optimization of various calculation methods, diverse measuring methods have emerged: from imaging to non-imaging methods, from two-dimensional to three-dimensional, and from measuring directly on X-ray films to calculating automatically by computer. Analysis of ROM has made great progress; some older methods could not meet the needs of the times and have disappeared, while classical methods such as X-ray imaging remain vital. Combining different methods, three-dimensional analysis, and more in vivo spine research are the trends in the analysis of ROM, and more and more researchers are focusing on in vivo spine research. In this paper, the advantages and disadvantages of recently used methods are presented through a review of the recent literature, providing reference and help for the movement analysis of the spine.

  9. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The analysis of soil characteristics after a six-week remediation period indicated that the total heterotrophic bacterial counts increased in all treatment options while the organic carbon and total hydrocarbon content (THC) of the soils decreased with time across the various options. Option C (involving different levels of ...

  10. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    ...On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information for the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described...

  11. Analysis of scintigrams by singular value decomposition (SVD) technique

    Energy Technology Data Exchange (ETDEWEB)

    Savolainen, S.E.; Liewendahl, B.K. (Helsinki Univ. (Finland). Dept. of Physics)

    1994-05-01

    The singular value decomposition (SVD) method is presented as a potential tool for analyzing gamma camera images. Mathematically, image analysis is a study of matrices, since the standard scintigram is a digitized matrix representation of the recorded photon fluence from the radioactivity of the object. Each matrix element (pixel) is a number equal to the counts detected at that position of the object. The analysis of images can thus be reduced to the analysis of the singular values of the matrix decomposition. In the present study the clinical usefulness of SVD was tested by analyzing two different kinds of scintigrams: brain images by single photon emission tomography (SPET), and planar images of the liver and spleen. It is concluded that SVD can be applied to the analysis of gamma camera images, and that it provides an objective method for interpretation of the clinically relevant information contained in the images. In image filtering, SVD provides results comparable to conventional filtering. In addition, the study of singular values can be used for semiquantitation of radionuclide images, as exemplified by brain SPET studies and liver-spleen planar studies. (author).
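
    The SVD filtering idea is easy to sketch: decompose the count matrix, keep only the leading singular values, and reconstruct. The simulated 64 x 64 scintigram below (a smooth activity blob plus Poisson counting noise) is illustrative, and the number of retained components is a tuning choice.

```python
# SVD-based image filtering sketch: the small singular values of a noisy
# count matrix mostly carry noise, so a truncated reconstruction denoises.
import numpy as np

rng = np.random.default_rng(0)
# Simulated 64x64 scintigram: smooth activity blob plus Poisson counting noise
y, x = np.mgrid[0:64, 0:64]
activity = 100 * np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0)
image = rng.poisson(activity).astype(float)

U, s, Vt = np.linalg.svd(image, full_matrices=False)

k = 8  # number of singular values retained (tuning parameter)
filtered = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# The truncated reconstruction should sit closer to the noiseless activity map
err_raw = np.linalg.norm(image - activity)
err_filt = np.linalg.norm(filtered - activity)
print(err_raw, err_filt)
```

    Because the true activity here is smooth (effectively low rank), a handful of singular components captures the signal while discarding most of the counting noise, which is the behavior the paper compares against conventional filtering.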

  12. Use of parametric and non-parametric survival analysis techniques ...

    African Journals Online (AJOL)

    This paper presents parametric and non-parametric survival analysis procedures that can be used to compare acaricides. The effectiveness of Delta Tick Pour On and Delta Tick Spray in knocking down tsetse flies were determined. The two formulations were supplied by Chemplex. The comparison was based on data ...
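
    The non-parametric side of such a comparison typically rests on the Kaplan-Meier estimator, which can be computed directly from (time, event) pairs; the knock-down times below are invented for illustration and are not the study's data.

```python
# Kaplan-Meier survival estimate from (time, event) pairs, written in
# plain numpy. event = 1 means knock-down observed; 0 means censored.
import numpy as np

def kaplan_meier(times, events):
    """Return the distinct event times and the KM survival estimate at each."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.argsort(times)
    times, events = times[order], events[order]
    surv, out_t, out_s = 1.0, [], []
    for t in np.unique(times):
        d = int(((times == t) & (events == 1)).sum())  # events at time t
        n = int((times >= t).sum())                    # subjects still at risk
        if d > 0:
            surv *= 1.0 - d / n
            out_t.append(float(t))
            out_s.append(surv)
    return out_t, out_s

# Invented knock-down times (minutes) for one acaricide formulation
t_knockdown = [2, 3, 3, 5, 8, 8, 12, 15]
observed =    [1, 1, 0, 1, 1, 1, 0, 1]
times_km, surv_km = kaplan_meier(t_knockdown, observed)
print(list(zip(times_km, surv_km)))
```

    Comparing two formulations then amounts to computing such a curve for each and applying a test such as the log-rank test, or fitting a parametric survival model for the parametric side of the comparison.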

  13. Techniques and Considerations for FIA forest fragmentation analysis

    Science.gov (United States)

    Andrew J. Lister; Tonya W. Lister; Rachel Riemann; Mike Hoppus

    2002-01-01

    The Forest Inventory and Analysis unit of the Northeastern Research Station (NEFIA) is charged with inventorying and monitoring the Nation's forests. NEFIA has not gathered much information on forest fragmentation, but recent developments in computing and remote sensing technologies now make it possible to assess forest fragmentation on a regional basis. We...

  14. Sentiment analysis of Arabic tweets using text mining techniques

    Science.gov (United States)

    Al-Horaibi, Lamia; Khan, Muhammad Badruddin

    2016-07-01

    Sentiment analysis has become a flourishing field of text mining and natural language processing. Sentiment analysis aims to determine whether a text is written to express positive, negative, or neutral emotions about a certain domain. Most sentiment analysis researchers focus on English texts, with very limited resources available for other complex languages, such as Arabic. In this study, the target was to develop an initial model that performs satisfactorily and measures Arabic Twitter sentiment using a machine learning approach, with Naïve Bayes and Decision Tree as classification algorithms. The dataset used contains more than 2,000 Arabic tweets collected from Twitter. We performed several experiments to check the performance of the two classifiers using different combinations of text-processing functions. We found that the available facilities for Arabic text processing need to be built from scratch or improved in order to develop accurate classifiers. The small functionalities we developed in a Python environment helped improve the results and proved that sentiment analysis in the Arabic domain needs a lot of work on the lexicon side.
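
    The classification setup described (bag-of-words features feeding Naïve Bayes and Decision Tree classifiers) can be sketched with scikit-learn. The tiny English corpus below stands in for the Arabic tweet dataset, which is not reproduced here; real Arabic text would additionally need the normalization and stemming steps the authors had to build themselves.

```python
# Two-classifier sentiment sketch: TF-IDF features into Naive Bayes and
# a Decision Tree, mirroring the study's algorithm choice on toy data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.pipeline import make_pipeline

tweets = ["great service", "awful experience", "love it",
          "terrible food", "very happy", "so disappointed"]
labels = ["pos", "neg", "pos", "neg", "pos", "neg"]

nb = make_pipeline(TfidfVectorizer(), MultinomialNB())
tree = make_pipeline(TfidfVectorizer(), DecisionTreeClassifier(random_state=0))

nb.fit(tweets, labels)
tree.fit(tweets, labels)

pred_nb = nb.predict(["happy with the great service"])
print(pred_nb)
```

    Swapping in different text-processing functions, as the authors did, corresponds to changing the vectorizer's preprocessing and tokenization steps while keeping the classifiers fixed.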

  15. A Generalized Lanczos-QR Technique for Structural Analysis

    DEFF Research Database (Denmark)

    Vissing, S.

    . Alternatively, in a stability analysis, tangent stiffness and geometric stiffness matrices are introduced into an eigenvalue problem used to determine possible bifurcation points. The common basis for these types of problems is that the matrix equation describing the problem contains two real, symmetric...

  16. Instrumental Neutron Activation Analysis Technique using Subsecond Radionuclides

    DEFF Research Database (Denmark)

    Nielsen, H.K.; Schmidt, J.O.

    1987-01-01

    The fast irradiation facility Mach-1 installed at the Danish DR 3 reactor has been used for boron determinations by means of Instrumental Neutron Activation Analysis using 12B with its 20-ms half-life. The performance characteristics of the system are presented, along with boron determinations of NBS standard...
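
    The practical constraint behind sub-second NAA is timing: with a 20-ms half-life, almost all 12B activity is gone within a fraction of a second, so sample transfer and counting must be fast. A small calculation makes this concrete; the transfer delay and counting window values below are assumed, not taken from the record.

```python
# Fraction of post-irradiation 12B decays captured in a counting window
# that opens t_delay seconds after irradiation ends (half-life ~20 ms).
import math

HALF_LIFE_S = 0.020
LAMBDA = math.log(2) / HALF_LIFE_S  # decay constant, 1/s

def fraction_counted(t_delay, t_count):
    """Fraction of decays occurring in [t_delay, t_delay + t_count]."""
    return math.exp(-LAMBDA * t_delay) - math.exp(-LAMBDA * (t_delay + t_count))

prompt = fraction_counted(0.005, 0.100)  # 5 ms transfer, 100 ms count window
slow = fraction_counted(0.200, 0.100)    # 200 ms delay loses nearly everything

print(prompt, slow)
```

    A 5 ms transfer captures over 80% of the available decays under these assumptions, while a 200 ms delay captures well under 1%, which is why a fast pneumatic facility like Mach-1 is essential for 12B work.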

  17. Organic Tanks Safety Program: Advanced organic analysis FY 1996 progress report

    International Nuclear Information System (INIS)

    1996-09-01

    The major focus during the first part of FY96 was evaluating the use of organic functional group concentrations to screen for energetics. Fourier transform infrared and Raman spectroscopy would be useful screening tools for determining C-H and COO- organic content in tank wastes analyzed in a hot cell. These techniques would be used for identifying tanks of potential safety concern that may require further analysis. Samples from Tanks 241-C-106 and -C-204 were analyzed; the major organic in C-106 was B2EHPA and in C-204 was TBP. Analyses of simulated wastes were also performed for the Waste Aging Studies Task; organics formed as a result of degradation were identified, and the original starting components were monitored quantitatively. Sample analysis is not routine and required considerable methods adaptation and optimization. Several techniques have been evaluated for directly analyzing chelators and chelator fragments in tank wastes: matrix-assisted laser desorption/ionization time-of-flight mass spectrometry, and liquid chromatography with ultraviolet detection using Cu complexation. Although not directly funded by the Tanks Safety Program, the success of these techniques has implications for both the Flammable Gas and Organic Tanks Safety Programs

  18. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...
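
    A minimal sketch of the multivariable workflow named above, with principal component analysis compressing correlated performance variables and cluster analysis grouping control strategies in the reduced space, is given below on a synthetic data matrix; the number of strategies, variables and clusters are assumptions for illustration.

```python
# PCA + cluster analysis sketch for plant-wide control strategy comparison:
# standardize performance indices, project to two principal components,
# then cluster strategies in the reduced space.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# 30 control strategies x 6 performance indices (effluent quality, cost, ...)
scores = np.vstack([
    rng.normal(0.0, 1.0, size=(15, 6)),  # one family of strategies
    rng.normal(3.0, 1.0, size=(15, 6)),  # a second, distinct family
])

pca = PCA(n_components=2)
reduced = pca.fit_transform(StandardScaler().fit_transform(scores))

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(reduced)
print(pca.explained_variance_ratio_, clusters)
```

    In a real study the cluster memberships would then be inspected against the evaluation criteria to see which groups of control strategies behave similarly, with discriminant analysis identifying the variables that separate the groups.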

  19. Lipidomic analysis of epidermal lipids: a tool to predict progression of inflammatory skin disease in humans.

    Science.gov (United States)

    Li, Shan; Ganguli-Indra, Gitali; Indra, Arup K

    2016-05-01

    Lipidomics is the large-scale profiling and characterization of lipid species in a biological system using mass spectrometry. The skin barrier is mainly comprised of corneocytes and a lipid-enriched extracellular matrix. The major skin lipids are ceramides, cholesterol and free fatty acids (FFA). Lipid compositions are altered in inflammatory skin disorders with disrupted skin barrier such as atopic dermatitis (AD). Here we discuss some of the recent applications of lipidomics in human skin biology and in inflammatory skin diseases such as AD, psoriasis and Netherton syndrome. We also review applications of lipidomics in human skin equivalent and in pre-clinical animal models of skin diseases to gain insight into the pathogenesis of the skin disease. Expert commentary: Skin lipidomics analysis could be a fast, reliable and noninvasive tool to characterize the skin lipid profile and to monitor the progression of inflammatory skin diseases such as AD.

  20. Damage analysis and fundamental studies. Quarterly progress report, October--December 1978

    Energy Technology Data Exchange (ETDEWEB)

    Zwilsky, Klaus M.

    1979-05-01

    This report is the fourth in a series of Quarterly Technical Progress Reports on Damage Analysis and Fundamental Studies (DAFS) which is one element of the Fusion Reactor Materials Program, conducted in support of the Magnetic Fusion Energy Program. This report is organized along topical lines in parallel to a Program Plan of the same title (to be published) so that activities and accomplishments may be followed readily relative to the Program Plan. Thus, the work of a given laboratory may appear throughout the report. Chapters 1 and 2 report topics which are generic to all of the DAFS Program: DAFS Task Group Activities and Irradiation Test Facilities, respectively. Chapters 3, 4, and 5 report the work that is specific to each of the subtasks around which the program is structured: A) Environmental Characterization, B) Damage Production, and C) Damage Microstructure Evolution and Mechanical Behavior.