WorldWideScience

Sample records for analysis techniques progress

  1. Research Progress on Pesticide Residue Analysis Techniques in Agro-products

    Directory of Open Access Journals (Sweden)

    HE Ze-ying

    2016-07-01

    Full Text Available Acute pesticide poisoning among consumers and pesticide residue violations in agro-product import/export trade occur constantly. Pesticide residue analysis is an important way to protect food safety and the interests of import/export enterprises, and analytical techniques for pesticide residues have developed rapidly in recent years. This review discusses the research progress of the past five years with respect to sample preparation and instrumental determination: the application, modification and development of the QuEChERS method for sample preparation, and the application of tandem mass spectrometry and high-resolution mass spectrometry. Implications for the future of the field are also discussed.

  2. Nuclear and radiochemical techniques in chemical analysis. Progress report, June 1, 1975--July 31, 1976

    International Nuclear Information System (INIS)

    Finston, H.L.; Williams, E.T.

    1976-01-01

    There has been significant progress on the project to measure the neutron-capture cross sections of reactor-produced radionuclides, in particular centering on the problems with nuclides such as ²²Na which may have a resonance for thermal-neutron capture. The thermal capture cross section of less than 40 b has been verified for ⁵⁴Mn, and cadmium ratios have been determined for ¹⁸⁴Re in the V-11 and V-14 positions in the HFBR. Lutetium has been used as a neutron temperature monitor for the Brookhaven reactors. Preliminary results on the project to determine the effect of chemical state on the branching ratio in ⁵⁸Co are reported. Procedures for aerosol collection and analysis by proton-induced x-ray emission (PIXE) are reported. A program to analyze aerosols for polycyclic aromatic hydrocarbons has been initiated. Progress is reported on the experimental verification of the proposed acid-base hypothesis.

  3. Nuclear and radiochemical techniques in chemical analysis. Progress report, 1 June 1974--31 May 1975

    International Nuclear Information System (INIS)

    Finston, H.L.; Williams, E.T.

    1975-01-01

    Progress on the project to determine the neutron-absorption cross section of ²²Na is described. Upper limits for the thermal-neutron cross sections of ⁸⁸Y and ¹³⁹Ce have been set at 100 barns. The experiment to search for a change in the ratio of electron capture to positron emission due to differences in chemical environment is underway. The mechanical aspects of the system for analysis by proton-induced x-ray emission are described. Recent results on solvent extraction of mercury by pure solvents and propylene carbonate are described. Recent measurements in a study of an acid-base hypothesis are described. (U.S.)

  4. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1977--July 31, 1978

    International Nuclear Information System (INIS)

    Finston, H.L.; Williams, E.T.

    1978-07-01

    The study of neutron-absorption cross sections of reactor-produced radionuclides has been completed, and results are reported for ²²Na, ¹²⁶I, ¹³⁹Ce, ⁸⁸Y, ¹⁸⁴Re, ¹⁸²Ta, ⁵⁴Mn, and ⁹⁴Zr. The results for ²²Na indicate the existence of a resonance in the thermal region which could explain the discrepancies in the published values for the thermal cross section. The results of air-sampling experiments are described, as is the proton-induced x-ray emission system developed at Brooklyn College. Work on sample preparation and applications of the PIXE technique is described. Progress on a nuclear method to determine fluorine-containing gaseous compounds is reported. Work on solvent extraction with propylene carbonate and experiments involving an acid-base hypothesis are described.

  5. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1978-July 31, 1979

    International Nuclear Information System (INIS)

    Finston, H.L.; Williams, E.T.

    1979-07-01

    Studies of homogeneous liquid-liquid extraction have been extended to include (1) a detailed determination of the phase diagram of the propylene carbonate-water system, (2) the extraction of a large variety of both monodentate and bidentate iron complexes, (3) the solvent extraction characteristics of analogues of propylene carbonate, (4) the behavior under pressure of the propylene carbonate-water system, and (5) the extraction behavior of alkaline earth-TTA chelates. One consequence of these studies was the observation that the addition of ethanol to propylene carbonate-water or to isobutylene carbonate-water yields a single homogeneous phase. Subsequent evaporation of the ethanol restores the two immiscible phases. In the past, neutron activation analysis has been attempted for the heavy elements Pb, Bi and Tl at the Brookhaven HFBR (in- or near-core position) and at the Brookhaven CLIF facility. The latter appears more promising and we have initiated a collaborative program to use the CLIF facility. A milking system which can provide ca. 16 μCi of carrier-free ²¹²Pb was developed for use in an isotope dilution technique for lead. Collaboration with laboratories already determining trace lead by flameless atomic absorption, or by concentration by electrodeposition into a hanging drop followed by anodic stripping, will be proposed. The proton-induced X-ray emission (PIXE) system has undergone marked improvement with the acquisition of a new high-resolution Si(Li) detector and a new multi-channel analyzer system. Various techniques have been explored to dissolve and prepare samples for PIXE analysis and also for verification by atomic absorption analysis.

  6. Nuclear and radiochemical techniques in chemical analysis. Progress report, August 1, 1978-July 31, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Finston, H. L.; Williams, E. T.

    1979-07-01

    Studies of homogeneous liquid-liquid extraction have been extended to include (1) a detailed determination of the phase diagram of the propylene carbonate-water system, (2) the extraction of a large variety of both monodentate and bidentate iron complexes, (3) the solvent extraction characteristics of analogues of propylene carbonate, (4) the behavior under pressure of the propylene carbonate-water system, and (5) the extraction behavior of alkaline earth-TTA chelates. One consequence of these studies was the observation that the addition of ethanol to propylene carbonate-water or to isobutylene carbonate-water yields a single homogeneous phase. Subsequent evaporation of the ethanol restores the two immiscible phases. In the past, neutron activation analysis has been attempted for the heavy elements Pb, Bi and Tl at the Brookhaven HFBR (in- or near-core position) and at the Brookhaven CLIF facility. The latter appears more promising and we have initiated a collaborative program to use the CLIF facility. A milking system which can provide ca. 16 μCi of carrier-free ²¹²Pb was developed for use in an isotope dilution technique for lead. Collaboration with laboratories already determining trace lead by flameless atomic absorption, or by concentration by electrodeposition into a hanging drop followed by anodic stripping, will be proposed. The proton-induced X-ray emission (PIXE) system has undergone marked improvement with the acquisition of a new high-resolution Si(Li) detector and a new multi-channel analyzer system. Various techniques have been explored to dissolve and prepare samples for PIXE analysis and also for verification by atomic absorption analysis.

  7. Progress in thin film techniques

    International Nuclear Information System (INIS)

    Weingarten, W.

    1996-01-01

    Progress since the last Workshop is reported on superconducting accelerating RF cavities coated with thin films. The materials investigated are Nb, Nb₃Sn, NbN and NbTiN; the techniques applied are diffusion from the vapour phase (Nb₃Sn, NbN), the bronze process (Nb₃Sn), and sputter deposition on a copper substrate (Nb, NbTiN). Specially designed cavities for sample evaluation by RF methods have been developed (triaxial cavity). New experimental techniques to assess the RF amplitude dependence of the surface resistance are presented (with emphasis on niobium films sputter deposited on copper). Evidence is increasing that this amplitude dependence is caused by magnetic flux penetration into the surface layer. (R.P.)

  8. The progress of neutron induced prompt gamma analysis technique in 1988-2002

    International Nuclear Information System (INIS)

    Liu Yuren; Jing Shiwei

    2003-01-01

    The new developments in neutron-induced prompt gamma-ray analysis (NIPGA) technology in 1988-2002 are described. The pulsed fast-thermal neutron activation analysis method, which utilizes the inelastic-scattering and capture reactions jointly, is employed to measure the elemental content of a material more efficiently. The lifetime of the neutron generator exceeds 10000 h, and HPGe and CdZnTe detectors and multi-channel analysers (MCA) have reached a high level of capability. At the same time, the Monte Carlo library least-squares (MCLLS) method is used to solve the nonlinearity problem in PGNAA (prompt gamma neutron activation analysis).
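
    The library least-squares step at the heart of the MCLLS approach can be illustrated with a minimal sketch: the measured prompt-gamma spectrum is modelled as a linear combination of single-element library spectra and the elemental contributions are obtained by weighted least squares. The spectra below are synthetic placeholders; in the full MCLLS method the library spectra are regenerated by Monte Carlo for the updated composition and the fit is iterated.

        import numpy as np

        def library_least_squares(measured, library):
            """One linear step of a library least-squares spectrum fit.

            measured : (n_channels,) counts in the measured prompt-gamma spectrum
            library  : (n_channels, n_elements) single-element library spectra
            Returns the fitted contribution of each library component and the residual.
            """
            # Weight channels by 1/sqrt(counts) (Poisson statistics); avoid division by zero.
            w = 1.0 / np.sqrt(np.clip(measured, 1.0, None))
            A = library * w[:, None]
            b = measured * w
            coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
            residual = measured - library @ coeffs
            return coeffs, residual

        # Hypothetical example: 3 library components, 512 channels.
        rng = np.random.default_rng(0)
        lib = np.abs(rng.normal(size=(512, 3)))
        true = np.array([5.0, 2.0, 0.5])
        spec = rng.poisson(lib @ true).astype(float)
        fit, res = library_least_squares(spec, lib)
        print("fitted contributions:", fit)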

  9. Development of scan analysis techniques employing a small computer. Progress report, August 1, 1974--July 31, 1975

    International Nuclear Information System (INIS)

    Kuhl, D.E.

    1975-01-01

    Progress is reported in the development of equipment and counting techniques for transverse section scanning of the brain following the administration of radiopharmaceuticals to evaluate regional blood flow. The scanning instrument has an array of 32 scintillation detectors that surround the head and scan data are analyzed using a small computer. (U.S.)

  10. Progress in automation, robotics and measuring techniques

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2015-01-01

    This book presents recent progress in control, automation, robotics, and measuring techniques. It includes contributions from top experts in these fields, focused on both theory and industrial practice. Individual chapters present a deep analysis of a specific technical problem, in general followed by numerical analysis and simulation, and results of an implementation for the solution of a real-world problem. The theoretical results, practical solutions and guidelines presented will be useful both for researchers working in the engineering sciences and for practitioners solving industrial problems.

  11. Progress in diagnostic techniques for sc cavities

    International Nuclear Information System (INIS)

    Reece, C.E.

    1988-01-01

    While the routinely achieved performance characteristics of superconducting cavities have now reached a level which makes them useful in large-scale applications, achieving this level has come only through the knowledge gained by systematic studies of performance-limiting phenomena. Despite the very real progress that has been made, the routine performance of superconducting cavities still falls far short of both the theoretical expectations and the performance of a few exceptional examples. It is the task of systematically applied diagnostic techniques to reveal additional information concerning the response of superconducting surfaces to applied RF fields. Recent developments in diagnostic techniques are discussed here. 18 references, 12 figures.

  12. [Progress in transgenic fish techniques and application].

    Science.gov (United States)

    Ye, Xing; Tian, Yuan-Yuan; Gao, Feng-Ying

    2011-05-01

    Transgenic techniques provide a new way for fish breeding. Stable lines of growth hormone (GH) transgenic carp, salmon and tilapia, as well as fluorescent-protein transgenic zebrafish and white cloud mountain minnow, have been produced. The fast-growth characteristic of GH-transgenic fish will be of great importance for promoting aquaculture production and economic efficiency. This paper summarizes the progress in transgenic fish research and ecological assessments. Microinjection is still the most commonly used method, but it often results in multi-site and multi-copy integration. Co-injection of transposon or meganuclease can greatly improve the efficiency of gene transfer and integration. "All-fish" genes or "auto genes" should be considered for producing transgenic fish, in order to eliminate misgivings about food safety and to benefit expression of the transferred gene. Environmental risk is the biggest obstacle to the commercial application of transgenic fish. Data indicate that transgenic fish have inferior fitness compared with traditional domestic fish. However, because of genotype-by-environment effects, it is difficult to extrapolate from simple phenotypes to the complex ecological interactions that occur in nature based on the ecological consequences of transgenic fish determined in the laboratory. It is critical to establish highly naturalized environments for acquiring reliable data that can be used to evaluate the environmental risk. Efficacious physical and biological containment strategies remain crucial approaches to ensure the safe application of transgenic fish technology.

  13. Progress involving new techniques for liposome preparation

    Directory of Open Access Journals (Sweden)

    Zhenjun Huang

    2014-08-01

    Full Text Available The article presents a review of new techniques being used for the preparation of liposomes. A total of 28 publications were examined. In addition to the theories, characteristics and problems associated with traditional methods, the advantages and drawbacks of the latest techniques were reviewed. In the light of developments in many relevant areas, a variety of new techniques are being used for liposome preparation, and each of these new techniques has particular advantages over conventional preparation methods. However, there are still some problems associated with these new techniques that could hinder their application, and further improvements are needed. Generally speaking, due to the introduction of these latest techniques, liposome preparation is now an improved procedure. These applications promote not only advances in liposome research but also methods for their production on an industrial scale.

  14. Comparative Infection Progress Analysis of Lettuce big-vein virus and Mirafiori lettuce virus in Lettuce Crops by Developed Molecular Diagnosis Techniques.

    Science.gov (United States)

    Navarro, Jose A; Botella, Francisco; Maruhenda, Antonio; Sastre, Pedro; Sánchez-Pina, M Amelia; Pallas, Vicente

    2004-05-01

    A nonisotopic molecular dot blot hybridization technique and a multiplex reverse transcription-polymerase chain reaction assay for the specific detection of Lettuce big-vein virus (LBVV) and Mirafiori lettuce virus (MiLV) in lettuce tissue were developed. Both procedures were suitable for the specific detection of both viruses in a range of naturally infected lettuce plants from various Spanish production areas and seven different cultivars. The study of the distribution of both viruses in the plant revealed that the highest concentrations of LBVV and MiLV occurred in roots and old leaves, respectively. LBVV infection progress in a lettuce production area was faster than that observed for MiLV. In spite of different rates of virus infection progress, most lettuce plants became infected with both viruses about 100 days posttransplant. The appearance of both viruses in lettuce crops was preceded by a peak in the concentration of resting spores and zoosporangia of the fungus vector Olpidium brassicae in lettuce roots.

  15. Automatic ultrasound technique to measure angle of progression during labor.

    Science.gov (United States)

    Conversano, F; Peccarisi, M; Pisani, P; Di Paola, M; De Marco, T; Franchini, R; Greco, A; D'Ambrogio, G; Casciaro, S

    2017-12-01

    To evaluate the accuracy and reliability of an automatic ultrasound technique for assessment of the angle of progression (AoP) during labor, thirty-nine pregnant women in the second stage of labor, with the fetus in cephalic presentation, underwent conventional labor management with additional translabial sonographic examination. AoP was measured in a total of 95 acquisition sessions, both automatically by an innovative algorithm and manually by an experienced sonographer who was blinded to the algorithm outcome. The results of the manual measurement were used as the reference against which the performance of the algorithm was assessed. In order to overcome the common difficulties encountered when visualizing the pubic symphysis by sonography, the AoP was measured by taking the centroid of the symphysis, rather than its distal point, as the landmark, thereby assuring high measurement reliability and reproducibility while maintaining objectivity and accuracy in the evaluation of the progression of labor. There was a strong and statistically significant correlation between AoP values measured by the algorithm and the reference values (r = 0.99, P < 0.001). The high accuracy provided by the automatic method was also highlighted by the corresponding high coefficient of determination (r² = 0.98) and the low residual errors (root mean square error = 2°27' (2.1%)). The global agreement between the two methods, assessed through Bland-Altman analysis, resulted in a negligible mean difference of 1°1' (limits of agreement, 4°29'). The proposed automatic algorithm is a reliable technique for measurement of the AoP. Its (relative) operator-independence has the potential to reduce human errors and speed up ultrasound acquisition time, which should facilitate management of women during labor. Copyright © 2017 ISUOG. Published by John Wiley & Sons Ltd.
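
    The agreement statistics quoted above (bias and limits of agreement from a Bland-Altman analysis, correlation, RMSE) can be computed with a minimal sketch such as the following; the AoP arrays are hypothetical placeholders, not the study data.

        import numpy as np

        def bland_altman(auto_deg, manual_deg):
            """Bias, 95% limits of agreement, correlation and RMSE between two methods."""
            auto_deg = np.asarray(auto_deg, float)
            manual_deg = np.asarray(manual_deg, float)
            diff = auto_deg - manual_deg
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)   # half-width of the limits of agreement
            r = np.corrcoef(auto_deg, manual_deg)[0, 1]
            rmse = np.sqrt(np.mean(diff ** 2))
            return bias, (bias - half_width, bias + half_width), r, rmse

        # Hypothetical AoP values (degrees) from the algorithm and the sonographer.
        auto = [105.2, 118.7, 131.4, 99.8, 124.3]
        manual = [104.0, 119.5, 130.1, 101.2, 123.0]
        bias, limits, r, rmse = bland_altman(auto, manual)
        print(f"bias={bias:.2f} deg, LoA={limits}, r={r:.3f}, RMSE={rmse:.2f} deg")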

  16. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight into those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components: the internal environment, made up of specific variables within the organization, and the external environment, made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource-related questions, solve all resource management issues, and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplishes a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that differences in performance from one organization to another depend primarily not on differences between fields of activity, but especially on differences between resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  17. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria are considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems, regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed in all building systems against each criterion. The defect severity score of each building system was identified and later multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning; however, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning.
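
    A minimal sketch of the weighted-sum aggregation described above (criterion weights multiplied by per-system defect severity scores and summed into a risk value); the weights and scores below are hypothetical placeholders, not the paper's values.

        # Criterion weights for PC, EA, EO, MC (hypothetical).
        weights = {"PC": 0.40, "EA": 0.25, "EO": 0.20, "MC": 0.15}

        # Normalised defect severity scores per building system and criterion (hypothetical).
        scores = {
            "electrical": {"PC": 0.30, "EA": 0.35, "EO": 0.40, "MC": 0.25},
            "ceiling":    {"PC": 0.10, "EA": 0.15, "EO": 0.20, "MC": 0.30},
            "roofing":    {"PC": 0.25, "EA": 0.20, "EO": 0.15, "MC": 0.20},
        }

        # Risk value of each system = sum over criteria of weight * severity score.
        risk = {
            system: sum(weights[c] * s[c] for c in weights)
            for system, s in scores.items()
        }
        for system, value in sorted(risk.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{system:10s} risk = {value:.3f}")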

  18. Using novel computer-assisted linguistic analysis techniques to assess the timeliness and impact of FP7 Health’s research – a work in progress report

    Energy Technology Data Exchange (ETDEWEB)

    Stanciauskas, V.; Brozaitis, H.; Manola, N.; Metaxas, O.; Galsworthy, M.

    2016-07-01

    This paper presents the ongoing developments of the ex-post evaluation of the Health theme in FP7, which will be finalised in early 2017. The evaluation was launched by DG Research and Innovation, European Commission. Among other questions, the evaluation was asked to assess the structuring effect of FP7 Health on the European Research Area and the timeliness of the research performed. To this end the evaluation team has applied two innovative computer-assisted linguistic analysis techniques to address these questions: dynamic topic modelling and network analysis of co-publications. The topic model built for this evaluation contributed to a comprehensive mapping of FP7 Health's research activities and to the building of a dynamic topic model that had not been attempted in previous evaluations of the Framework Programmes. The network analysis of co-publications proved to be a powerful tool in determining the structuring effect of FP7 Health to a level of detail which was again not implemented in previous evaluations of EU-funded research programmes. (Author)

  19. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting performance assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, the values of which may range over several orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally, owing to its better robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as the input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
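
    A minimal sketch of the kind of Monte Carlo uncertainty propagation described above, summarising the output distribution by its mean, geometric (logarithmic) mean, median and 90th percentile; the placeholder dose model and parameter ranges are hypothetical, not those of the clay repository case.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 10_000

        # Input parameters sampled over several orders of magnitude (log-uniform, hypothetical).
        permeability = 10 ** rng.uniform(-14, -10, n)   # m^2
        solubility   = 10 ** rng.uniform(-8, -4, n)     # mol/L
        retardation  = 10 ** rng.uniform(1, 4, n)       # dimensionless

        # Placeholder dose model standing in for the performance-assessment code.
        dose = 1e6 * permeability * solubility / retardation   # Sv/yr (illustrative only)

        print("arithmetic mean :", dose.mean())
        print("geometric  mean :", np.exp(np.log(dose).mean()))
        print("median          :", np.median(dose))
        print("90th percentile :", np.percentile(dose, 90))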

  20. Analysis and analytical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Batuecas Rodriguez, T [Department of Chemistry and Isotopes, Junta de Energia Nuclear, Madrid (Spain)

    1967-01-01

    The technology associated with the use of organic coolants in nuclear reactors depends to a large extent on the determination and control of their physical and chemical properties, and particularly on the viability, speed, sensitivity, precision and accuracy (depending on the intended usage) of the methods employed in detection and analytical determination. This has led to the study and development of numerous techniques, some specially designed for the extreme conditions involved in working with the types of product in question and others adapted from existing techniques. In the specific case of polyphenyl and hydropolyphenyl mixtures, which have been the principal subjects of study to date and offer the greatest promise, the analytical problems are broadly as follows: composition of the initial product or virgin coolant (composition of macro components and amounts of organic and inorganic impurities); coolant during and after operation (determination of gases and organic compounds produced by pyrolysis and radiolysis, i.e. degradation and polymerization products); control of systems for purifying and regenerating the coolant after use; dissolved pressurization gases; detection of intermediate products during decomposition, which are generally very unstable (free radicals); degree of fouling and film formation (tests to determine potential formation of films); corrosion of structural elements and canning materials; health and safety (toxicity, inflammability and impurities that can be activated). Although some of the above problems are closely interrelated and entail similar techniques, they vary as to degree of difficulty. Another question is the difficulty of distinguishing clearly between techniques for determining physical and physico-chemical properties, on one hand, and analytical techniques on the other. Any classification is therefore somewhat arbitrary (for example, in the case of dosimetry and techniques for determining mean molecular weights or electrical conductivity).

  1. [Progress of Masquelet technique to repair bone defect].

    Science.gov (United States)

    Yin, Qudong; Sun, Zhenzhong; Gu, Sanjun

    2013-10-01

    To summarize the progress of the Masquelet technique for repairing bone defects, the recent literature concerning its application was extensively reviewed. The Masquelet technique involves a two-step procedure. First, bone cement is used to fill the bone defect after a thorough debridement, and an induced membrane structure forms around the spacer; then the bone cement is removed after 6-8 weeks, and rich cancellous bone is implanted into the induced membrane. Massive cortical bone defects are repaired by new bone formation and consolidation. Experiments show that the induced membrane has a vascular system and is also rich in vascular endothelial growth factor, transforming growth factor β1, bone morphogenetic protein 2, and bone progenitor cells, so it has osteoinductive properties. Satisfactory results have been achieved in clinical application to defects at almost all sites, of various types, and of massive extent up to 25 cm long. Compared with other repair methods, the Masquelet technique has the advantages of reliable effect, ease of operation, few complications, low requirements for the recipient site, and wide applicability. The Masquelet technique is an effective method to repair bone defects and is suitable for various types of bone defect, especially those caused by infection and tumor resection.

  2. Progress in diagnostic techniques for SC [superconducting] cavities

    International Nuclear Information System (INIS)

    Reece, C.E.

    1988-01-01

    Despite the very real progress that has been made, the routine performance of superconducting cavities still falls far short of both the theoretical expectations and the performance of a few exceptional examples. It is the task of systematically applied diagnostic techniques to reveal additional information concerning the response of superconducting surfaces to applied RF fields. In this paper we direct our attention to recent developments in diagnostic techniques, such as thermometry in superfluid helium and scanning laser acoustic microscopy. 18 refs., 12 figs.

  3. Evolution of surgical techniques for a progressive risk reduction.

    Science.gov (United States)

    Amato, Bruno; Santoro, Mario; Izzo, Raffaele; Servillo, Giuseppe; Compagna, Rita; Di Domenico, Lorenza; Di Nardo, Veronica; Giugliano, Giuseppe

    2017-07-18

    Advanced age is a strong predictor of high perioperative mortality in surgical patients and patients aged 75 years and older have an elevated surgical risk, much higher than that of younger patients. Progressive advances in surgical techniques now make it possible to treat high-risk surgical patients with minimally invasive procedures. Endovascular techniques have revolutionized the treatment of several vascular diseases, in particular carotid stenosis, aortic pathologies, and severely incapacitating intermittent claudication or critical limb ischemia. The main advantages of the endovascular approach are the low complication rate, high rate of technical success and a good clinical outcome. Biliary stenting has improved the clinical status of severely ill patients with bile duct stricture before major surgery, and represents a good palliative therapy in the case of malignant biliary obstruction.

  4. Progress in wind tunnel experimental techniques for wind turbines

    Institute of Scientific and Technical Information of China (English)

    Jingping XIAO; Li CHEN; Qiang WANG; Qiao WANG

    2016-01-01

    Based on the Unsteady Aerodynamics Experiment (UAE) Phase VI and the Model Experiment in Controlled Conditions (MEXICO) projects, and on related research carried out at the China Aerodynamics Research and Development Center (CARDC), recent progress in wind tunnel experimental techniques for wind turbines is summarized. Measurement techniques commonly used for different types of wind tunnel experiments on wind turbines are reviewed. Important research achievements are discussed, such as wind tunnel disturbance, the equivalence of the airfoil inflow condition, the three-dimensional (3D) effect, the dynamic inflow influence, the flow field structure, and vortex induction. The corresponding research at CARDC and some ideas on large wind turbines are also introduced.

  5. Multivariate analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bendavid, Josh [European Organization for Nuclear Research (CERN), Geneva (Switzerland); Fisher, Wade C. [Michigan State Univ., East Lansing, MI (United States); Junk, Thomas R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2016-01-01

    The end products of experimental data analysis are designed to be simple and easy to understand: hypothesis tests and measurements of parameters. But, the experimental data themselves are voluminous and complex. Furthermore, in modern collider experiments, many petabytes of data must be processed in search of rare new processes which occur together with much more copious background processes that are of less interest to the task at hand. The systematic uncertainties on the background may be larger than the expected signal in many cases. The statistical power of an analysis and its sensitivity to systematic uncertainty can therefore usually both be improved by separating signal events from background events with higher efficiency and purity.
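
    As one illustration of the kind of multivariate signal/background separation discussed here, a gradient-boosted decision tree can be trained on labelled events; this is a generic sketch on synthetic two-feature data (scikit-learn assumed available), not the specific methods covered in the report.

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(1)
        n = 5000

        # Toy events: two discriminating variables with overlapping signal/background.
        signal     = rng.normal(loc=[1.0, 0.5], scale=0.8, size=(n, 2))
        background = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(n, 2))
        X = np.vstack([signal, background])
        y = np.concatenate([np.ones(n), np.zeros(n)])

        X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
        clf.fit(X_train, y_train)

        # Area under the ROC curve as a measure of separation power (efficiency vs purity).
        auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
        print(f"ROC AUC = {auc:.3f}")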

  6. Soil analysis. Modern instrumental technique

    International Nuclear Information System (INIS)

    Smith, K.A.

    1993-01-01

    This book covers traditional methods of analysis and specialist monographs on individual instrumental techniques, which are usually not written with soil or plant analysis specifically in mind. The principles of the techniques are combined with discussions of sample preparation and matrix problems, and critical reviews of applications in soil science and related disciplines. Individual chapters are processed separately for inclusion in the appropriate data bases

  7. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques deals with the characterisation and understanding of the outer layers of substrates: how they react, look and function, which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  8. Interactive reliability analysis project. FY 80 progress report

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Shepherd, J.C.

    1981-03-01

    This report summarizes the progress to date in the interactive reliability analysis project, whose purpose is to develop and demonstrate a reliability and safety technique that can be incorporated early in the design process. Details are illustrated with a simple example of a reactor safety system.

  9. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    Science.gov (United States)

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus for urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment based on historical weather-station data towards dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development progress of ground meteorological observation, thermal infrared remote sensing and numerical simulation. Moreover, the potential advantages and disadvantages, applicability and development trends of these techniques were also summarized, aiming to add fundamental knowledge for understanding urban thermal environment assessment and optimization.

  10. Progress on acoustic techniques for LMFBR structural surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Burton, E J; Bentley, P G; McKnight, J A [RNL, UKAEA, Risley, Warrington, Cheshire (United Kingdom)

    1980-11-01

    Acoustic techniques are being developed to monitor remotely the incipient events of various modes of failure. Topics have been selected from the development programme which are either of special importance or in which significant advances have been made recently. Ultrasonic inspection of stainless steel welds is difficult, and one alternative approach being explored is to identify manufacturing defects during fabrication by monitoring the welding processes. Preliminary measurements are described of the acoustic events recorded during deliberately defective welding tests in the laboratory, and some initial analysis using pattern recognition techniques is described. The assessment of structural failures using probability analysis has emphasised the potential value of continuous monitoring during operation, and this has led to the investigation of vibrational analysis and acoustic emission as monitoring techniques. Mechanical failure from fatigue may be anticipated from measurement of vibrational modes, and experience from PFR and from models has indicated the depth of detailed understanding required to achieve this. In the laboratory a vessel with an artificial defect has been pressurised to failure. Detection of the weak stress wave emissions was possible but difficult, and the prospects for on-line monitoring are discussed. Ultrasonic technology for providing images of components immersed in the opaque sodium of LMFBRs is being developed. Images are formed by the physical scanning of a target using transducers in a pulse-echo mode. Lead zirconate transducers have been developed which can be deployed during reactor shut-down. The first application will be to examine a limited area of the core of PFR. Handling the data from such an experiment involves developing methods for reading and storing the information from each ultrasonic echo. Such techniques have been tested in real time by simulation in a water model. Methods of enhancing the images to be

  11. Progress in realistic LOCA analysis

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Ohkawa, K.

    2004-01-01

    In 1988 the USNRC revised the ECCS rule contained in Appendix K and Section 50.46 of 10 CFR Part 50, which governs the analysis of the Loss Of Coolant Accident (LOCA). The revised regulation allows the use of realistic computer models to calculate the loss of coolant accident. In addition, the new regulation allows the use of high probability estimates of peak cladding temperature (PCT), rather than upper bound estimates. Prior to this modification, the regulations were a prescriptive set of rules which defined what assumptions must be made about the plant initial conditions and how various physical processes should be modeled. The resulting analyses were highly conservative in their prediction of the performance of the ECCS, and placed tight constraints on core power distributions, ECCS set points and functional requirements, and surveillance and testing. These restrictions, if relaxed, will allow for additional economy, flexibility, and in some cases, improved reliability and safety as well. For example, additional economy and operating flexibility can be achieved by implementing several available core and fuel rod designs to increase fuel discharge burnup and reduce neutron flux on the reactor vessel. The benefits of application of best estimate methods to LOCA analyses have typically been associated with reductions in fuel costs, resulting from optimized fuel designs, or increased revenue from power upratings. Fuel cost savings are relatively easy to quantify, and have been estimated at several millions of dollars per cycle for an individual plant. Best estimate methods are also likely to contribute significantly to reductions in O and M costs, although these reductions are more difficult to quantify. Examples of O and M cost reductions are: 1) Delaying equipment replacement. With best estimate methods, LOCA is no longer a factor in limiting power levels for plants with high tube plugging levels or degraded safety injection systems. If other requirements for

  12. Mass Spectrometric C-14 Detection Techniques: Progress Report

    Science.gov (United States)

    Synal, H.

    2013-12-01

    Accelerator mass spectrometry (AMS) has been established as the best-suited radiocarbon detection technique. In the past years, significant progress with AMS instrumentation has been made, resulting in a boom of new AMS facilities around the world. Today, carbon-only AMS systems predominantly utilize the 1+ charge state and molecule destruction by multiple ion-gas collisions in a gas stripper cell. This has made possible a significant simplification of the instruments, a reduction of ion energies and, related to this, a smaller space requirement for the installations. However, state-of-the-art AMS instruments have still not reached a development stage where they can be regarded as table-top systems. In this respect, more development is needed to further advance the applicability of radiocarbon not only in the traditional fields of dating but also in biomedical research and new fields in Earth and environmental sciences. In a proof-of-principle experiment, the feasibility of radiocarbon detection over the entire range of dating applications was demonstrated using a pure mass spectrometer and ion energies below 50 keV. An experimental platform has now been completed to test performance and to explore operation and measurement conditions of purely mass spectrometric radiocarbon detection. This contribution will overview the physical principles which make this development possible and discuss key parameters of the instrumental design and the performance of such an instrument.

  13. Probabilistic Accident Progression Analysis with application to a LMFBR design

    International Nuclear Information System (INIS)

    Jamali, K.M.

    1982-01-01

    A method for probabilistic analysis of accident sequences in nuclear power plant systems, referred to as "Probabilistic Accident Progression Analysis" (PAPA), is described. Distinctive features of PAPA include: (1) definition and analysis of initiator-dependent accident sequences at the component level; (2) a new fault-tree simplification technique; (3) a new technique for assessing the effect of uncertainties in the failure probabilities on the probabilistic ranking of accident sequences; (4) techniques for quantification of dependent failures of similar components, including an iterative technique for high-population components. The methodology is applied to the Shutdown Heat Removal System (SHRS) of the Clinch River Breeder Reactor Plant during its short-term (0-2 …) phase of operation. Major contributors to this probability are the initiators loss of the main feedwater system, loss of offsite power, and normal shutdown.

  14. Bulk analysis using nuclear techniques

    International Nuclear Information System (INIS)

    Borsaru, M.; Holmes, R.J.; Mathew, P.J.

    1983-01-01

    Bulk analysis techniques developed for the mining industry are reviewed. Using penetrating neutron and γ-radiations, measurements are obtained directly from a large volume of sample (3-30 kg). γ-ray techniques were used to determine the grade of iron ore and to detect shale on conveyor belts. Thermal neutron irradiation was developed for the simultaneous determination of iron and aluminium in iron ore on a conveyor belt. Thermal-neutron activation analysis includes the determination of alumina in bauxite, and of manganese and alumina in manganese ore. Fast neutron activation analysis is used to determine silicon in iron ores, and alumina and silica in bauxite. Fast and thermal neutron activation has been used to determine the soil content of shredded sugar cane. (U.K.)

  15. Progress in spatial analysis methods and applications

    CERN Document Server

    Páez, Antonio; Buliung, Ron N; Dall'erba, Sandy

    2010-01-01

    This book brings together developments in spatial analysis techniques, including spatial statistics, econometrics, and spatial visualization, and applications to fields such as regional studies, transportation and land use, population and health.

  16. Glaucoma Monitoring in a Clinical Setting: Glaucoma Progression Analysis vs Nonparametric Progression Analysis in the Groningen Longitudinal Glaucoma Study

    NARCIS (Netherlands)

    Wesselink, Christiaan; Heeg, Govert P.; Jansonius, Nomdo M.

    Objective: To compare prospectively 2 perimetric progression detection algorithms for glaucoma, the Early Manifest Glaucoma Trial algorithm (glaucoma progression analysis [GPA]) and a nonparametric algorithm applied to the mean deviation (MD) (nonparametric progression analysis [NPA]). Methods:

  17. Progress of electromagnetic analysis for fusion reactors

    International Nuclear Information System (INIS)

    Takagi, T.; Ruatto, P.; Boccaccini, L.V.

    1998-01-01

    This paper describes the recent progress of electromagnetic analysis research for fusion reactors, including methods, codes, verification tests and some applications. Driven by the needs of structural design for large tokamak devices since the 1970s, and aided by the introduction of new numerical methods and advances in computer technology, three-dimensional analysis methods have become as practical as shell-approximation methods. Electromagnetic analysis is now applied to the structural design of new fusion reactors. Some further modeling and verification tests are necessary when the codes are applied to new materials with nonlinear material properties. (orig.)

  18. Evolutionary Game Theory Analysis of Tumor Progression

    Science.gov (United States)

    Wu, Amy; Liao, David; Sturm, James; Austin, Robert

    2014-03-01

    Evolutionary game theory applied to two interacting cell populations can yield quantitative predictions of the future densities of the two populations based on the initial interaction terms. We will discuss how, in a complex ecology, evolutionary game theory successfully predicts the future densities of strains of stromal and cancer cells (multiple myeloma), and discuss the possible clinical use of such analysis for predicting cancer progression. Supported by the National Science Foundation and the National Cancer Institute.
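
    A minimal sketch of the underlying idea: replicator dynamics for two interacting cell types, integrated forward from initial fractions and a payoff (interaction) matrix. The payoff values and type labels below are hypothetical, not fitted to the stromal/myeloma experiments.

        import numpy as np

        # Hypothetical payoff matrix A[i][j]: growth benefit to type i from interacting with type j.
        # Types: 0 = stromal, 1 = cancer (myeloma).
        A = np.array([[1.0, 0.6],
                      [1.4, 0.9]])

        def replicator_step(x, A, dt=0.01):
            """One Euler step of the replicator dynamics dx_i/dt = x_i (f_i - mean fitness)."""
            f = A @ x          # fitness of each type given current composition
            phi = x @ f        # population mean fitness
            return x + dt * x * (f - phi)

        x = np.array([0.8, 0.2])   # initial fractions of stromal and cancer cells
        for _ in range(2000):
            x = replicator_step(x, A)
        print("predicted long-run fractions (stromal, cancer):", x)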

  19. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check the stress analysis technique based on 3D models, making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS, working directly from documentation without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (created at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Then, each of the three databases will be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  20. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.
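
    The abstract does not give the formulation itself; as a generic illustration of reducing performance analysis to a well-known mathematical technique, one can fit an ordinary least-squares power-law model of runtime against configuration variables. All feature names and numbers below are hypothetical placeholders, not the report's formulation.

        import numpy as np

        # Hypothetical benchmark records: [OpenMP threads, MPI ranks, problem size (10^6 cells)].
        X = np.array([
            [4,  8, 1.0],
            [8,  8, 1.0],
            [8, 16, 2.0],
            [16, 16, 2.0],
            [16, 32, 4.0],
            [32, 32, 4.0],
        ], dtype=float)
        runtime = np.array([120.0, 70.0, 95.0, 55.0, 80.0, 48.0])  # seconds (made up)

        # Ordinary least squares on log-transformed data: runtime ~ C * threads^a * ranks^b * size^c.
        logX = np.column_stack([np.ones(len(X)), np.log(X)])
        coeffs, *_ = np.linalg.lstsq(logX, np.log(runtime), rcond=None)
        predicted = np.exp(logX @ coeffs)
        print("exponents (threads, ranks, size):", coeffs[1:])
        print("predicted runtimes:", predicted.round(1))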

  1. Progress in radiation protection techniques for workers in the nuclear industry

    International Nuclear Information System (INIS)

    Pradel, J.; Zettwoog, P.; Rouyer, J.L.

    1982-01-01

    The increasingly stringent safety requirements of workers and the general public in the face of occupational and in particular nuclear risks call for continual improvements in radiation protection techniques. The Institute of Protection and Nuclear Safety (IPSN), especially the Technical Protection Services belonging to the Protection Department, and also the various radiation protection services of the French Atomic Energy Commission's nuclear centres and Electricite de France (EDF) are carrying out substantial research and development programmes on the subject. For this reason, IPSN organized a specialists' meeting to take stock of the efforts being made and to try to identify what steps seem most promising or should have priority at the national level. The authors summarize the presentations and discussions on three topics: (1) Progress in the analysis of the mechanism of exposure of workers; (2) Progress achieved from the radiation protection standpoint in the field of facility design and instrumentation; and (3) Application of the optimization principle

  2. Investigation progress of imaging techniques monitoring stem cell therapy

    International Nuclear Information System (INIS)

    Wu Jun; An Rui

    2006-01-01

    Recently, stem cell therapy has shown potential for clinical application in diabetes mellitus, cardiovascular diseases, malignant tumors and trauma. Efficient techniques for non-invasively monitoring stem cell transplants will accelerate the development of stem cell therapies. This paper briefly reviews the clinical practice of stem cell therapy and the monitoring methods, including magnetic resonance and radionuclide imaging, which have been used in stem cell therapy. (authors)

  3. Recent Progress in Electrical Insulation Techniques for HTS Power Apparatus

    Science.gov (United States)

    Hayakawa, Naoki; Kojima, Hiroki; Hanai, Masahiro; Okubo, Hitoshi

    This paper describes the electrical insulation techniques at cryogenic temperatures, i.e. Cryodielectrics, for HTS power apparatus, e.g. HTS power transmission cables, transformers, fault current limiters and SMES. Breakdown and partial discharge characteristics are discussed for different electrical insulation configurations of LN2, sub-cooled LN2, solid, vacuum and their composite insulation systems. Dynamic and static insulation performances with and without taking account of quench in HTS materials are also introduced.

  4. The progress in radiotherapy techniques and its clinical implications

    International Nuclear Information System (INIS)

    Reinfuss, M.; Walasek, T.; Byrski, E.; Blecharz, P.

    2011-01-01

    Three modern radiotherapy techniques were introduced into clinical practice at the onset of the 21st century: stereotactic radiation therapy (SRT), proton therapy and carbon-ion radiotherapy. Our paper summarizes the basic principles of physics, as well as the technical requirements and clinical indications for those techniques. SRT is applied for intracranial diseases (arteriovenous malformations, acoustic nerve neuromas, brain metastases, skull base tumors), and in such cases it is referred to as stereotactic radiosurgery (SRS). Techniques used during SRS include GammaKnife, CyberKnife and dedicated linacs. SRT can also be applied for extracranial disease (non-small cell lung cancer, lung metastases, spinal and perispinal tumors, primary liver tumors, breast cancer, pancreatic tumors, prostate cancer, head and neck tumors), and in such cases it is referred to as stereotactic body radiation therapy (SBRT). Eye melanomas, skull base and cervical spine chordomas and chondrosarcomas, as well as childhood neoplasms, are considered to be the classic indications for proton therapy. Clinical trials are currently being conducted to investigate the usefulness of proton beams in the therapy of non-small cell lung cancer, prostate cancer, head and neck tumors, and primary liver and oesophageal cancer. Carbon-ion radiotherapy is presumed to be more advantageous than proton therapy because of its higher relative biological effectiveness (RBE) and the possibility of real-time control of the irradiated volume under PET visualization. The basic indications for carbon-ion therapy are salivary gland neoplasms, selected types of soft tissue and bone sarcomas, skull base chordomas and chondrosarcomas, paranasal sinus neoplasms, primary liver cancers and inoperable rectal adenocarcinoma recurrences. (authors)

  5. Work in progress. Transcatheter thermal venous occlusion: a new technique

    International Nuclear Information System (INIS)

    Rholl, K.S.; Rysavy, J.A.; Vlodaver, Z.; Cragg, A.H.; Castaneda-Zuniga, W.R.; Amplatz, K.

    1982-01-01

    Diatrizoate (76%) contrast agent heated to 100°C was injected into the veins of dogs and one human volunteer for the nonsurgical occlusion of the vessels. Follow-up venograms and histologic examinations, at intervals varying from one day to four weeks later, revealed thrombosis of the injected veins in all animals. Thrombosis occurred one to five days after injection of the contrast agent. The authors conclude that hot contrast medium is a safe and convenient agent for inducing thrombosis. It is much easier to use than mechanical devices, tissue glues, and plastics, which involve complex procedures and specialized equipment. In contrast to other sclerosing agents, hot contrast agent is rapidly converted into a nonsclerosing agent by cooling. The new technique allows a more controlled thermal injury to the vascular wall and is under fluoroscopic control.

  6. Resonance ionization of sputtered atoms: Progress toward a quantitative technique

    International Nuclear Information System (INIS)

    Calaway, W.F.; Pellin, M.J.; Young, C.E.; Whitten, J.E.; Gruen, D.M.; Coon, S.R.; Texas Univ., Austin, TX; Wiens, R.C.; Burnett, D.S.; Stingeder, G.; Grasserbauer, M.

    1992-01-01

    The combination of RIMS and ion sputtering has been heralded as the ideal means of quantitatively probing the surface of a solid. While several laboratories have demonstrated the extreme sensitivity of combining RIMS with sputtering, less effort has been devoted to the question of accuracy. Using the SARISA instrument developed at Argonne National Laboratory, a number of well-characterized metallic samples have been analyzed. Results from these determinations have been compared with data obtained by several other analytical methods. One significant finding is that impurity measurements down to ppb levels in metal matrices can be made quantitative by employing polycrystalline metal foils as calibration standards. This discovery substantially reduces the effort required for quantitative analysis since a single standard can be used for determining concentrations spanning nine orders of magnitude

  7. Progress in the development of a video-based wind farm simulation technique

    OpenAIRE

    Robotham, AJ

    1992-01-01

    The progress in the development of a video-based wind farm simulation technique is reviewed. While improvements have been achieved in the quality of the composite picture created by combining computer generated animation sequences of wind turbines with background scenes of the wind farm site, extending the technique to include camera movements has proved troublesome.

  8. Reliability analysis techniques in power plant design

    International Nuclear Information System (INIS)

    Chang, N.E.

    1981-01-01

    An overview of reliability analysis techniques is presented as applied to power plant design. The key terms, power plant performance, reliability, availability and maintainability are defined. Reliability modeling, methods of analysis and component reliability data are briefly reviewed. Application of reliability analysis techniques from a design engineering approach to improving power plant productivity is discussed. (author)

  9. [Research Progress of Vitreous Humor Detection Technique on Estimation of Postmortem Interval].

    Science.gov (United States)

    Duan, W C; Lan, L M; Guo, Y D; Zha, L; Yan, J; Ding, Y J; Cai, J F

    2018-02-01

    Estimation of the postmortem interval (PMI) plays a crucial role in forensic study and identification work. Because of its unique anatomical location, vitreous humor is considered useful for estimating PMI, which has aroused interest among scholars, and some research has been carried out. The detection techniques for vitreous humor are constantly being developed and improved and have gradually been applied in forensic science; meanwhile, the study of PMI estimation using vitreous humor is being updated rapidly. This paper reviews various techniques and instruments applied to vitreous humor detection, such as ion selective electrodes, capillary ion analysis, spectroscopy, chromatography, nano-sensing technology, automatic biochemical analysers and flow cytometers, as well as the related research progress on PMI estimation in recent years. In order to provide a research direction for scholars and promote more accurate and efficient application of vitreous humor analysis in PMI estimation, some remaining problems are also analysed in this paper. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  10. The latest progress of TILLING technique and its prospects in irradiation mutation breeding

    International Nuclear Information System (INIS)

    Du Yan; Yu Lixia; Liu Qingfang; Zhou Libin; Li Wenjian

    2011-01-01

    TILLING (Targeting Induced Local Lesions IN Genomes) is a novel, high-throughput and low-cost reverse genetics technique. In recent years, with innovation in mutation screening techniques, the TILLING platform has become more diversified, which makes the operation of the TILLING technique simpler and more rapid. For this reason, it is widely used in crop breeding research. In this paper, we summarize the latest progress of the TILLING technique and discuss the prospect of combining irradiation mutation with high-throughput TILLING in mutation breeding. (authors)

  11. Nuclear analysis techniques and environmental sciences

    International Nuclear Information System (INIS)

    1997-10-01

    31 theses are collected in this book. It introduces molecular activation analysis, micro-PIXE and micro-probe analysis, X-ray fluorescence analysis and accelerator mass spectrometry. The applications of these nuclear analysis techniques in the environmental sciences are presented and reviewed

  12. Analysis of archaeological pieces with nuclear techniques

    International Nuclear Information System (INIS)

    Tenorio, D.

    2002-01-01

    In this work nuclear techniques such as neutron activation analysis, PIXE, X-ray fluorescence analysis, metallography, uranium series dating and Rutherford backscattering, as used in the analysis of archaeological specimens and materials, are described. Some published works and theses on the analysis of different Mexican and Mesoamerican archaeological sites are also cited. (Author)

  13. Chemical analysis by nuclear techniques

    International Nuclear Information System (INIS)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y.

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, sample pretreatment, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of the NIPS, and applications of the NIPS system

  14. Chemical analysis by nuclear techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sohn, S. C.; Kim, W. H.; Park, Y. J.; Song, B. C.; Jeon, Y. S.; Jee, K. Y.; Pyo, H. Y

    2002-01-01

    This state-of-the-art report consists of four parts: production of micro-particles, analysis of boron, the alpha tracking method and development of a neutron induced prompt gamma ray spectroscopy (NIPS) system. The various methods for the production of micro-particles, such as mechanical, electrolysis, chemical and spray methods, are described in the first part. The second part covers sample treatment, separation and concentration, analytical methods, and applications of boron analysis. The third part covers the characteristics of alpha tracks, track detectors, sample pretreatment, neutron irradiation, etching conditions for various detectors, observation of tracks on the detector, etc. The last part covers basic theory, the neutron source, collimator, neutron shields, calibration of the NIPS, and applications of the NIPS system.

  15. Nonlinear Progressive Collapse Analysis Including Distributed Plasticity

    Directory of Open Access Journals (Sweden)

    Mohamed Osama Ahmed

    2016-01-01

    Full Text Available This paper demonstrates the effect of incorporating distributed plasticity in nonlinear analytical models used to assess the potential for progressive collapse of steel framed regular building structures. Emphasis in this paper is on the deformation response under the notionally removed column, in a typical Alternate Path (AP) method. The AP method employed in this paper is based on the provisions of the Unified Facilities Criteria – Design of Buildings to Resist Progressive Collapse, developed and updated by the U.S. Department of Defense [1]. The AP method is often used to assess the potential for progressive collapse of building structures that fall under Occupancy Category III or IV. A case study steel building is used to examine the effect of incorporating distributed plasticity, where moment frames were used on the perimeter as well as in the interior of the three-dimensional structural system. It is concluded that the use of moment resisting frames within the structural system will enhance resistance to progressive collapse through ductile deformation response and that it is conservative to ignore the effects of distributed plasticity in determining peak displacement response under the notionally removed column.
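
    As a hedged illustration of the kind of response assessment discussed above (and not the UFC procedure itself), the energy-balance approximation estimates the peak dynamic displacement after sudden removal of a column as the point where the work done by the supported gravity load equals the area under a nonlinear static pushdown curve. The pushdown data in the Python sketch are hypothetical.

    import numpy as np

    def peak_dynamic_displacement(u, p_static, p_applied):
        """u, p_static: static pushdown curve; p_applied: gravity load above the removed column."""
        # Internal strain energy = area under the pushdown curve (trapezoidal rule).
        internal = np.concatenate(([0.0],
                   np.cumsum(0.5 * (p_static[1:] + p_static[:-1]) * np.diff(u))))
        external = p_applied * u          # work done by the suddenly applied load
        for i in range(1, len(u)):
            if internal[i] >= external[i]:
                return u[i]
        return None                       # the pushdown curve cannot absorb the work

    # Hypothetical elastic-plastic pushdown: 50 kN/mm up to 10 mm, then a 500 kN plateau.
    u = np.linspace(0.0, 40.0, 401)       # mm
    p = np.minimum(50.0 * u, 500.0)       # kN
    print(peak_dynamic_displacement(u, p, p_applied=300.0))   # about 12.5 mm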

  16. Systems analysis department annual progress report 1986

    International Nuclear Information System (INIS)

    Grohnheit, P.E.; Larsen, H.; Vestergaard, N.K.

    1987-02-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1986. The activities may be classified as energy systems analysis and risk and reliability analysis. The report includes a list of staff members. (author)

  17. Visual Progression Analysis of Student Records Data

    OpenAIRE

    Raji, Mohammad; Duggan, John; DeCotes, Blaise; Huang, Jian; Zanden, Bradley Vander

    2017-01-01

    University curricula, both at the campus level and at the per-major level, are affected in a complex way by many decisions of many administrators and faculty over time. As universities across the United States share an urgency to significantly improve student success and retention, there is a pressing need to better understand how the student population is progressing through the curriculum, and how to provide better supporting infrastructure and refine the curriculum for the purpose of ...

  18. Event tree analysis using artificial intelligence techniques

    International Nuclear Information System (INIS)

    Dixon, B.W.; Hinton, M.F.

    1985-01-01

    Artificial Intelligence (AI) techniques used in Expert Systems and Object Oriented Programming are discussed as they apply to Event Tree Analysis. A SeQUence IMPortance calculator, SQUIMP, is presented to demonstrate the implementation of these techniques. Benefits of using AI methods include ease of programming, efficiency of execution, and flexibility of application. The importance of an appropriate user interface is stressed. 5 figs

  19. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  20. Statistical evaluation of vibration analysis techniques

    Science.gov (United States)

    Milner, G. Martin; Miller, Patrice S.

    1987-01-01

    An evaluation methodology is presented for a selection of candidate vibration analysis techniques applicable to machinery representative of the environmental control and life support system of advanced spacecraft; illustrative results are given. Attention is given to the statistical analysis of small sample experiments, the quantification of detection performance for diverse techniques through the computation of probability of detection versus probability of false alarm, and the quantification of diagnostic performance.
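
    The detection-performance quantification mentioned above can be sketched as a probability of detection (POD) versus probability of false alarm (PFA) curve computed from detection statistics of healthy and faulted runs. The example below uses synthetic data and illustrative variable names; it is not the evaluation code from the study.

    import numpy as np

    def pod_pfa_curve(healthy_scores, faulty_scores):
        """POD and PFA as the detection threshold sweeps over all observed scores."""
        thresholds = np.sort(np.concatenate((healthy_scores, faulty_scores)))
        pod = np.array([(faulty_scores >= t).mean() for t in thresholds])
        pfa = np.array([(healthy_scores >= t).mean() for t in thresholds])
        return pfa, pod

    rng = np.random.default_rng(0)
    healthy = rng.normal(0.0, 1.0, 200)    # detection statistic, baseline runs
    faulty = rng.normal(2.0, 1.0, 200)     # same statistic with a seeded fault
    pfa, pod = pod_pfa_curve(healthy, faulty)
    print(pod[pfa <= 0.05][0])             # best POD achievable at roughly 5% false alarms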

  1. Single-molecule techniques in biophysics: a review of the progress in methods and applications

    Science.gov (United States)

    Miller, Helen; Zhou, Zhaokun; Shepherd, Jack; Wollman, Adam J. M.; Leake, Mark C.

    2018-02-01

    Single-molecule biophysics has transformed our understanding of biology, but also of the physics of life. More exotic than simple soft matter, biomatter lives far from thermal equilibrium, covering multiple lengths from the nanoscale of single molecules to up to several orders of magnitude higher in cells, tissues and organisms. Biomolecules are often characterized by underlying instability: multiple metastable free energy states exist, separated by levels of just a few multiples of the thermal energy scale k B T, where k B is the Boltzmann constant and T absolute temperature, implying complex inter-conversion kinetics in the relatively hot, wet environment of active biological matter. A key benefit of single-molecule biophysics techniques is their ability to probe heterogeneity of free energy states across a molecular population, too challenging in general for conventional ensemble average approaches. Parallel developments in experimental and computational techniques have catalysed the birth of multiplexed, correlative techniques to tackle previously intractable biological questions. Experimentally, progress has been driven by improvements in sensitivity and speed of detectors, and the stability and efficiency of light sources, probes and microfluidics. We discuss the motivation and requirements for these recent experiments, including the underpinning mathematics. These methods are broadly divided into tools which detect molecules and those which manipulate them. For the former we discuss the progress of super-resolution microscopy, transformative for addressing many longstanding questions in the life sciences, and for the latter we include progress in ‘force spectroscopy’ techniques that mechanically perturb molecules. We also consider in silico progress of single-molecule computational physics, and how simulation and experimentation may be drawn together to give a more complete understanding. Increasingly, combinatorial techniques are now used, including
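
    A small numerical illustration of the energy-scale point made above: for two metastable states separated by a free-energy gap of a few k B T, the equilibrium occupancy ratio follows the Boltzmann factor. The sketch below is a generic illustration, not part of the reviewed methods.

    import math

    def occupancy_ratio(free_energy_gap_in_kbt):
        """Equilibrium population of the higher state relative to the lower one."""
        return math.exp(-free_energy_gap_in_kbt)

    for gap in (1, 2, 5):
        print(f"gap = {gap} kBT -> relative occupancy = {occupancy_ratio(gap):.3f}")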

  2. Constrained principal component analysis and related techniques

    CERN Document Server

    Takane, Yoshio

    2013-01-01

    In multivariate data analysis, regression techniques predict one set of variables from another while principal component analysis (PCA) finds a subspace of minimal dimensionality that captures the largest variability in the data. How can regression analysis and PCA be combined in a beneficial way? Why and when is it a good idea to combine them? What kind of benefits are we getting from them? Addressing these questions, Constrained Principal Component Analysis and Related Techniques shows how constrained PCA (CPCA) offers a unified framework for these approaches.The book begins with four concre
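
    A minimal sketch of the basic constrained-PCA idea, assuming the simplest setting in which the data are first regressed on an external design matrix and PCA (via the singular value decomposition) is then applied separately to the fitted and residual parts. Matrix names and dimensions are illustrative, and the code does not reproduce the book's full framework.

    import numpy as np

    def constrained_pca(Z, G, n_components=2):
        """Z: n x p column-centred data; G: n x q external (constraint) variables."""
        B, *_ = np.linalg.lstsq(G, Z, rcond=None)   # least-squares regression of Z on G
        fitted = G @ B                              # part of Z explained by G
        residual = Z - fitted                       # part unrelated to G

        def pca(X):                                 # PCA of a matrix via the SVD
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            return U[:, :n_components] * s[:n_components], Vt[:n_components]

        return pca(fitted), pca(residual)

    rng = np.random.default_rng(1)
    G = rng.normal(size=(50, 2))
    Z = G @ rng.normal(size=(2, 6)) + 0.1 * rng.normal(size=(50, 6))
    (constrained_scores, _), (residual_scores, _) = constrained_pca(Z - Z.mean(axis=0), G)
    print(constrained_scores.shape, residual_scores.shape)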

  3. Risk Analysis Group annual progress report 1984

    International Nuclear Information System (INIS)

    1985-06-01

    The activities of the Risk Analysis Group at Risoe during 1984 are presented. These include descriptions in some detail of work on general development topics and risk analysis performed as contractor. (author)

  4. THEMATIC PROGRESSION PATTERN : A TECHNIQUE TO IMPROVE STUDENTS’ WRITING SKILL VIEWED FROM WRITING APPREHENSION

    Directory of Open Access Journals (Sweden)

    Fitri Nurdianingsih

    2017-10-01

    Full Text Available The objective of conducting this research was to find out: (1) whether or not the use of the thematic progression pattern is more effective than direct instruction in teaching writing to second-semester students at the English Education Department; (2) whether students who have low writing apprehension have better writing skill than those who have high writing apprehension; and (3) whether there is an interaction between teaching technique and writing apprehension in teaching writing skill. This research used an experimental research design. The population of this research was the second-semester students at the English Education Department of IKIP PGRI Bojonegoro, and the sample was selected using cluster random sampling. The instruments of data collection were a writing test and a writing apprehension questionnaire. The findings of this study are: (1) the thematic progression pattern is more effective than direct instruction in teaching writing; (2) students who have low writing apprehension have better writing skill than those who have high writing apprehension; and (3) there is an interaction between teaching technique and writing apprehension in teaching writing skill. It can be summarized that the thematic progression pattern is an effective technique for teaching writing skill to second-semester students of the English Education Department at IKIP PGRI Bojonegoro. The effectiveness of the technique is affected by writing apprehension.

  5. Elemental analysis techniques using proton microbeam

    International Nuclear Information System (INIS)

    Sakai, Takuro; Oikawa, Masakazu; Sato, Takahiro

    2005-01-01

    Proton microbeam is a powerful tool for two-dimensional elemental analysis. The analysis is based on the Particle Induced X-ray Emission (PIXE) and Particle Induced Gamma-ray Emission (PIGE) techniques. The paper outlines the principles and instruments, and describes the dental application that has been carried out at JAERI Takasaki. (author)

  6. Systems Analysis Department annual progress report 1998

    DEFF Research Database (Denmark)

    1999-01-01

    The report describes the work of the Systems Analysis Department at Risø National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine Interaction, and Technology Scenarios. The report includes lists of publications, lectures, committees and staff members.

  7. Systems Analysis Department annual progress report 1999

    DEFF Research Database (Denmark)

    2000-01-01

    This report describes the work of the Systems Analysis Department at Risø National Laboratory during 1999. The department is undertaking research within Energy Systems Analysis, Energy, Environment and Development Planning - UNEP Centre, Safety, Reliability and Human Factors, and Technology Scenarios.

  8. Systems Analysis Department. Annual Progress Report 1999

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif [eds.

    2000-03-01

    This report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1999. The department is undertaking research within Energy Systems Analysis, Energy, Environment and Development Planning-UNEP Centre, Safety, Reliability and Human Factors, and Technology Scenarios. The report includes summary statistics and lists of publications, committees and staff members. (au)

  9. Systems Analysis department. Annual progress report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Petersen, Kurt E

    1998-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1997. The department is undertaking research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 110 refs.

  10. Systems Analysis Department annual progress report 1998

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, Hans; Olsson, Charlotte; Loevborg, Leif [eds.

    1999-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1998. The department undertakes research within Energy Systems Analysis, Integrated Energy, Environment and Development Planning - UNEP Centre, Industrial Safety and Reliability, Man/Machine Interaction and Technology Scenarios. The report includes lists of publications, lectures, committees and staff members. (au) 111 refs.

  11. Progress report on the AMT analysis

    International Nuclear Information System (INIS)

    Wood, B.

    1992-01-01

    ICF Resources Incorporated's analysis of the Alternative Minimum Tax (AMT) has examined its effect on the US oil and gas industry from several different perspectives, to estimate the effect of the three relief proposals and to better understand the source of the outcry about the AMT's "inequities." This report is a brief summary of the methodology and results to date. The complexity of the accounting mechanisms that comprise the AMT, and the disparity between these analytical conclusions and claims made by the oil and gas industry (principally the IPAA), have led this analysis through several distinct phases: project-level analysis; firm-level analysis; and demographic analysis. These analyses are described in detail

  12. Techniques for sensitivity analysis of SYVAC results

    International Nuclear Information System (INIS)

    Prust, J.O.

    1985-05-01

    Sensitivity analysis techniques may be required to examine the sensitivity of SYVAC model predictions to the input parameter values, the subjective probability distributions assigned to the input parameters and to the relationship between dose and the probability of fatal cancers plus serious hereditary disease in the first two generations of offspring of a member of the critical group. This report mainly considers techniques for determining the sensitivity of dose and risk to the variable input parameters. The performance of a sensitivity analysis technique may be improved by decomposing the model and data into subsets for analysis, making use of existing information on sensitivity and concentrating sampling in regions of the parameter space that generate high doses or risks. A number of sensitivity analysis techniques are reviewed for their application to the SYVAC model, including four techniques tested in an earlier study by CAP Scientific for the SYVAC project. This report recommends developing now a method for evaluating the derivative of dose with respect to parameter value, and extending the Kruskal-Wallis technique to test for interactions between parameters. It is also recommended that the sensitivity of the output of each sub-model of SYVAC to input parameter values should be examined. (author)
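
    Two of the screening ideas mentioned above can be illustrated with generic statistics: ranking sampled input parameters by their rank correlation with the output dose, and a Kruskal-Wallis test of whether the dose distribution differs across sub-ranges of one parameter. The sketch below uses synthetic data and is not the SYVAC code.

    import numpy as np
    from scipy.stats import kruskal, spearmanr

    rng = np.random.default_rng(0)
    params = rng.uniform(size=(500, 3))      # three sampled input parameters
    dose = 5.0 * params[:, 0] ** 2 + 0.2 * params[:, 2] + rng.normal(0.0, 0.1, 500)

    # Rank each parameter by the strength of its monotone relation with dose.
    for i in range(params.shape[1]):
        rho, p_value = spearmanr(params[:, i], dose)
        print(f"parameter {i}: Spearman rho = {rho:+.2f} (p = {p_value:.2g})")

    # Kruskal-Wallis test: does dose differ between low/medium/high thirds of parameter 0?
    groups = [dose[(params[:, 0] >= lo) & (params[:, 0] < hi)]
              for lo, hi in ((0.0, 1 / 3), (1 / 3, 2 / 3), (2 / 3, 1.01))]
    print(kruskal(*groups))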

  13. Flow analysis techniques for phosphorus: an overview.

    Science.gov (United States)

    Estela, José Manuel; Cerdà, Víctor

    2005-04-15

    A bibliographical review on the implementation and the results obtained in the use of different flow analytical techniques for the determination of phosphorus is carried out. The sources, occurrence and importance of phosphorus, together with several aspects regarding the analysis and terminology used in the determination of this element, are briefly described. A classification as well as a brief description of the basis, advantages and disadvantages of the different existing flow techniques, namely segmented flow analysis (SFA), flow injection analysis (FIA), sequential injection analysis (SIA), all injection analysis (AIA), batch injection analysis (BIA), multicommutated FIA (MCFIA), multisyringe FIA (MSFIA) and multipumped FIA (MPFIA), is also given. The most relevant manuscripts regarding the analysis of phosphorus by means of flow techniques are classified here according to the instrumental detection technique used, with the aim of facilitating their study and obtaining an overall scope. Finally, the analytical characteristics of numerous flow methods reported in the literature are provided in the form of a table, and their applicability to samples with different matrixes, namely water samples (marine, river, estuarine, waste, industrial, drinking, etc.), soils leachates, plant leaves, toothpaste, detergents, foodstuffs (wine, orange juice, milk), biological samples, sugars, fertilizer, hydroponic solutions, soils extracts and cyanobacterial biofilms, is tabulated.

  14. Quality assurance techniques for activation analysis

    International Nuclear Information System (INIS)

    Becker, D.A.

    1984-01-01

    The principles and techniques of quality assurance are applied to the measurement method of activation analysis. Quality assurance is defined to include quality control and quality assessment. Plans for quality assurance include consideration of: personnel; facilities; analytical design; sampling and sample preparation; the measurement process; standards; and documentation. Activation analysis concerns include: irradiation; chemical separation; counting/detection; data collection, and analysis; and calibration. Types of standards discussed include calibration materials and quality assessment materials

  15. Progress in nuclear measuring and experimental techniques by application of microelectronics. 1

    International Nuclear Information System (INIS)

    Meiling, W.

    1984-01-01

    In the past decade considerable progress has been made in nuclear measuring and experimental techniques by developing position-sensitive detector systems and widely using integrated circuits and microcomputers for data acquisition and processing as well as for automation of measuring processes. In this report which will be published in three parts those developments are reviewed and demonstrated on selected examples. After briefly characterizing microelectronics, the use of microelectronic elements for radiation detectors is reviewed. (author)

  16. A numerical technique for reactor subchannel analysis

    International Nuclear Information System (INIS)

    Fath, Hassan E.S.

    1983-01-01

    A numerical technique is developed for the solution of the transient boundary layer equations with a moving liquid-vapour interface boundary. The technique uses the finite difference method with the velocity components defined over an Eulerian mesh. A system of interface massless markers is defined where the markers move with the flow field according to a simple kinematic relation between the interface geometry and the fluid velocity. Different applications of nuclear engineering interest are reported with some available results. The present technique is capable of predicting the interface profile near the wall which is important in the reactor subchannel analysis
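
    The kinematic marker update described above can be sketched in one dimension: each massless interface marker is advanced with the fluid velocity interpolated from the fixed Eulerian mesh, dx/dt = u(x). The velocity field, time step and names below are illustrative assumptions.

    import numpy as np

    def advance_markers(marker_x, mesh_x, mesh_u, dt):
        """One explicit step: move each marker with the locally interpolated velocity."""
        u_at_markers = np.interp(marker_x, mesh_x, mesh_u)
        return marker_x + dt * u_at_markers

    mesh_x = np.linspace(0.0, 1.0, 101)
    mesh_u = 0.5 + 0.2 * np.sin(2.0 * np.pi * mesh_x)    # hypothetical velocity field
    markers = np.array([0.25, 0.50, 0.75])               # interface marker positions
    for _ in range(10):
        markers = advance_markers(markers, mesh_x, mesh_u, dt=0.01)
    print(markers)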

  17. Systems Analysis Department. Annual progress report 1996

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, H; Olsson, C; Petersen, K E [eds.

    1997-03-01

    The report describes the work of the Systems Analysis Department at Risoe National Laboratory during 1996. The department is undertaking research within Simulation and Optimisation of Energy Systems, Energy and Environment in Developing Countries - UNEP Centre, Integrated Environmental and Risk Management and Man/Machine Interaction. The report includes lists of publications, lectures, committees and staff members. (au) 131 refs.

  18. Dynamics and vibrations progress in nonlinear analysis

    CERN Document Server

    Kachapi, Seyed Habibollah Hashemi

    2014-01-01

    Dynamical and vibratory systems are basically an application of mathematics and applied sciences to the solution of real world problems. Before being able to solve real world problems, it is necessary to carefully study dynamical and vibratory systems and solve all available problems in case of linear and nonlinear equations using analytical and numerical methods. It is of great importance to study nonlinearity in dynamics and vibration; because almost all applied processes act nonlinearly, and on the other hand, nonlinear analysis of complex systems is one of the most important and complicated tasks, especially in engineering and applied sciences problems. There are probably a handful of books on nonlinear dynamics and vibrations analysis. Some of these books are written at a fundamental level that may not meet ambitious engineering program requirements. Others are specialized in certain fields of oscillatory systems, including modeling and simulations. In this book, we attempt to strike a balance between th...

  19. Gold analysis by the gamma absorption technique

    International Nuclear Information System (INIS)

    Kurtoglu, Arzu; Tugrul, A.B.

    2003-01-01

    Gold (Au) analyses are generally performed using destructive techniques. In this study, the Gamma Absorption Technique has been employed for gold analysis. A series of different gold alloys of known gold content were analysed and a calibration curve was obtained. This curve was then used for the analysis of unknown samples. Gold analyses can be made non-destructively, easily and quickly by the gamma absorption technique. The mass attenuation coefficients of the alloys were measured around the K-shell absorption edge of Au. Theoretical mass attenuation coefficient values were obtained using the WinXCom program and comparison of the experimental results with the theoretical values showed generally good and acceptable agreement
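
    The measurement rests on the exponential attenuation law I = I0 exp(-mu_m rho t), from which the mass attenuation coefficient mu_m can be extracted and compared with the calibration curve. The numbers in the sketch below are hypothetical and are not taken from the study.

    import math

    def mass_attenuation_coefficient(i_incident, i_transmitted, density_g_cm3, thickness_cm):
        """mu_m in cm^2/g from the measured transmission through a foil of known thickness."""
        return math.log(i_incident / i_transmitted) / (density_g_cm3 * thickness_cm)

    mu_m = mass_attenuation_coefficient(i_incident=1.0e5, i_transmitted=2.3e4,
                                        density_g_cm3=15.0, thickness_cm=0.10)
    print(f"mu_m = {mu_m:.3f} cm^2/g")   # compared against the Au calibration curve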

  20. Sensitivity analysis of hybrid thermoelastic techniques

    Science.gov (United States)

    W.A. Samad; J.M. Considine

    2017-01-01

    Stress functions have been used as a complementary tool to support experimental techniques, such as thermoelastic stress analysis (TSA) and digital image correlation (DIC), in an effort to evaluate the complete and separate full-field stresses of loaded structures. The need for such coupling between experimental data and stress functions is due to the fact that...

  1. Organic analysis progress report FY 1997

    Energy Technology Data Exchange (ETDEWEB)

    Clauss, S.A.; Grant, K.E.; Hoopes, V.; Mong, G.M.; Steele, R.; Bellofatto, D.; Sharma, A.

    1998-04-01

    The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify the amount of particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at Pacific Northwest National Laboratory (PNNL) and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies are discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivatization gas chromatography/mass spectrometry (GC/MS) and GC/FID for quantitation, ion-pair chromatography (IPC), IC, and the cation exchange procedure for reducing the radioactivity of samples. The documentation of analytical procedures is included here and discussed in Section 6.0, and Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples.

  2. Organic analysis progress report FY 1997

    International Nuclear Information System (INIS)

    Clauss, S.A.; Grant, K.E.; Hoopes, V.; Mong, G.M.; Steele, R.; Bellofatto, D.; Sharma, A.

    1998-04-01

    The Organic Analysis and Methods Development Task is being conducted by Pacific Northwest National Laboratory (PNNL) as part of the Organic Tank Waste Safety Project. The objective of the task is to apply developed analytical methods to identify and/or quantify the amount of particular organic species in tank wastes. In addition, this task provides analytical support for the Gas Generation Studies Task, Waste Aging, and Solubility Studies. This report presents the results from analyses of tank waste samples archived at Pacific Northwest National Laboratory (PNNL) and received from the Project Hanford Management Contractor (PHMC), which included samples associated with both the Flammable Gas and Organic Tank Waste Safety Programs. The data are discussed in Section 2.0. In addition, the results of analytical support for analyzing (1) simulated wastes for Waste Aging, (2) tank waste samples for Gas Generation, and (3) simulated wastes associated with solubility studies are discussed in Sections 3.0, 4.0, and 5.0, respectively. The latter part of FY 1997 was devoted to documenting the analytical procedures, including derivatization gas chromatography/mass spectrometry (GC/MS) and GC/FID for quantitation, ion-pair chromatography (IPC), IC, and the cation exchange procedure for reducing the radioactivity of samples. The documentation of analytical procedures is included here and discussed in Section 6.0, and Section 7.0 discusses other analytical procedures. The references are listed in Section 8.0 and future plans are discussed in Section 9.0. Appendix A is a preprint of a manuscript accepted for publication. Appendix B contains the cc mail messages and chain-of-custody forms for the samples received for analyses. Appendix C contains the test plan for analysis of tank waste samples

  3. Microextraction sample preparation techniques in biomedical analysis.

    Science.gov (United States)

    Szultka, Malgorzata; Pomastowski, Pawel; Railean-Plugaru, Viorica; Buszewski, Boguslaw

    2014-11-01

    Biologically active compounds are found in biological samples at relatively low concentration levels. The sample preparation of target compounds from biological, pharmaceutical, environmental, and food matrices is one of the most time-consuming steps in the analytical procedure, and microextraction techniques are dominant here. Metabolomic studies also require the application of a proper analytical technique for the determination of endogenous metabolites present in the biological matrix at trace concentration levels. Owing to the reproducibility of data, precision, relatively low cost of the appropriate analysis, simplicity of the determination, and the possibility of directly combining these techniques with other methods (both on-line and off-line), they have become the most widespread in routine determinations. Additionally, sample pretreatment procedures have to be more selective, cheap, quick, and environmentally friendly. This review summarizes the current achievements and applications of microextraction techniques. The main aim is to deal with the utilization of different types of sorbents for microextraction and emphasize the use of newly synthesized sorbents, as well as to bring together studies concerning the systematic approach to method development. This review is dedicated to the description of microextraction techniques and their application in biomedical analysis. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. CRDM motion analysis using machine learning technique

    International Nuclear Information System (INIS)

    Nishimura, Takuya; Nakayama, Hiroyuki; Saitoh, Mayumi; Yaguchi, Seiji

    2017-01-01

    The magnetic jack type Control Rod Drive Mechanism (CRDM) for pressurized water reactor (PWR) plants operates control rods in response to electrical signals from the reactor control system. CRDM operability is evaluated by quantifying the armature's closed/opened response times, i.e. the interval between the coil energizing/de-energizing points and the armature closed/opened points. MHI has already developed an automatic CRDM motion analysis and applied it to actual plants. However, CRDM operational data show wide variation depending on characteristics such as plant condition, plant, etc. In the existing motion analysis there is an issue of analysis accuracy when a single analysis technique is applied to all plant conditions, plants, etc. In this study, MHI investigated motion analysis using machine learning (Random Forests), which flexibly accommodates CRDM operational data with wide variation and improves analysis accuracy. (author)
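
    A hedged sketch of the general idea (not the MHI implementation): a Random Forest regressor is trained to map waveform-derived features to the armature response time, so that the model can accommodate data from different plants and plant conditions. The features and data below are entirely synthetic placeholders.

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 400
    # Hypothetical waveform features: e.g. peak current, dip depth, current slope, plant-condition code.
    features = rng.normal(size=(n, 4))
    response_ms = 40.0 + 5.0 * features[:, 0] - 3.0 * features[:, 1] + rng.normal(0.0, 1.0, n)

    X_train, X_test, y_train, y_test = train_test_split(features, response_ms, random_state=0)
    model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
    print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))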

  5. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface are of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and the monitoring of existing roads are therefore widely carried out to control geometric parameters and detect defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions of concern in road surface analysis varies greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of the road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, allowing the characteristics of the road texture to be analysed and the pavement behaviour to be monitored. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  6. Progress of nuclide tracing technique in the study of soil erosion in recent decade

    International Nuclear Information System (INIS)

    Liu Gang; Yang Mingyi; Liu Puling; Tian Junliang

    2007-01-01

    In the last decade the nuclide tracing technique has been widely employed in the investigation of soil erosion, which has brought soil erosion studies into a new period of rapid development. This paper reviews the recent progress in the use of 137 Cs, 210 Pb ex , 7 Be, composite tracers and REE-INAA in studies of soil erosion rate, sedimentation rate, sediment source and soil erosion processes, together with the existing research results. Trends for future development and open questions are also discussed. (authors)

  7. Diffraction analysis of customized illumination technique

    Science.gov (United States)

    Lim, Chang-Moon; Kim, Seo-Min; Eom, Tae-Seung; Moon, Seung Chan; Shin, Ki S.

    2004-05-01

    Various enhancement techniques such as alternating PSM, chrome-less phase lithography, double exposure, etc. have been considered as driving forces to push the production k1 factor below 0.35. Among them, layer-specific optimization of the illumination mode, the so-called customized illumination technique, has recently received considerable attention from lithographers. A new approach to illumination customization based on diffraction spectrum analysis is suggested in this paper. The illumination pupil is divided into various diffraction domains by comparing the similarity of the confined diffraction spectrum. The singular imaging property of each diffraction domain makes it easier to build and understand the customized illumination shape. By comparing the goodness of the image in each domain, it was possible to achieve the customized illumination shape. With the help of this technique, it was found that a layout change would not change the shape of the customized illumination mode.

  8. Fault tree analysis: concepts and techniques

    International Nuclear Information System (INIS)

    Fussell, J.B.

    1976-01-01

    Concepts and techniques of fault tree analysis have been developed over the past decade and predictions from this type of analysis are now important considerations in the design of many systems such as aircraft, ships and their electronic systems, missiles, and nuclear reactor systems. Routine, hardware-oriented fault tree construction can be automated; however, considerable effort is needed in this area to get the methodology into production status. When this status is achieved, the entire analysis of hardware systems will be automated except for the system definition step. Automated analysis is not undesirable; on the contrary, when verified on adequately complex systems, automated analysis could well become a routine analysis. It could also provide an excellent start for a more in-depth fault tree analysis that includes environmental effects, common mode failure, and human errors. The automated analysis is extremely fast and frees the analyst from the routine hardware-oriented fault tree construction, as well as eliminating logic errors and errors of oversight in this part of the analysis. Automated analysis thus affords the analyst a powerful tool, allowing his prime efforts to be devoted to unearthing more subtle aspects of the modes of failure of the system
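
    The quantitative core of a fault tree evaluation can be illustrated by combining independent basic-event probabilities through AND and OR gates up to the top event. The small example tree below is hypothetical and does not represent any particular system.

    def and_gate(*probabilities):
        """All inputs must fail (independent events)."""
        result = 1.0
        for p in probabilities:
            result *= p
        return result

    def or_gate(*probabilities):
        """At least one input fails (independent events)."""
        result = 1.0
        for p in probabilities:
            result *= (1.0 - p)
        return 1.0 - result

    pump_fails, valve_fails, operator_error = 1.0e-3, 5.0e-4, 1.0e-2
    top_event = or_gate(and_gate(pump_fails, valve_fails), operator_error)
    print(f"top event probability = {top_event:.3e}")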

  9. Applications of neutron activation analysis technique

    International Nuclear Information System (INIS)

    Jonah, S. A.

    2000-07-01

    The technique was developed as far back as 1936 by G. Hevesy and H. Levy for the analysis of Dy using an isotopic source. Approximately 40 elements can be analyzed by the instrumental neutron activation analysis (INAA) technique with neutrons from a nuclear reactor. By applying radiochemical separation, the number of elements that can be analysed may be increased to almost 70. Compared with other analytical methods used in environmental and industrial research, NAA has some unique features. These are multi-element capability, rapidity, reproducibility of results, complementarity to other methods, freedom from the analytical blank and independence of the chemical state of the elements. There are several types of neutron sources, namely nuclear reactors, accelerator-based and radioisotope-based sources, but nuclear reactors with high fluxes of neutrons from the fission of 235 U give the most intense irradiation, and hence the highest available sensitivities for NAA. In this paper, the applications of NAA of socio-economic importance are discussed. The benefits of using NAA and related nuclear techniques for on-line applications in industrial process control are highlighted. A brief description of the NAA set-ups at CERT is given. Finally, NAA is compared with other leading analytical techniques
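
    The sensitivity of NAA follows from the standard activation relation A = N sigma phi (1 - exp(-lambda t_irr)), the induced activity after irradiating N target atoms of cross section sigma in a flux phi for a time t_irr. The sketch below evaluates this relation for an illustrative sodium sample; the nuclear data are approximate textbook values.

    import math

    AVOGADRO = 6.02214076e23

    def induced_activity(mass_g, molar_mass, abundance, sigma_barn, flux_n_cm2_s, half_life_s, t_irr_s):
        """Activity (decays/s) of the activation product at the end of irradiation."""
        n_atoms = (mass_g / molar_mass) * abundance * AVOGADRO
        decay_constant = math.log(2.0) / half_life_s
        sigma_cm2 = sigma_barn * 1.0e-24
        return n_atoms * sigma_cm2 * flux_n_cm2_s * (1.0 - math.exp(-decay_constant * t_irr_s))

    # Roughly: 1 microgram of natural sodium (23Na, sigma ~ 0.53 b, 24Na half-life ~ 15 h)
    # irradiated for 1 h in a 1e13 n/cm^2/s thermal flux.
    print(induced_activity(1.0e-6, 23.0, 1.0, 0.53, 1.0e13, 15.0 * 3600.0, 3600.0))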

  10. Chromatographic Techniques for Rare Earth Elements Analysis

    Science.gov (United States)

    Chen, Beibei; He, Man; Zhang, Huashan; Jiang, Zucheng; Hu, Bin

    2017-04-01

    The present capability of rare earth element (REE) analysis has been achieved by the development of two instrumental techniques. The efficiency of spectroscopic methods was extraordinarily improved for the detection and determination of REE traces in various materials. On the other hand, the determination of REEs very often depends on the preconcentration and separation of REEs, and chromatographic techniques are very powerful tools for the separation of REEs. By coupling with sensitive detectors, many ambitious analytical tasks can be fulfilled. Liquid chromatography is the most widely used technique. Different combinations of stationary phases and mobile phases could be used in ion exchange chromatography, ion chromatography, ion-pair reverse-phase chromatography and some other techniques. The application of gas chromatography is limited because only volatile compounds of REEs can be separated. Thin-layer and paper chromatography are techniques that cannot be directly coupled with suitable detectors, which limit their applications. For special demands, separations can be performed by capillary electrophoresis, which has very high separation efficiency.

  11. Development and application of the electrochemical etching technique. Annual progress report

    International Nuclear Information System (INIS)

    1980-08-01

    This annual progress report documents further advances in the development and application of electrochemical etching of polycarbonate foils (ECEPF) for fast, intermediate, and thermal neutron dosimetry as well as alpha particle dosimetry. The fast (> 1.1 MeV) and thermal neutron dosimetry techniques were applied to a thorough investigation of the neutron contamination inherent in and about the primary x-ray beam of several medical therapy electron accelerators. Because of the small size of ECEPF dosimeters in comparison to other neutron meters, they have an unusually low perturbation of the radiation field under measurement. Due to this small size and the increased sensitivity of the ECEPF dosimeter over current techniques of measuring neutrons in a high photon field, the fast neutron contamination in the primary x-ray beam of all the investigated accelerators was measured with precision and found to be greater than that suggested by the other, more common, neutron dosimetry methods

  12. Development and application of the electrochemical etching technique. Annual progress report

    Energy Technology Data Exchange (ETDEWEB)

    1980-08-01

    This annual progress report documents further advances in the development and application of electrochemical etching of polycarbonate foils (ECEPF) for fast, intermediate, and thermal neutron dosimetry as well as alpha particle dosimetry. The fast (> 1.1 MeV) and thermal neutron dosimetry techniques were applied to a thorough investigation of the neutron contamination inherent in and about the primary x-ray beam of several medical therapy electron accelerators. Because of the small size of ECEPF dosimeters in comparison to other neutron meters, they have an unusually low perturbation of the radiation field under measurement. Due to this small size and the increased sensitivity of the ECEPF dosimeter over current techniques of measuring neutrons in a high photon field, the fast neutron contamination in the primary x-ray beam of all the investigated accelerators was measured with precision and found to be greater than that suggested by the other, more common, neutron dosimetry methods.

  13. Artificial Intelligence techniques for big data analysis

    OpenAIRE

    Aditya Khatri

    2017-01-01

    During my stay in Salamanca (Spain), I was fortunate enough to participate in the BISITE Research Group of the University of Salamanca. The University of Salamanca is the oldest university in Spain and in 2018 it celebrates its 8th centenary. As a computer science researcher, I participated in one of the many international projects that the research group has active, especially in big data analysis using Artificial Intelligence (AI) techniques. AI is one of BISITE's main lines of rese...

  14. Applications Of Binary Image Analysis Techniques

    Science.gov (United States)

    Tropf, H.; Enderle, E.; Kammerer, H. P.

    1983-10-01

    After discussing the conditions under which binary image analysis techniques can be used, three new applications of the fast binary image analysis system S.A.M. (Sensorsystem for Automation and Measurement) are reported: (1) The human viewing direction is measured at TV frame rate while the subject's head remains freely movable. (2) Industrial parts hanging on a moving conveyor are classified prior to spray painting by a robot. (3) In automotive wheel assembly, the eccentricity of the wheel is minimized by turning the tyre relative to the rim in order to balance the eccentricity of the components.
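
    A typical binary image analysis step, thresholding a grey-level image and labelling the connected components, can be sketched as follows. This is a generic software illustration and does not model the S.A.M. hardware; the image and threshold are synthetic.

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    image = rng.normal(0.0, 1.0, (64, 64))    # noisy grey-level background
    image[10:20, 10:20] += 8.0                # two bright "parts"
    image[40:55, 30:50] += 8.0

    binary = image > 4.5                      # global threshold -> binary image
    labels, n_objects = ndimage.label(binary) # connected-component labelling
    sizes = ndimage.sum(binary, labels, index=range(1, n_objects + 1))
    print(n_objects, sizes)                   # the seeded regions and their pixel areas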

  15. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  16. The development of human behavior analysis techniques

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang.

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs

  17. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, a study of man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess the operator's physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information and, for 79 cases induced by human errors, time-lined man-machine interactions. INSTEC, a database system for our analysis results, was developed. MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  18. A hybrid online scheduling mechanism with revision and progressive techniques for autonomous Earth observation satellite

    Science.gov (United States)

    Li, Guoliang; Xing, Lining; Chen, Yingwu

    2017-11-01

    The autonomy of self-scheduling on Earth observation satellites and the increasing scale of satellite networks have attracted much attention from researchers in recent decades. In reality, the limited onboard computational resource presents a challenge for the online scheduling algorithm. This study considered the online scheduling problem for a single autonomous Earth observation satellite within a satellite network environment. It especially addressed the case in which urgent tasks arrive stochastically during the scheduling horizon. We described the problem and proposed a hybrid online scheduling mechanism with revision and progressive techniques to solve it. The mechanism includes two decision policies: a when-to-schedule policy combining periodic scheduling with event-driven rescheduling triggered by a critical cumulative number of urgent tasks, and a how-to-schedule policy combining progressive and revision approaches to accommodate two categories of task, normal tasks and urgent tasks. We then developed two heuristic (re)scheduling algorithms and compared them with other generally used techniques. Computational experiments indicated that the into-scheduling percentage of urgent tasks in the proposed mechanism is much higher than that in a periodic scheduling mechanism, and that the specific performance is highly dependent on certain mechanism-relevant and task-relevant factors. For the online scheduling, the modified weighted shortest imaging time first and dynamic profit system benefit heuristics outperformed the others on total profit and the percentage of successfully scheduled urgent tasks.
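
    The when-to-schedule policy described above can be sketched as a simple trigger: reschedule either when a fixed period has elapsed or as soon as the cumulative number of newly arrived urgent tasks reaches a critical value. The parameter values below are illustrative assumptions, not those of the study.

    def should_reschedule(time_now, last_schedule_time, period, urgent_since_last, critical_number):
        """Trigger a rescheduling either periodically or by the event-driven rule."""
        periodic_due = (time_now - last_schedule_time) >= period
        event_driven = urgent_since_last >= critical_number
        return periodic_due or event_driven

    # e.g. a 600 s scheduling period and a critical cumulative number of 3 urgent tasks
    print(should_reschedule(time_now=250, last_schedule_time=0, period=600,
                            urgent_since_last=3, critical_number=3))   # True (event-driven)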

  19. Improvement and verification of fast reactor safety analysis techniques

    International Nuclear Information System (INIS)

    Jackson, J.F.

    1975-01-01

    An initial analysis of the KIWI-TNT experiment using the VENUS-II disassembly code has been completed. The calculated fission energy release agreed with the experimental value to within about 3 percent. An initial model for analyzing the SNAPTRAN-2 core disassembly experiment was also developed along with an appropriate equation-of-state. The first phase of the VENUS-II/PAD comparison study was completed through the issuing of a preliminary report describing the results. A new technique to calculate a P-V-work curve as a function of the degree of core expansion following a disassembly excursion has been developed. The technique provides results that are consistent with the ANL oxide-fuel equation-of-state in VENUS-II. Evaluation and check-out of this new model are currently in progress
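
    The P-V work curve mentioned above can be illustrated by numerically integrating pressure over volume, W(V) = ∫ P dV, with the trapezoidal rule. The pressure-volume data in the sketch are purely hypothetical and unrelated to the VENUS-II equation of state.

    import numpy as np

    volume = np.linspace(1.0, 3.0, 50)                  # normalised core volume
    pressure = 2.0e9 * (volume[0] / volume) ** 1.4      # hypothetical adiabatic-like expansion (Pa)

    # Cumulative work W(V) = integral of P dV, trapezoidal rule.
    work = np.concatenate(([0.0],
           np.cumsum(0.5 * (pressure[1:] + pressure[:-1]) * np.diff(volume))))
    print(f"work released at full expansion: {work[-1]:.3e} (in P units x V units)")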

  20. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  1. A new analysis technique for microsamples

    International Nuclear Information System (INIS)

    Boyer, R.; Journoux, J.P.; Duval, C.

    1989-01-01

    For many decades, isotopic analysis of uranium or plutonium has been performed by mass spectrometry. The most recent analytical techniques, using the counting method or a plasma torch combined with a mass spectrometer (ICP-MS), have not yet reached a greater degree of precision than the older methods in this field. The two means of ionization for isotopic analysis - electron bombardment of atoms or molecules (gas ion source) and thermal ionization (thermoionic source) - are compared, revealing some inconsistency between the quantity of sample necessary for analysis and the luminosity. In fact, the quantity of sample necessary for the gas source mass spectrometer is 10 to 20 times greater than that for the thermoionization spectrometer, while the sample consumption is between 10^5 and 10^6 times greater. This shows that almost all of the sample is not needed for the measurement itself; it is only required because of the introduction system of the gas spectrometer. The new analysis technique, referred to as "microfluorination", corrects this anomaly and exploits the advantages of the electron bombardment method of ionization

  2. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.
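
    The normalized contrast evolution at the heart of the technique can be sketched as the temperature difference between a defect region and a nearby sound (reference) region, divided by the reference, C(t) = (T_defect(t) - T_ref(t)) / T_ref(t). The cooling curves below are synthetic stand-ins, not flash thermography data.

    import numpy as np

    t = np.linspace(0.05, 5.0, 200)                       # time after the flash, s
    t_ref = 1.0 / np.sqrt(t)                              # idealised cooling of a sound region
    t_defect = t_ref * (1.0 + 0.3 * (1.0 - np.exp(-t / 1.5)))   # heat build-up over a subsurface void

    contrast = (t_defect - t_ref) / t_ref                 # normalized contrast evolution
    print("peak normalized contrast:", round(float(contrast.max()), 3),
          "at t =", round(float(t[contrast.argmax()]), 2), "s")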

  3. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1982-01-01

    This paper describes a fault tree analysis package that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively, allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis, system data can be derived automatically from a generic data bank. As the analysis proceeds, improved estimates of critical failure rates and test and maintenance schedules can be inserted. The technique is applied to the reliability analysis of the recently upgraded HIFAR Containment Isolation System. (author)

  4. Applications of remote sensing techniques to the assessment of dam safety: A progress report

    International Nuclear Information System (INIS)

    Bowlby, J.R.; Grass, J.D.; Singhroy, V.H.

    1990-01-01

    Remote sensing detection and data collection techniques, combined with data from image analyses, have become effective tools that can be used for rapid identification, interpretation and evaluation of the geological and environmental information required in some areas of performance analysis of hydraulic dams. Potential geological hazards to dams such as faults, landslides and liquefaction, regional crustal warping or tilting, stability of foundation materials, flooding and volcanic hazards are applications in which remote sensing may aid analysis. Details are presented of remote sensing techniques, optimal times of data acquisition, interpretation techniques, and applications. Techniques include LANDSAT thematic mapper (TM), SPOT images, thermal infrared scanning, colour infrared photography, normal colour photography, panchromatic black and white, normal colour video, infrared video, airborne multi-spectral electronic imagery, airborne synthetic aperture radar, side scan sonar, and LIDAR (optical radar). 3 tabs

  5. Interferogram analysis using the Abel inversion technique

    International Nuclear Information System (INIS)

    Yusof Munajat; Mohamad Kadim Suaidi

    2000-01-01

    A high-speed, high-resolution optical detection system was used to capture images of acoustic wave propagation. The frozen image, in the form of an interferogram, was analysed to calculate the transient pressure profile of the acoustic waves. The interferogram analysis was based on the fringe shift and the application of the Abel inversion technique. A simpler approach was made possible by using the MathCAD program as the programming tool, which proved powerful enough for the required calculation, plotting and file transfer. (Author)
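
    The inversion step can be illustrated numerically: given a radially symmetric projection F(y) (proportional to the fringe shift), the inverse Abel integral f(r) = -(1/π) ∫ F'(y)/√(y²-r²) dy recovers the radial profile. The simple trapezoidal discretization below is only one of many possible schemes and is not the MathCAD implementation used by the authors.

```python
import numpy as np

def inverse_abel(F, y):
    """Recover the radial profile f(r) from a projection F(y) using the
    inverse Abel integral f(r) = -(1/pi) * integral_r^R F'(y)/sqrt(y^2-r^2) dy."""
    dFdy = np.gradient(F, y)
    f = np.zeros_like(F)
    for i in range(len(y) - 1):
        r = y[i]
        yy = y[i + 1:]                  # start just above r to skip the singularity
        g = dFdy[i + 1:] / np.sqrt(yy**2 - r**2)
        f[i] = -np.sum(0.5 * (g[1:] + g[:-1]) * np.diff(yy)) / np.pi
    return f

# Sanity check against a known pair: a uniform disc of radius a has f(r) = 1
# for r < a and projects to F(y) = 2*sqrt(a^2 - y^2).
a = 1.0
y = np.linspace(0.0, a, 400)
F = 2.0 * np.sqrt(np.clip(a**2 - y**2, 0.0, None))
print(inverse_abel(F, y)[:5])           # values should be close to 1.0
```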

  6. The effects of progressive muscular relaxation and breathing control technique on blood pressure during pregnancy

    Directory of Open Access Journals (Sweden)

    Mahboobeh Aalami

    2016-01-01

    Full Text Available Background: Hypertensive disorders in pregnancy are the main cause of maternal and fetal mortality; however, they have no definite effective treatment. The researchers aimed to study the effects of progressive muscular relaxation and breathing control techniques on blood pressure (BP) during pregnancy. Materials and Methods: This three-group clinical trial was conducted in Mashhad health centers and governmental hospitals. Sixty pregnant women (after 20 weeks of gestational age) with systolic BP ≥ 135 mmHg or diastolic BP ≥ 85 mmHg were assigned to three groups. Progressive muscular relaxation and breathing control exercises were administered to the two experimental groups once a week in person, and on the remaining days by instructions given on a CD, for 4 weeks. BP was checked before and after the interventions. In the control group, BP was measured before and after a 15-minute waiting period without any special intervention. Results: After 4 weeks of intervention, the systolic (from a mean of 131.3 to 117.2, P = 0.001, and from 131.05 to 120.5, P = 0.004, respectively) and diastolic (from a mean of 79.2 to 72.3, P = 0.001, and from 80.1 to 76.5, P = 0.047, respectively) BPs were significantly decreased in the progressive muscular relaxation and breathing control groups, but the changes were not statistically significant in the control group. Conclusions: The interventions were effective in decreasing systolic and diastolic BP to the normal range after 4 weeks in both groups. The effects of both interventions were more obvious on systolic BP compared to diastolic BP.

  7. Recent progress in the melt-process technique of high-temperature superconductors

    CERN Document Server

    Ikuta, H; Mizutani, U

    1999-01-01

    Recently, the performance of high-temperature superconductors prepared by the melt-process technique has been greatly improved. This progress was accomplished by the addition of Ag into the starting materials of the Sm-Ba-Cu-O system, which prevents the formation of severe macro-sized cracks in the finished samples. The magnetic flux density trapped by this material has now reached 9 T at 25 K, which is comparable to the magnetic flux density produced by ordinary superconducting magnets. The amount of magnetic flux density that can be trapped by the sample is limited by the mechanical strength rather than the superconducting properties of the material. The increase in the mechanical strength of the material is important both for further improvement of the material properties and for ensuring reliability of the material in practical applications. (20 refs).

  8. Progress of Space Charge Research on Oil-Paper Insulation Using Pulsed Electroacoustic Techniques

    Directory of Open Access Journals (Sweden)

    Chao Tang

    2016-01-01

    Full Text Available This paper focuses on the space charge behavior in oil-paper insulation systems used in power transformers. It begins with the importance of understanding the space charge behavior in oil-paper insulation systems, followed by an introduction to the pulsed electroacoustic (PEA) technique. After that, the research progress on the space charge behavior of oil-paper insulation over the past twenty years is critically reviewed. Some important aspects such as the environmental conditions and the acoustic wave recovery need to be addressed to acquire more accurate space charge measurement results. Some breakthroughs on the space charge behavior of oil-paper insulation materials by the research team at the University of Southampton are presented. Finally, future work on space charge measurement of oil-paper insulation materials is proposed.

  9. Low energy analysis techniques for CUORE

    Energy Technology Data Exchange (ETDEWEB)

    Alduino, C.; Avignone, F.T.; Chott, N.; Creswick, R.J.; Rosenfeld, C.; Wilson, J. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); Alfonso, K.; Huang, H.Z.; Sakai, M.; Schmidt, J. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Artusa, D.R.; Rusconi, C. [University of South Carolina, Department of Physics and Astronomy, Columbia, SC (United States); INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Azzolini, O.; Camacho, A.; Keppel, G.; Palmieri, V.; Pira, C. [INFN-Laboratori Nazionali di Legnaro, Padua (Italy); Bari, G.; Deninno, M.M. [INFN-Sezione di Bologna, Bologna (Italy); Beeman, J.W. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); Bellini, F.; Cosmelli, C.; Ferroni, F.; Piperno, G. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Benato, G.; Singh, V. [University of California, Department of Physics, Berkeley, CA (United States); Bersani, A.; Caminata, A. [INFN-Sezione di Genova, Genoa (Italy); Biassoni, M.; Brofferio, C.; Capelli, S.; Carniti, P.; Cassina, L.; Chiesa, D.; Clemenza, M.; Faverzani, M.; Fiorini, E.; Gironi, L.; Gotti, C.; Maino, M.; Nastasi, M.; Nucciotti, A.; Pavan, M.; Pozzi, S.; Sisti, M.; Terranova, F.; Zanotti, L. [Universita di Milano-Bicocca, Dipartimento di Fisica, Milan (Italy); INFN-Sezione di Milano Bicocca, Milan (Italy); Branca, A.; Taffarello, L. [INFN-Sezione di Padova, Padua (Italy); Bucci, C.; Cappelli, L.; D' Addabbo, A.; Gorla, P.; Pattavina, L.; Pirro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Canonica, L. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Massachusetts Institute of Technology, Cambridge, MA (United States); Cao, X.G.; Fang, D.Q.; Ma, Y.G.; Wang, H.W.; Zhang, G.Q. [Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai (China); Cardani, L.; Casali, N.; Dafinei, I.; Morganti, S.; Mosteiro, P.J.; Tomei, C.; Vignati, M. [INFN-Sezione di Roma, Rome (Italy); Copello, S.; Di Domizio, S.; Marini, L.; Pallavicini, M. [INFN-Sezione di Genova, Genoa (Italy); Universita di Genova, Dipartimento di Fisica, Genoa (Italy); Cremonesi, O.; Ferri, E.; Giachero, A.; Pessina, G.; Previtali, E. [INFN-Sezione di Milano Bicocca, Milan (Italy); Cushman, J.S.; Davis, C.J.; Heeger, K.M.; Lim, K.E.; Maruyama, R.H. [Yale University, Department of Physics, New Haven, CT (United States); D' Aguanno, D.; Pagliarone, C.E. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita degli Studi di Cassino e del Lazio Meridionale, Dipartimento di Ingegneria Civile e Meccanica, Cassino (Italy); Dell' Oro, S. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); INFN-Gran Sasso Science Institute, L' Aquila (Italy); Di Vacri, M.L.; Santone, D. [INFN-Laboratori Nazionali del Gran Sasso, L' Aquila (Italy); Universita dell' Aquila, Dipartimento di Scienze Fisiche e Chimiche, L' Aquila (Italy); Drobizhev, A.; Hennings-Yeomans, R.; Kolomensky, Yu.G.; Wagaarachchi, S.L. [University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Franceschi, M.A.; Ligi, C.; Napolitano, T. [INFN-Laboratori Nazionali di Frascati, Rome (Italy); Freedman, S.J. 
[University of California, Department of Physics, Berkeley, CA (United States); Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Fujikawa, B.K.; Mei, Y.; Schmidt, B.; Smith, A.R.; Welliver, B. [Lawrence Berkeley National Laboratory, Nuclear Science Division, Berkeley, CA (United States); Giuliani, A.; Novati, V. [Universite Paris-Saclay, CSNSM, Univ. Paris-Sud, CNRS/IN2P3, Orsay (France); Gladstone, L.; Leder, A.; Ouellet, J.L.; Winslow, L.A. [Massachusetts Institute of Technology, Cambridge, MA (United States); Gutierrez, T.D. [California Polytechnic State University, Physics Department, San Luis Obispo, CA (United States); Haller, E.E. [Lawrence Berkeley National Laboratory, Materials Science Division, Berkeley, CA (United States); University of California, Department of Materials Science and Engineering, Berkeley, CA (United States); Han, K. [Shanghai Jiao Tong University, Department of Physics and Astronomy, Shanghai (China); Hansen, E. [University of California, Department of Physics and Astronomy, Los Angeles, CA (United States); Massachusetts Institute of Technology, Cambridge, MA (United States); Kadel, R. [Lawrence Berkeley National Laboratory, Physics Division, Berkeley, CA (United States); Martinez, M. [Sapienza Universita di Roma, Dipartimento di Fisica, Rome (Italy); INFN-Sezione di Roma, Rome (Italy); Universidad de Zaragoza, Laboratorio de Fisica Nuclear y Astroparticulas, Saragossa (Spain); Moggi, N.; Zucchelli, S. [INFN-Sezione di Bologna, Bologna (Italy); Universita di Bologna - Alma Mater Studiorum, Dipartimento di Fisica e Astronomia, Bologna (IT); Nones, C. [CEA/Saclay, Service de Physique des Particules, Gif-sur-Yvette (FR); Norman, E.B.; Wang, B.S. [Lawrence Livermore National Laboratory, Livermore, CA (US); University of California, Department of Nuclear Engineering, Berkeley, CA (US); O' Donnell, T. [Virginia Polytechnic Institute and State University, Center for Neutrino Physics, Blacksburg, VA (US); Sangiorgio, S.; Scielzo, N.D. [Lawrence Livermore National Laboratory, Livermore, CA (US); Wise, T. [Yale University, Department of Physics, New Haven, CT (US); University of Wisconsin, Department of Physics, Madison, WI (US); Woodcraft, A. [University of Edinburgh, SUPA, Institute for Astronomy, Edinburgh (GB); Zimmermann, S. [Lawrence Berkeley National Laboratory, Engineering Division, Berkeley, CA (US)

    2017-12-15

    CUORE is a tonne-scale cryogenic detector operating at the Laboratori Nazionali del Gran Sasso (LNGS) that uses tellurium dioxide bolometers to search for neutrinoless double-beta decay of 130Te. CUORE is also suitable for searching for low energy rare events such as solar axions or WIMP scattering, thanks to its ultra-low background and large target mass. However, conducting such sensitive searches requires lowering the energy threshold to 10 keV. In this paper, we describe the analysis techniques developed for the low energy analysis of CUORE-like detectors, using the data acquired from November 2013 to March 2015 by CUORE-0, a single-tower prototype designed to validate the assembly procedure and new cleaning techniques of CUORE. We explain the energy threshold optimization, continuous monitoring of the trigger efficiency, data and event selection, and energy calibration at low energies in detail. We also present the low energy background spectrum of CUORE-0 below 60 keV. Finally, we report the sensitivity of CUORE to WIMP annual modulation using the CUORE-0 energy threshold and background, as well as an estimate of the uncertainty on the nuclear quenching factor from nuclear recoils in CUORE-0. (orig.)
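
    As a generic illustration of the threshold-efficiency idea (not the CUORE-0 analysis itself), the sketch below fits an error-function turn-on curve to the fraction of injected test pulses that fire a trigger as a function of pulse energy; the data points and fit parameters are invented.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def efficiency(E, E50, sigma):
    """Error-function turn-on: 50% efficiency at E50, width sigma."""
    return 0.5 * (1.0 + erf((E - E50) / (np.sqrt(2.0) * sigma)))

# Hypothetical pulser scan: injected energies (keV) and triggered fractions.
E = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 15.0, 20.0])
eff = np.array([0.01, 0.08, 0.35, 0.70, 0.90, 0.97, 0.995, 1.0])

popt, pcov = curve_fit(efficiency, E, eff, p0=[6.0, 2.0])
print("E50 = %.2f keV, sigma = %.2f keV" % tuple(popt))
# The analysis threshold could then be quoted, e.g., as the energy at which
# the fitted curve reaches some chosen efficiency (say 90%).
```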

  10. Machine monitoring via current signature analysis techniques

    International Nuclear Information System (INIS)

    Smith, S.F.; Castleberry, K.N.; Nowlin, C.H.

    1992-01-01

    A significant need in the effort to provide increased production quality is to provide improved plant equipment monitoring capabilities. Unfortunately, in today's tight economy, even such monitoring instrumentation must be implemented in a recognizably cost-effective manner. By analyzing the electric current drawn by motors, actuators, and other line-powered industrial equipment, significant insights into the operation of the movers, driven equipment, and even the power source can be obtained. The generic term 'current signature analysis' (CSA) has been coined to describe several techniques for extracting useful equipment or process monitoring information from the electrical power feed system. A patented method developed at Oak Ridge National Laboratory is described which recognizes the presence of line-current modulation produced by motors and actuators driving varying loads. The in-situ application of applicable linear demodulation techniques to the analysis of numerous motor-driven systems is also discussed. The use of high-quality amplitude and angle-demodulation circuitry has permitted remote status monitoring of several types of medium and high-power gas compressors in US DOE facilities driven by 3-phase induction motors rated from 100 to 3,500 hp, both with and without intervening speed increasers. Flow characteristics of the compressors, including various forms of abnormal behavior such as surging and rotating stall, produce at the output of the specialized detectors specific time and frequency signatures which can be easily identified for monitoring, control, and fault-prevention purposes. The resultant data are similar in form to information obtained via standard vibration-sensing techniques and can be analyzed using essentially identical methods. In addition, other machinery such as refrigeration compressors, brine pumps, vacuum pumps, fans, and electric motors have been characterized.
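
    A minimal sketch of the amplitude-demodulation idea behind CSA is given below: the 50/60 Hz supply current is treated as a carrier whose envelope carries the load modulation, recovered here with a Hilbert-transform demodulator and examined in the frequency domain. The synthetic signal parameters are assumptions for illustration and are unrelated to the patented ORNL detectors.

```python
import numpy as np
from scipy.signal import hilbert

fs = 5000.0                                   # sample rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
carrier = 60.0                                # supply frequency (Hz)
load_mod = 7.0                                # hypothetical load-related modulation (Hz)

# Synthetic motor current: 60 Hz carrier amplitude-modulated by the load.
i_t = (1.0 + 0.05 * np.sin(2 * np.pi * load_mod * t)) * np.sin(2 * np.pi * carrier * t)

# Amplitude demodulation: envelope of the analytic signal.
envelope = np.abs(hilbert(i_t))

# Spectrum of the envelope reveals the load modulation frequency.
spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(len(envelope), d=1.0 / fs)
print("dominant modulation frequency: %.1f Hz" % freqs[np.argmax(spec)])
```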

  11. Recent progress in safety-related applications of reactor noise analysis

    International Nuclear Information System (INIS)

    Hirota, Jitsuya; Shinohara, Yoshikuni; Saito, Keiichi

    1982-01-01

    Recent progress in safety-related applications of reactor noise analysis is reviewed, mainly referring to various papers presented at the Third Specialists' Meeting on Reactor Noise (SMORN-III) held in Tokyo in 1981. Advances in the application of autoregressive models, coherence analysis and pattern recognition techniques have been significant since SMORN-II in 1977. Development of reactor diagnosis systems based on noise analysis is in progress. Practical experience in safety-related applications to power plants is being accumulated. Advances in quantitative monitoring of the vibration of internal structures in PWRs and in the diagnosis of core stability and control system characteristics in BWRs are notable. Acoustic methods have also been improved to detect sodium boiling in LMFBRs. The Reactor Noise Analysis Benchmark Test performed by Japan in connection with SMORN-III was successful, so that it is possible to proceed to the second stage of the benchmark test. (author)

  12. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  13. Population estimation techniques for routing analysis

    International Nuclear Information System (INIS)

    Sathisan, S.K.; Chagari, A.K.

    1994-01-01

    A number of on-site and off-site factors affect the potential siting of a radioactive materials repository at Yucca Mountain, Nevada. Transportation-related issues such as route selection and design are among them. These involve evaluation of potential risks and impacts, including those related to population. Population characteristics (total population and density) are critical factors in risk assessment, emergency preparedness and response planning, and ultimately in route designation. This paper presents an application of Geographic Information System (GIS) technology to facilitate such analyses. Specifically, techniques to estimate critical population information are presented. A case study using the highway network in Nevada is used to illustrate the analyses. TIGER coverages are used as the basis for population information at the block level. The data are then synthesized at tract, county and state levels of aggregation. Of particular interest are population estimates for various corridor widths along transport corridors, ranging from 0.5 miles to 20 miles in this paper. A sensitivity analysis based on the level of data aggregation is also presented. The results of these analyses indicate that specific characteristics of the area and its population could be used as indicators to aggregate data appropriately for the analysis
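
    The corridor-width aggregation described above can be sketched in a few lines: given block centroids with populations and a route represented as a polyline, sum the population of blocks whose centroids fall within a given half-width of the route. Projections, block geometry and the actual GIS/TIGER processing are omitted; the routine below is a simplified, hypothetical stand-in.

```python
import numpy as np

def point_to_segment_distance(p, a, b):
    """Distance from point p to segment a-b (all 2-D arrays)."""
    ab, ap = b - a, p - a
    denom = float(np.dot(ab, ab))
    t = 0.0 if denom == 0.0 else np.clip(np.dot(ap, ab) / denom, 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def corridor_population(route, blocks, half_width):
    """Sum block populations whose centroids lie within half_width of the route.

    route  : list of (x, y) vertices of the highway polyline
    blocks : list of ((x, y), population) block centroids
    """
    route = [np.asarray(v, dtype=float) for v in route]
    total = 0.0
    for xy, pop in blocks:
        p = np.asarray(xy, dtype=float)
        d = min(point_to_segment_distance(p, a, b)
                for a, b in zip(route[:-1], route[1:]))
        if d <= half_width:
            total += pop
    return total

# Hypothetical route and census blocks (coordinates in miles).
route = [(0, 0), (10, 0), (20, 5)]
blocks = [((1, 0.3), 120), ((5, 3.0), 800), ((12, 1.0), 430), ((19, 10.0), 60)]
for w in (0.5, 2.0, 5.0):
    print("half-width %.1f mi -> population %d" % (w, corridor_population(route, blocks, w)))
```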

  14. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, a 200 millisecond residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle by particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  15. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  16. Radio-analysis. Definitions and techniques

    International Nuclear Information System (INIS)

    Bourrel, F.; Courriere, Ph.

    2003-01-01

    This paper presents the different steps of the radio-labelling of a molecule for two purposes: the radio-immuno-analysis and the auto-radiography: 1 - definitions, radiations and radioprotection: activity of a radioactive source; half-life; radioactivity (alpha-, beta- and gamma radioactivity, internal conversion); radioprotection (irradiation, contamination); 2 - radionuclides used in medical biology and obtention of labelled molecules: gamma emitters ( 125 I, 57 Co); beta emitters; obtention of labelled molecules (general principles, high specific activity and choice of the tracer, molecule to be labelled); main labelling techniques (iodation, tritium); purification of the labelled compound (dialysis, gel-filtering or molecular exclusion chromatography, high performance liquid chromatography); quality estimation of the labelled compound (labelling efficiency calculation, immuno-reactivity conservation, stability and preservation). (J.S.)

  17. [Research progresses of anabolic steroids analysis in doping control].

    Science.gov (United States)

    Long, Yuanyuan; Wang, Dingzhong; Li, Ke'an; Liu, Feng

    2008-07-01

    Anabolic steroids, a class of physiologically active substances, are widely abused to improve athletic performance in human sports. They have been forbidden in sports by the International Olympic Committee since 1983. Since then, many researchers have focused their attention on the establishment of reliable detection methods. In this paper, we review the research progress of different analytical methods for anabolic steroids since 2002, such as gas chromatography-mass spectrometry, liquid chromatography-mass spectrometry, immunoassay, electrochemical analysis and mass spectrometry. The developing prospects of anabolic steroids analysis are also discussed.

  18. Extracellular Vesicle Heterogeneity: Subpopulations, Isolation Techniques, and Diverse Functions in Cancer Progression.

    Science.gov (United States)

    Willms, Eduard; Cabañas, Carlos; Mäger, Imre; Wood, Matthew J A; Vader, Pieter

    2018-01-01

    Cells release membrane enclosed nano-sized vesicles termed extracellular vesicles (EVs) that function as mediators of intercellular communication by transferring biological information between cells. Tumor-derived EVs have emerged as important mediators in cancer development and progression, mainly through transfer of their bioactive content which can include oncoproteins, oncogenes, chemokine receptors, as well as soluble factors, transcripts of proteins and miRNAs involved in angiogenesis or inflammation. This transfer has been shown to influence the metastatic behavior of primary tumors. Moreover, tumor-derived EVs have been shown to influence distant cellular niches, establishing favorable microenvironments that support growth of disseminated cancer cells upon their arrival at these pre-metastatic niches. It is generally accepted that cells release a number of major EV populations with distinct biophysical properties and biological functions. Exosomes, microvesicles, and apoptotic bodies are EV populations most widely studied and characterized. They are discriminated based primarily on their intracellular origin. However, increasing evidence suggests that even within these EV populations various subpopulations may exist. This heterogeneity introduces an extra level of complexity in the study of EV biology and function. For example, EV subpopulations could have unique roles in the intricate biological processes underlying cancer biology. Here, we discuss current knowledge regarding the role of subpopulations of EVs in cancer development and progression and highlight the relevance of EV heterogeneity. The position of tetraspanins and integrins therein will be highlighted. Since addressing EV heterogeneity has become essential for the EV field, current and novel techniques for isolating EV subpopulations will also be discussed. Further dissection of EV heterogeneity will advance our understanding of the critical roles of EVs in health and disease.

  19. Extracellular Vesicle Heterogeneity: Subpopulations, Isolation Techniques, and Diverse Functions in Cancer Progression

    Directory of Open Access Journals (Sweden)

    Eduard Willms

    2018-04-01

    Full Text Available Cells release membrane enclosed nano-sized vesicles termed extracellular vesicles (EVs) that function as mediators of intercellular communication by transferring biological information between cells. Tumor-derived EVs have emerged as important mediators in cancer development and progression, mainly through transfer of their bioactive content which can include oncoproteins, oncogenes, chemokine receptors, as well as soluble factors, transcripts of proteins and miRNAs involved in angiogenesis or inflammation. This transfer has been shown to influence the metastatic behavior of primary tumors. Moreover, tumor-derived EVs have been shown to influence distant cellular niches, establishing favorable microenvironments that support growth of disseminated cancer cells upon their arrival at these pre-metastatic niches. It is generally accepted that cells release a number of major EV populations with distinct biophysical properties and biological functions. Exosomes, microvesicles, and apoptotic bodies are EV populations most widely studied and characterized. They are discriminated based primarily on their intracellular origin. However, increasing evidence suggests that even within these EV populations various subpopulations may exist. This heterogeneity introduces an extra level of complexity in the study of EV biology and function. For example, EV subpopulations could have unique roles in the intricate biological processes underlying cancer biology. Here, we discuss current knowledge regarding the role of subpopulations of EVs in cancer development and progression and highlight the relevance of EV heterogeneity. The position of tetraspanins and integrins therein will be highlighted. Since addressing EV heterogeneity has become essential for the EV field, current and novel techniques for isolating EV subpopulations will also be discussed. Further dissection of EV heterogeneity will advance our understanding of the critical roles of EVs in health and disease.

  20. Assembly homogenization techniques for light water reactor analysis

    International Nuclear Information System (INIS)

    Smith, K.S.

    1986-01-01

    Recent progress in development and application of advanced assembly homogenization methods for light water reactor analysis is reviewed. Practical difficulties arising from conventional flux-weighting approximations are discussed and numerical examples given. The mathematical foundations for homogenization methods are outlined. Two methods, Equivalence Theory and Generalized Equivalence Theory which are theoretically capable of eliminating homogenization error are reviewed. Practical means of obtaining approximate homogenized parameters are presented and numerical examples are used to contrast the two methods. Applications of these techniques to PWR baffle/reflector homogenization and BWR bundle homogenization are discussed. Nodal solutions to realistic reactor problems are compared to fine-mesh PDQ calculations, and the accuracy of the advanced homogenization methods is established. Remaining problem areas are investigated, and directions for future research are suggested. (author)
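
    For orientation, the conventional flux-weighting approximation referred to above can be written as a flux-volume average over the assembly; this is the textbook form, shown only to fix notation, and the equivalence-theory corrections (e.g. discontinuity factors) discussed in the paper go beyond it.

```latex
% Conventional flux-volume weighted homogenized cross section for group g
% over an assembly region V (textbook form, not the equivalence-theory result):
\Sigma_{x,g}^{\mathrm{hom}} \;=\;
\frac{\int_{V} \Sigma_{x,g}(\mathbf{r})\,\phi_{g}(\mathbf{r})\,\mathrm{d}V}
     {\int_{V} \phi_{g}(\mathbf{r})\,\mathrm{d}V}
```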

  1. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with fuzzy logic and neural network approaches to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experimental tests were carried out in a model industrial burner with different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermal acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. A power spectral density (PSD) graph is a good tool for vibration analysis from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to determine flame stability automatically. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
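
    A simplified sketch of the PSD-based part of such an analysis (not the authors' fuzzy/neural system): compute the power spectral density of a mean flame-intensity signal extracted from the image sequence, then use the dominant oscillation frequency and its relative power as crude stability indicators. The frame rate and signal below are synthetic assumptions.

```python
import numpy as np
from scipy.signal import welch

fs = 200.0                                     # assumed camera frame rate (Hz)
t = np.arange(0, 10.0, 1.0 / fs)

# Hypothetical mean flame intensity per frame: steady level plus an
# oscillatory component and noise.
intensity = 100.0 + 4.0 * np.sin(2 * np.pi * 17.0 * t) + np.random.normal(0, 1.0, t.size)

freqs, psd = welch(intensity - intensity.mean(), fs=fs, nperseg=512)
peak = np.argmax(psd)
rel_power = psd[peak] / psd.sum()
print("dominant frequency: %.1f Hz, relative power: %.2f" % (freqs[peak], rel_power))
# Power concentrated at one frequency hints at a strongly oscillating flame;
# a flat spectrum suggests steadier burning.
```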

  2. Analysis of obsidians by PIXE technique

    International Nuclear Information System (INIS)

    Nuncio Q, A.E.

    1998-01-01

    This work presents the characterization of obsidian samples from different mineral sites in Mexico, undertaken by an ion beam analysis technique: PIXE (Proton Induced X-ray Emission). As part of an intensive investigation of obsidian in Mesoamerica by anthropologists from Mexico's National Institute of Anthropology and History, 818 samples were collected from different volcanic sources in central Mexico for the purpose of establishing a data bank of element concentrations of each source. Part of this collection was analyzed by neutron activation analysis and most of the important element concentrations were reported. In this work, a non-destructive IBA technique (PIXE) was used to analyze obsidian samples. The application of this technique was carried out at the laboratories of the ININ Nuclear Center facilities. The samples consisted of obsidians from ten different volcanic sources. These pieces were mounted on a sample holder designed for the purpose of exposing each sample to the proton beam. The PIXE analysis was carried out with an ET Tandem Accelerator at the ININ. X-ray spectrometry was carried out with an external beam facility employing a Si(Li) detector set at 52.5 degrees in relation to the target normal (parallel to the beam direction) and 4.2 cm away from the target center. A filter was set in front of the detector to determine the best attenuation conditions for obtaining most of the elements, taking into account that X-ray spectra from obsidians are dominated by intense major-element lines. Thus, a 28 μm-thick aluminium foil absorber was selected and used to reduce the intensity of the major lines as well as pile-up effects. The mean proton energy was 2.62 MeV, and the beam profile was about 4 mm in diameter. As results, elemental concentrations were obtained for a set of samples from ten different sources: Altotonga (Veracruz), Penjamo (Guanajuato), Otumba (Mexico), Zinapecuaro (Michoacan), Ucareo (Michoacan), Tres Cabezas (Puebla), Sierra Navajas (Hidalgo), Zaragoza

  3. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entrepreneurial phenomena....

  4. The speed of progress in the apparatus handling technique in rhythmic gymnastics

    Directory of Open Access Journals (Sweden)

    Moskovljević Lidija

    2013-01-01

    Full Text Available A specific feature of rhythmic gymnastics, both as a sport and as a teaching device, is its apparatus routines. Considering the lack of research, the aim of our study was to determine the ages at which development in apparatus routine performance is greatest. Development in essential rope, hoop and ball routine performance was examined twice per year over a four-year experimental period. The evaluation was carried out by a three-member RG expert committee on a scale of 1 to 10. A total of twenty-seven competitors, examined at ages seven to fourteen, participated in this study. Based on the data, we can see that the speed of progress in apparatus handling technique was not equal over the observed maturation period. There was no significant development in most of the examined routines between seven and nine years of age. Significant development in this period was achieved only in two rope routines (Vij1 and Vij2R) and one ball routine (Lop2R). From eleven to twelve years of age, significant development was achieved for most of the routines, except basic running with the rope (Vij1) and the hoop routine performed with the weaker arm (Obr2L). From 12 to 13 years of age, development of routine performance was not statistically significant.

  5. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics, and computer vision as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the importance of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  6. Nano-Aptasensing in Mycotoxin Analysis: Recent Updates and Progress

    Directory of Open Access Journals (Sweden)

    Amina Rhouati

    2017-10-01

    Full Text Available Recent years have witnessed an overwhelming integration of nanomaterials in the fabrication of biosensors. Nanomaterials have been incorporated with the objective of achieving better analytical figures of merit in terms of limit of detection, linear range, assay stability, low production cost, etc. Nanomaterials can act as immobilization supports, signal amplifiers, mediators and artificial enzyme labels in the construction of aptasensors. We aim in this work to review the recent progress in mycotoxin analysis. This review emphasizes the function of the different nanomaterials in aptasensor architecture. We subsequently relate their features to the analytical performance of the given aptasensor towards mycotoxin monitoring. In the same context, a critical analysis and the level of success of each nano-aptasensing design will be discussed. Finally, current challenges in nano-aptasensing design for mycotoxin analysis will be highlighted.

  7. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    Science.gov (United States)

    Mirizzi, F.

    2014-02-01

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to assure satisfactory performance and lifetime. In the design of experimental facilities, such as tokamaks, mainly aimed at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed for steadily delivering electrical energy to commercial grids, so that the RAMI aspects will assume absolute relevance from the initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in the paper.

  8. Progress in the RAMI analysis of a conceptual LHCD system for DEMO

    International Nuclear Information System (INIS)

    Mirizzi, F.

    2014-01-01

    Reliability, Availability, Maintainability and Inspectability (RAMI) concepts and techniques, which acquired great importance during the first manned space missions, have been progressively extended to industrial, scientific and consumer equipment to assure satisfactory performance and lifetime. In the design of experimental facilities, such as tokamaks, mainly aimed at demonstrating the validity and feasibility of scientific theories, RAMI analysis has often been left aside. DEMO, the future prototype fusion reactor, will instead be designed for steadily delivering electrical energy to commercial grids, so that the RAMI aspects will assume absolute relevance from the initial design phases. A preliminary RAMI analysis of the LHCD system for the conceptual EU DEMO reactor is given in the paper.
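
    As a generic illustration of the availability side of a RAMI assessment (not the LHCD analysis itself), the sketch below combines hypothetical MTBF/MTTR figures for subsystems in series: each subsystem has steady-state availability MTBF/(MTBF + MTTR), and the chain availability is their product.

```python
# Steady-state availability of a series chain of repairable subsystems,
# A_i = MTBF_i / (MTBF_i + MTTR_i), A_chain = product of A_i.
# The subsystem names and figures below are hypothetical placeholders.

subsystems = {
    "RF source":         (2000.0, 24.0),   # (MTBF hours, MTTR hours)
    "transmission line":  (8000.0, 48.0),
    "launcher":           (5000.0, 120.0),
}

availability = 1.0
for name, (mtbf, mttr) in subsystems.items():
    a = mtbf / (mtbf + mttr)
    availability *= a
    print("%-18s A = %.4f" % (name, a))

print("chain availability = %.4f" % availability)
```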

  9. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to find out the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and their estimation in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and the possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO-2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO-2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness within grid cells, were found; these can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have already been developed recently, but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed by considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the findings on the causes and effects of flooding.
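
    As a small illustration of the kind of hydraulic relation underlying 1D river models such as HEC-RAS, the sketch below evaluates Manning's equation for a rectangular channel; the channel dimensions, roughness coefficient and slope are invented values.

```python
# Manning's equation (SI units): Q = (1/n) * A * R**(2/3) * S**0.5
# evaluated for a hypothetical rectangular channel cross-section.

def manning_discharge(width, depth, n, slope):
    area = width * depth                      # flow area (m^2)
    wetted_perimeter = width + 2.0 * depth    # (m)
    hydraulic_radius = area / wetted_perimeter
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

q = manning_discharge(width=20.0, depth=2.5, n=0.035, slope=0.001)
print("estimated discharge: %.1f m^3/s" % q)
```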

  10. Progressive reduction of the thermal wall system by modal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Mokhtari, A.; Meslem, A.; Bounif, A.; Kadi, L. [Universite des Sciences et de la Technologie, Oran (Algeria)

    1993-12-31

    A reduction method for thermal systems, called 'progress', using modal analysis is presented. It allows, at each time step of the simulation, a synthesis of the information on the system evolution. Consequently, a limited number of descriptive and significant parameters (eigenmodes) can provide extremely useful indications about the dynamic evolution. The method eliminates eigenmodes whose energetic contribution is negligible or quickly damped. Several examples were studied, showing the efficiency of this method in reducing the computing time while maintaining high precision of the predicted dynamic response over the simulation time. (Authors). 4 refs., 4 figs.
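
    A minimal numerical sketch of the modal-reduction idea on a generic linear thermal network (not the authors' 'progress' method): diagonalize the system matrix of dT/dt = A T, keep only the slowest eigenmodes, and reconstruct the response from that truncated modal basis.

```python
import numpy as np

# Generic linear thermal network dT/dt = A T (free response); A is symmetric
# negative definite here, so its eigenvalues are real, negative, decaying modes.
A = np.array([[-2.0,  1.0,  0.0],
              [ 1.0, -3.0,  1.0],
              [ 0.0,  1.0, -1.5]])
T0 = np.array([10.0, 0.0, 5.0])          # initial temperature deviations

eigvals, eigvecs = np.linalg.eigh(A)     # eigh: A is symmetric
order = np.argsort(eigvals)[::-1]        # slowest modes first (least negative)
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

def response(t, n_modes):
    """Free response at time t reconstructed from the first n_modes eigenmodes."""
    coeffs = eigvecs[:, :n_modes].T @ T0
    return eigvecs[:, :n_modes] @ (coeffs * np.exp(eigvals[:n_modes] * t))

t = 1.0
print("full model (3 modes):", np.round(response(t, 3), 4))
print("reduced (2 modes)   :", np.round(response(t, 2), 4))
```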

  11. Quantitative blood flow analysis with digital techniques

    International Nuclear Information System (INIS)

    Forbes, G.

    1984-01-01

    The general principles of digital techniques in quantitating absolute blood flow during arteriography are described. Results are presented for a phantom constructed to correlate digitally calculated absolute flow with direct flow measurements. The clinical use of digital techniques in cerebrovascular angiography is briefly described. (U.K.)

  12. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications-now expanded and revised.This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.This edi

  13. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments, and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of the surfaces of engineering materials.

  14. Application of functional analysis techniques to supervisory systems

    International Nuclear Information System (INIS)

    Lambert, Manuel; Riera, Bernard; Martel, Gregory

    1999-01-01

    The aim of this paper is, firstly, to apply two interesting functional analysis techniques to the design of supervisory systems for complex processes, and secondly, to discuss the strengths and weaknesses of each of them. Two functional analysis techniques, SADT (Structured Analysis and Design Technique) and FAST (Functional Analysis System Technique), have been applied to a process, an example of a Water Supply Process Control (WSPC) system. These techniques allow a functional description of industrial processes. The paper briefly discusses the functions of a supervisory system and some advantages of the application of functional analysis for the design of a 'human-centered' supervisory system. Then the basic principles of the two techniques applied to the WSPC system are presented. Finally, the different results obtained from the two techniques are discussed.

  15. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, a Markov Random Field (MRF) model, watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. The region process is then modeled by the MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is applied. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about possible region boundaries for the next step (MRF), which gives an image containing all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the (MRF) segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
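
    A highly simplified sketch of the edge-strength part of such a pipeline (the MRF, watershed and merging stages are omitted): compute a per-pixel, DIS-style edge-strength measure from local gradients and threshold it into weak and strong edge labels. This is a generic stand-in, not the authors' algorithm.

```python
import numpy as np

def edge_strength_map(image):
    """Per-pixel gradient magnitude as a simple 'difference in strength' measure."""
    gy, gx = np.gradient(image.astype(float))
    return np.hypot(gx, gy)

def classify_edges(strength, weak_thr, strong_thr):
    """Label pixels as 0 (no edge), 1 (weak edge) or 2 (strong edge)."""
    labels = np.zeros(strength.shape, dtype=int)
    labels[strength >= weak_thr] = 1
    labels[strength >= strong_thr] = 2
    return labels

# Synthetic two-region image with a vertical boundary plus noise.
rng = np.random.default_rng(1)
img = np.hstack([np.full((32, 16), 50.0), np.full((32, 16), 120.0)])
img += rng.normal(0, 2.0, img.shape)

dis = edge_strength_map(img)
labels = classify_edges(dis, weak_thr=5.0, strong_thr=20.0)
print("max edge label per column:", labels.max(axis=0))
```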

  16. Defect analysis program for LOFT. Progress report, 1977

    International Nuclear Information System (INIS)

    Doyle, R.E.; Scoonover, T.M.

    1978-03-01

    In order to alleviate problems encountered while performing previous defect analyses on components of the LOFT system, regions of LOFT most likely to require defect analysis have been identified. A review of available documentation has been conducted to identify shapes, sizes, materials, and welding procedures and to compile mechanical property data. The LOFT Reactor Vessel Material Surveillance Program has also been reviewed, and a survey of available literature describing existing techniques for conducting elastic-plastic defect analysis was initiated. While large amounts of mechanical property data were obtained from the available documentation and the literature, much information was not available, especially for weld heat-affected zones. Therefore, a program of mechanical property testing is recommended for FY-78 as well as continued literature search. It is also recommended that fatigue-crack growth-rate data be sought from the literature and that evaluation of the various techniques of elastic-plastic defect analysis be continued. Review of additional regions of the LOFT system in the context of potential defect analysis will be conducted as time permits

  17. MCNP perturbation technique for criticality analysis

    International Nuclear Information System (INIS)

    McKinney, G.W.; Iverson, J.L.

    1995-01-01

    The differential operator perturbation technique has been incorporated into the Monte Carlo N-Particle transport code MCNP and will become a standard feature of future releases. This feature includes first- and/or second-order terms of the Taylor series expansion for response perturbations related to cross-section data (i.e., density, composition, etc.). Criticality analyses can benefit from this technique in that predicted changes in the track-length tally estimator of k-eff may be obtained for multiple perturbations in a single run. A key advantage of this method is that a precise estimate of a small change in response (i.e., < 1%) is easily obtained. This technique can also offer acceptable accuracy, to within a few percent, for up to 20-30% changes in a response.
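
    The first- and second-order terms referred to above are the leading terms of a Taylor expansion of the tally response R in the perturbed cross-section parameter x; written out in generic notation (not MCNP's internal formulation):

```latex
% Response change predicted by the differential-operator technique,
% keeping first- and second-order Taylor terms in the perturbed parameter x:
\Delta R \;\approx\; \frac{\partial R}{\partial x}\,\Delta x
          \;+\; \frac{1}{2}\,\frac{\partial^{2} R}{\partial x^{2}}\,(\Delta x)^{2}
```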

  18. Scenario development and analysis in JNC'S second progress report

    International Nuclear Information System (INIS)

    Umeki, H.; Makino, H.; Miyahara, K.; Naito, M.

    2001-01-01

    Scenario development and analysis is an integral part of the performance assessment in JNC's second progress report, which will be issued by the end of November 1999. A systematic approach has been elaborated to ensure traceability and transparency in the overall context of scenario development and the set-up of calculation cases for assessment of repository performance. In this approach, a hierarchical FEP matrix was designed to flexibly identify FEPs at different levels of detail. Reasoned argument with clearly defined criteria was then applied for screening and grouping of FEPs to define scenarios in the form of influence diagrams. Scenarios and calculation cases were developed based on the expected safety functions of the disposal system and their relationships with potential detrimental/favorable factors and perturbation factors. The process to develop scenarios and calculation cases is recorded and managed in a computer system. (authors)

  19. Data Analysis Techniques for Physical Scientists

    Science.gov (United States)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  20. Surface analysis and techniques in biology

    CERN Document Server

    Smentkowski, Vincent S

    2014-01-01

    This book highlights state-of-the-art surface analytical instrumentation, advanced data analysis tools, and the use of complementary surface analytical instrumentation to perform a complete analysis of biological systems.

  1. Progressive Damage and Failure Analysis of Composite Laminates

    Science.gov (United States)

    Joseph, Ashith P. K.

    Composite materials are widely used in various industries for making structural parts due to their higher strength-to-weight ratio, better fatigue life, corrosion resistance and material property tailorability. To fully exploit the capability of composites, it is required to know the load-carrying capacity of the parts made of them. Unlike metals, composites are orthotropic in nature and fail in a complex manner under various loading conditions, which makes them a hard problem to analyze. The lack of reliable and efficient failure analysis tools for composites has led industries to rely more on coupon- and component-level testing to estimate the design space. Due to the complex failure mechanisms, composite materials require a very large number of coupon-level tests to fully characterize the behavior. This makes the entire testing process very time consuming and costly. The alternative is to use virtual testing tools which can predict the complex failure mechanisms accurately. This reduces the cost to the associated computational expenses only, making significant savings. Some of the most desired features in a virtual testing tool are: (1) Accurate representation of failure mechanisms: the failure progression predicted by the virtual tool must be the same as that observed in experiments, and a tool has to be assessed based on the mechanisms it can capture. (2) Computational efficiency: the greatest advantages of virtual tools are the savings in time and money, and hence computational efficiency is one of the most needed features. (3) Applicability to a wide range of problems: structural parts are subjected to a variety of loading conditions including static, dynamic and fatigue conditions, and a good virtual testing tool should be able to make good predictions for all these different loading conditions. The aim of this PhD thesis is to develop a computational tool which can model the progressive failure of composite laminates under different quasi-static loading conditions. The analysis

  2. Silicon ribbon growth by a capillary action shaping technique. Annual report (Quarterly technical progress report No. 9)

    Energy Technology Data Exchange (ETDEWEB)

    Schwuttke, G.H.; Ciszek, T.F.; Kran, A.

    1977-10-01

    Progress on the technological and economical assessment of ribbon growth of silicon by a capillary action shaping technique is reported. Progress in scale-up of the process from 50 mm to 100 mm ribbon widths is presented, the use of vitreous carbon as a crucible material is analyzed, and preliminary tests of CVD Si/sub 3/N/sub 4/ as a potential die material are reported. Diffusion length measurements by SEM, equipment and procedure for defect display under MOS structure in silicon ribbon for lifetime interpretation, and an assessment of ribbon technology are discussed. (WHK)

  3. Image analysis software for following progression of peripheral neuropathy

    Science.gov (United States)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1 - 4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, which include diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive, so a noninvasive technique like the one proposed here carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested. Tests were carried out with a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.

  4. Analysis and Prediction of Sea Ice Evolution using Koopman Mode Decomposition Techniques

    Science.gov (United States)

    2018-04-30

    Abstract: The program goal is analysis of sea ice dynamical behavior using Koopman Mode Decomposition techniques (monthly progress report).
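
    Koopman mode decomposition is commonly approximated in practice by dynamic mode decomposition (DMD) of snapshot data; the sketch below runs the standard exact-DMD steps on a synthetic data matrix and is included only to indicate the flavor of the analysis, not the report's actual sea-ice computation.

```python
import numpy as np

def dmd(X, Y, rank):
    """Exact DMD: given snapshot matrices X (states at t_k) and Y (states at t_{k+1}),
    return approximate Koopman eigenvalues and modes of the linear map Y ≈ A X."""
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    A_tilde = U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)   # reduced operator
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Y @ Vh.conj().T @ np.diag(1.0 / s) @ W               # exact DMD modes
    return eigvals, modes

# Synthetic "field" with two oscillating spatial components sampled in time.
x = np.linspace(0, 10, 200)
t = np.linspace(0, 4 * np.pi, 100)
data = (np.outer(np.sin(x), np.cos(2 * t)) +
        0.5 * np.outer(np.cos(0.5 * x), np.sin(5 * t)))

X, Y = data[:, :-1], data[:, 1:]
eigvals, modes = dmd(X, Y, rank=4)
print("DMD eigenvalue magnitudes:", np.round(np.abs(eigvals), 3))
# |eigenvalue| near 1 indicates a persistent oscillatory mode; frequencies
# follow from the eigenvalue phases and the snapshot spacing.
```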

  5. Progress Towards Improved Analysis of TES X-ray Data Using Principal Component Analysis

    Science.gov (United States)

    Busch, S. E.; Adams, J. S.; Bandler, S. R.; Chervenak, J. A.; Eckart, M. E.; Finkbeiner, F. M.; Fixsen, D. J.; Kelley, R. L.; Kilbourne, C. A.; Lee, S.-J.; hide

    2015-01-01

    The traditional method of applying a digital optimal filter to measure X-ray pulses from transition-edge sensor (TES) devices does not achieve the best energy resolution when the signals have a highly non-linear response to energy, or the noise is non-stationary during the pulse. We present an implementation of a method to analyze X-ray data from TESs, which is based upon principal component analysis (PCA). Our method separates the X-ray signal pulse into orthogonal components that have the largest variance. We typically recover pulse height, arrival time, differences in pulse shape, and the variation of pulse height with detector temperature. These components can then be combined to form a representation of pulse energy. An added value of this method is that by reporting information on more descriptive parameters (as opposed to a single number representing energy), we generate a much more complete picture of the pulse received. Here we report on progress in developing this technique for future implementation on X-ray telescopes. We used an 55Fe source to characterize Mo/Au TESs. On the same dataset, the PCA method recovers a spectral resolution that is better by a factor of two than achievable with digital optimal filters.
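
    To illustrate the core of the PCA approach in a generic way (synthetic pulses, not the Mo/Au TES data), the sketch below decomposes a set of pulse records into principal components via SVD and projects each record onto the leading components, which typically track pulse height and arrival-time or shape variations.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pulses, n_samples = 500, 1024
t = np.arange(n_samples)

# Synthetic records: exponential pulses with varying amplitude, small
# arrival-time jitter, and white noise (a stand-in for real TES traces).
amps = 1.0 + 0.1 * rng.standard_normal(n_pulses)
shifts = rng.integers(-3, 4, n_pulses)
pulses = np.empty((n_pulses, n_samples))
for k in range(n_pulses):
    tt = np.clip(t - 100 - shifts[k], 0, None)
    pulses[k] = amps[k] * (np.exp(-tt / 200.0) - np.exp(-tt / 20.0))
pulses += 0.01 * rng.standard_normal(pulses.shape)

# PCA via SVD of the mean-subtracted record matrix.
mean_pulse = pulses.mean(axis=0)
U, s, Vh = np.linalg.svd(pulses - mean_pulse, full_matrices=False)
components = Vh[:3]                      # leading orthogonal pulse shapes
scores = (pulses - mean_pulse) @ components.T

print("variance captured by first 3 components: %.1f%%"
      % (100.0 * (s[:3] ** 2).sum() / (s ** 2).sum()))
# scores[:, 0] correlates strongly with pulse height and can be combined with
# the other scores to form an energy estimator.
```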

  6. TU-EF-BRD-02: Indicators and Technique Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlone, M. [Princess Margaret Hospital (Canada)

    2015-06-15

    peer-reviewed research will be used to highlight the main points. Historically, medical physicists have leveraged many areas of applied physics, engineering and biology to improve radiotherapy. Research on quality and safety is another area where physicists can have an impact. The key to further progress is to clearly define what constitutes quality and safety research for those interested in doing such research and the reviewers of that research. Learning Objectives: List several tools of quality and safety with references to peer-reviewed literature. Describe effects of mental workload on performance. Outline research in quality and safety indicators and technique analysis. Understand what quality and safety research needs to be going forward. Understand the links between cooperative group trials and quality and safety research.

  7. TU-EF-BRD-02: Indicators and Technique Analysis

    International Nuclear Information System (INIS)

    Carlone, M.

    2015-01-01

    peer-reviewed research will be used to highlight the main points. Historically, medical physicists have leveraged many areas of applied physics, engineering and biology to improve radiotherapy. Research on quality and safety is another area where physicists can have an impact. The key to further progress is to clearly define what constitutes quality and safety research for those interested in doing such research and the reviewers of that research. Learning Objectives: List several tools of quality and safety with references to peer-reviewed literature. Describe effects of mental workload on performance. Outline research in quality and safety indicators and technique analysis. Understand what quality and safety research needs to be going forward. Understand the links between cooperative group trials and quality and safety research

  8. Survey of immunoassay techniques for biological analysis

    International Nuclear Information System (INIS)

    Burtis, C.A.

    1986-10-01

    Immunoassay is a very specific, sensitive, and widely applicable analytical technique. Recent advances in genetic engineering have led to the development of monoclonal antibodies which further improves the specificity of immunoassays. Originally, radioisotopes were used to label the antigens and antibodies used in immunoassays. However, in the last decade, numerous types of immunoassays have been developed which utilize enzymes and fluorescent dyes as labels. Given the technical, safety, health, and disposal problems associated with using radioisotopes, immunoassays that utilize the enzyme and fluorescent labels are rapidly replacing those using radioisotope labels. These newer techniques are as sensitive, are easily automated, have stable reagents, and do not have a disposal problem. 6 refs., 1 fig., 2 tabs

  9. Hybrid chemical and nondestructive-analysis technique

    International Nuclear Information System (INIS)

    Hsue, S.T.; Marsh, S.F.; Marks, T.

    1982-01-01

    A hybrid chemical/NDA technique has been applied at the Los Alamos National Laboratory to the assay of plutonium in ion-exchange effluents. Typical effluent solutions contain low concentrations of plutonium and high concentrations of americium. A simple trioctylphosphine oxide (TOPO) separation can remove 99.9% of the americium. The organic phase that contains the separated plutonium can be accurately assayed by monitoring the uranium L x-ray intensities

  10. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting ...

  11. EVNTRE, Code System for Event Progression Analysis for PRA

    International Nuclear Information System (INIS)

    2002-01-01

    1 - Description of program or function: EVNTRE is a generalized event tree processor that was developed for use in probabilistic risk analysis of severe accident progressions for nuclear power plants. The general nature of EVNTRE makes it applicable to a wide variety of analyses that involve the investigation of a progression of events which lead to a large number of sets of conditions or scenarios. EVNTRE efficiently processes large, complex event trees. It can assign probabilities to event tree branch points in several different ways, classify pathways or outcomes into user-specified groupings, and sample input distributions of probabilities and parameters. PSTEVNT, a post-processor program used to sort and reclassify the 'binned' data output from EVNTRE and generate summary tables, is included. 2 - Methods: EVNTRE processes event trees that are cast in the form of questions or events, with multiple choice answers for each question. Split fractions (probabilities or frequencies that sum to unity) are either supplied or calculated for the branches of each question in a path-dependent manner. EVNTRE traverses the tree, enumerating the leaves of the tree and calculating their probabilities or frequencies based upon the initial probability or frequency and the split fractions for the branches taken along the corresponding path to an individual leaf. The questions in the event tree are usually grouped to address specific phases of time regimes in the progression of the scenario or severe accident. Grouping or binning of each path through the event tree in terms of a small number of characteristics or attributes is allowed. Boolean expressions of the branches taken are used to select the appropriate values of the characteristics of interest for the given path. Typically, the user specifies a cutoff tolerance for the frequency of a pathway to terminate further exploration. Multiple sets of input to an event tree can be processed by using Monte Carlo sampling to generate
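
    The bookkeeping such an event tree processor performs can be illustrated with a small sketch: each question has branches whose split fractions sum to one, a path frequency is the product of the split fractions taken along the path, and pathways whose frequency falls below a cutoff tolerance are pruned. The toy tree below is purely illustrative and is not an EVNTRE input deck.

        # Toy event tree: one entry per question, each branch a (label, split fraction)
        # pair whose fractions sum to one.
        tree = [
            [("vessel breach", 0.3), ("no vessel breach", 0.7)],
            [("early containment failure", 0.1), ("late failure", 0.2), ("intact", 0.7)],
        ]

        def enumerate_paths(questions, freq=1.0, path=(), cutoff=1e-3):
            if freq < cutoff:                       # prune negligible pathways
                return
            if not questions:
                yield path, freq                    # leaf: frequency is the product of
                return                              # the split fractions taken
            for label, fraction in questions[0]:
                yield from enumerate_paths(questions[1:], freq * fraction,
                                           path + (label,), cutoff)

        for path, freq in enumerate_paths(tree):
            print(f"{freq:.3f}  " + " -> ".join(path))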

  12. Calcium determination in bone by proton activation analysis. Progress report

    International Nuclear Information System (INIS)

    Wilson, R.; Adelstein, S.

    1974-01-01

    The incidence of post-menopausal osteoporosis in almost epidemic proportions makes the early diagnosis and development of effective therapy a matter of considerable concern. Current status of the project is reviewed and new applications of calcium determination by in vivo proton activation analysis are discussed. The proton activation method promises to give precise and reproducible measurements of calcium content for a single vertebra or several vertebrae in vivo. By controlling the number and energy of protons incident on a vertebra and by accurately detecting the number of 2.17 MeV gamma rays emitted, one may determine the 40Ca content. The proton technique offers advantages by directly measuring calcium in a very well-defined region. Ongoing studies on the construction of a lead shield for in vivo counting and on the analysis of the results are also given

  13. Developing new understanding of photoelectrochemical water splitting via in-situ techniques: A review on recent progress

    Directory of Open Access Journals (Sweden)

    Jiajie Cen

    2017-04-01

    Photoelectrochemical (PEC) water splitting is a promising technology for solar hydrogen production to build a sustainable, renewable and clean energy economy. Given the complexity of the PEC water splitting processes, it is important to note that developing in-situ techniques for studying PEC water splitting presents a formidable challenge. This review is aimed at highlighting advantages and disadvantages of each technique, while offering a pathway of potentially combining several techniques to address different aspects of interfacial processes in PEC water splitting. We reviewed recent progress in various techniques and approaches utilized to study PEC water splitting, focusing on spectroscopic and scanning-probe methods. Keywords: In-situ, Water splitting, IMPS, TAS, SPM

  14. Visualization techniques for malware behavior analysis

    Science.gov (United States)

    Grégio, André R. A.; Santos, Rafael D. C.

    2011-06-01

    Malware spread via the Internet is a major security threat, so studying its behavior is important for identifying and classifying it. Using SSDT hooking we can obtain malware behavior by running it in a controlled environment and capturing interactions with the target operating system regarding file, process, registry, network and mutex activities. This generates a chain of events that can be used to compare the sample with other known malware. In this paper we present a simple approach to convert malware behavior into activity graphs and show some visualization techniques that can be used to analyze malware behavior, individually or grouped.

  15. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    Inverse filtering may be carried out in the time domain or in the frequency domain. The application of computers to speech analysis led to important elaborations, including linear prediction as a tool for the estimation of formant trajectories (10); linear prediction in effect determines the filter.

  16. Techniques for Intelligence Analysis of Networks

    National Research Council Canada - National Science Library

    Cares, Jeffrey R

    2005-01-01

    ...) there are significant intelligence analysis manifestations of these properties; and (4) a more satisfying theory of Networked Competition than currently exists for NCW/NCO is emerging from this research...

  17. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the binary format of the original message content in order to obtain the information they contain. Based on the TCP/IP protocol stack specification, the captured packets are then restored to their protocol format and content at each protocol layer, revealing the actual data transferred as well as the application tier.

  18. Recent Progress in Application of Internal Oxidation Technique in Nb3Sn Strands

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Xingchen [Fermilab; Peng, Xuan [Hyper Tech Research Inc.; Sumption, Michael [Ohio State U.; Collings, E. W. [Ohio State U.

    2016-10-13

    The internal oxidation technique can generate ZrO2 nano particles in Nb3Sn strands, which markedly refine the Nb3Sn grain size and boost the high-field critical current density (Jc). This article summarizes recent efforts on implementing this technique in practical Nb3Sn wires and adding Ti as a dopant. It is demonstrated that this technique can be readily incorporated into the present Nb3Sn conductor manufacturing technology. Powder-in-tube (PIT) strands with fine subelements (~25 µm) based on this technique were successfully fabricated, and proper heat treatments for oxygen transfer were explored. Future work for producing strands ready for applications is proposed.

  19. Uncertainty analysis technique for OMEGA Dante measurements

    Science.gov (United States)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-10-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
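
    The Monte Carlo parameter variation described above can be illustrated with a minimal sketch: perturb the measured channel voltages within their one-sigma Gaussian errors, re-run the unfold for each trial set, and take the spread of the resulting fluxes as the error bar. The channel readings, error sizes and the stand-in "unfold" (a simple weighted sum) are hypothetical and do not represent the actual Dante response functions or unfold algorithm.

        import numpy as np

        rng = np.random.default_rng(1)
        measured_volts = np.array([0.82, 1.10, 0.95, 0.40])   # hypothetical channel readings
        sigma = 0.05 * measured_volts                         # combined 1-sigma errors
        weights = np.array([2.1, 1.7, 1.3, 0.9])              # stand-in channel-to-flux weights

        def unfold(volts):
            return float(volts @ weights)                     # placeholder for the real unfold

        fluxes = np.array([unfold(rng.normal(measured_volts, sigma)) for _ in range(1000)])
        print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std(ddof=1):.3f} (arb. units)")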

  20. Uncertainty analysis technique for OMEGA Dante measurements

    International Nuclear Information System (INIS)

    May, M. J.; Widmann, K.; Sorce, C.; Park, H.-S.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel x-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g., hohlraums, etc.) at x-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the x-ray diodes, filters and mirrors, and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  1. Uncertainty Analysis Technique for OMEGA Dante Measurements

    International Nuclear Information System (INIS)

    May, M.J.; Widmann, K.; Sorce, C.; Park, H.; Schneider, M.

    2010-01-01

    The Dante is an 18 channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums, etc.) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetic physics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.

  2. The effect of progressive muscle relaxation techniques on anxiety in Patients with myocardial infarction

    Directory of Open Access Journals (Sweden)

    Mozhgan Jariani

    2011-12-01

    Conclusion: Progressive muscle relaxation can reduce anxiety and the systolic and diastolic blood pressure of patients with myocardial infarction hospitalized in the CCU ward; it can therefore play an effective role as a supplementary non-medicinal, simple and inexpensive treatment for these patients

  3. Diffusion tensor analysis of corpus callosum in progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Ito, Shoichi; Makino, Takahiro; Shirai, Wakako; Hattori, Takamichi [Department of Neurology, Graduate School of Medicine, Chiba University (Japan)

    2008-11-15

    Progressive supranuclear palsy (PSP) is a neurodegenerative disease featuring parkinsonism, supranuclear ophthalmoplegia, dysphagia, and frontal lobe dysfunction. The corpus callosum which consists of many commissure fibers probably reflects cerebral cortical function. Several previous reports showed atrophy or diffusion abnormalities of anterior corpus callosum in PSP patients, but partitioning method used in these studies was based on data obtained in nonhuman primates. In this study, we performed a diffusion tensor analysis using a new partitioning method for the human corpus callosum. Seven consecutive patients with PSP were compared with 29 age-matched patients with Parkinson's Disease (PD) and 19 age-matched healthy control subjects. All subjects underwent diffusion tensor magnetic resonance imaging, and the corpus callosum was partitioned into five areas on the mid-sagittal plane according to a recently established topography of human corpus callosum (CC1-prefrontal area, CC2-premotor and supplementary motor area, CC3-motor area, CC4-sensory area, CC5-parietal, temporal, and occipital area). Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) were measured in each area and differences between groups were analyzed. In the PSP group, FA values were significantly decreased in CC1 and CC2, and ADC values were significantly increased in CC1 and CC2. Receiver operating characteristic analysis showed excellent reliability of FA and ADC analyses of CC1 for differentiating PSP from PD. The anterior corpus callosum corresponding to the prefrontal, premotor, and supplementary motor cortices is affected in PSP patients. This analysis can be an additional test for further confirmation of the diagnosis of PSP.
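
    For reference, the two scalar measures compared in the record above are conventionally derived from the eigenvalues of the diffusion tensor as sketched below; the eigenvalues shown are illustrative values (in units of 10^-3 mm^2/s), not patient data from the study.

        import numpy as np

        def fa_and_adc(eigvals):
            l1, l2, l3 = eigvals
            adc = (l1 + l2 + l3) / 3.0                           # mean diffusivity (ADC)
            fa = np.sqrt(0.5 * ((l1 - l2)**2 + (l2 - l3)**2 + (l3 - l1)**2)
                         / (l1**2 + l2**2 + l3**2))              # fractional anisotropy
            return fa, adc

        fa, adc = fa_and_adc((1.6, 0.5, 0.4))
        print(f"FA = {fa:.2f}, ADC = {adc:.2f} x 10^-3 mm^2/s")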

  4. Diffusion tensor analysis of corpus callosum in progressive supranuclear palsy

    International Nuclear Information System (INIS)

    Ito, Shoichi; Makino, Takahiro; Shirai, Wakako; Hattori, Takamichi

    2008-01-01

    Progressive supranuclear palsy (PSP) is a neurodegenerative disease featuring parkinsonism, supranuclear ophthalmoplegia, dysphagia, and frontal lobe dysfunction. The corpus callosum which consists of many commissure fibers probably reflects cerebral cortical function. Several previous reports showed atrophy or diffusion abnormalities of anterior corpus callosum in PSP patients, but partitioning method used in these studies was based on data obtained in nonhuman primates. In this study, we performed a diffusion tensor analysis using a new partitioning method for the human corpus callosum. Seven consecutive patients with PSP were compared with 29 age-matched patients with Parkinson's Disease (PD) and 19 age-matched healthy control subjects. All subjects underwent diffusion tensor magnetic resonance imaging, and the corpus callosum was partitioned into five areas on the mid-sagittal plane according to a recently established topography of human corpus callosum (CC1-prefrontal area, CC2-premotor and supplementary motor area, CC3-motor area, CC4-sensory area, CC5-parietal, temporal, and occipital area). Fractional anisotropy (FA) and apparent diffusion coefficient (ADC) were measured in each area and differences between groups were analyzed. In the PSP group, FA values were significantly decreased in CC1 and CC2, and ADC values were significantly increased in CC1 and CC2. Receiver operating characteristic analysis showed excellent reliability of FA and ADC analyses of CC1 for differentiating PSP from PD. The anterior corpus callosum corresponding to the prefrontal, premotor, and supplementary motor cortices is affected in PSP patients. This analysis can be an additional test for further confirmation of the diagnosis of PSP

  5. Reliability analysis techniques for the design engineer

    International Nuclear Information System (INIS)

    Corran, E.R.; Witt, H.H.

    1980-01-01

    A fault tree analysis package is described that eliminates most of the housekeeping tasks involved in proceeding from the initial construction of a fault tree to the final stage of presenting a reliability analysis in a safety report. It is suitable for designers with relatively little training in reliability analysis and computer operation. Users can rapidly investigate the reliability implications of various options at the design stage, and evolve a system which meets specified reliability objectives. Later independent review is thus unlikely to reveal major shortcomings necessitating modification and project delays. The package operates interactively allowing the user to concentrate on the creative task of developing the system fault tree, which may be modified and displayed graphically. For preliminary analysis system data can be derived automatically from a generic data bank. As the analysis proceeds improved estimates of critical failure rates and test and maintenance schedules can be inserted. The computations are standard: identification of minimal cut-sets, estimation of reliability parameters, and ranking of the effect of the individual component failure modes and system failure modes on these parameters. The user can vary the fault trees and data on-line, and print selected data for preferred systems in a form suitable for inclusion in safety reports. A case history is given - that of the HIFAR containment isolation system. (author)

  6. Recent progress on developments of tritium safe handling techniques in Japan

    International Nuclear Information System (INIS)

    Watanabe, Kuniaki; Matsuyama, Masao

    1993-01-01

    Vast amounts of tritium will be used for thermonuclear fusion reactors. Without establishing safe handling techniques for large amounts of tritium, undoubtedly the fusion reactors will not be accepted. Japanese activity on tritium related research has considerably developed in the last 10 years. This review paper gives a brief summary of safe handling techniques developed by Japanese research groups. (author)

  7. Nuclear reactor seismic safety analysis techniques

    International Nuclear Information System (INIS)

    Cummings, G.E.; Wells, J.E.; Lewis, L.C.

    1979-04-01

    In order to provide insights into the seismic safety requirements for nuclear power plants, a probabilistic based systems model and computational procedure have been developed. This model and computational procedure will be used to identify where data and modeling uncertainties need to be decreased by studying the effect of these uncertainties on the probability of radioactive release and the probability of failure of various structures, systems, and components. From the estimates of failure and release probabilities and their uncertainties the most sensitive steps in the seismic methodologies can be identified. In addition, the procedure will measure the uncertainty due to random occurrences, e.g. seismic event probabilities, material property variability, etc. The paper discusses the elements of this systems model and computational procedure, the event-tree/fault-tree development, and the statistical techniques to be employed

  8. Analysis of Jordanian Cigarettes Using XRF Techniques

    International Nuclear Information System (INIS)

    Kullab, M.; Ismail, A.; AL-kofahi, M.

    2002-01-01

    Sixteen brands of Jordanian cigarettes were analyzed using X-ray Fluorescence (XRF) techniques. These cigarettes were found to contain the elements: Si, S, Cl, K, Ca, P, Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The major elements with concentrations of more than 1% by weight were Cl, K and Ca. The elements with minor concentrations, between 0.1 and 1% by weight, were Si, S and P. The trace elements with concentrations below 0.1% by weight were Ti, Mn, Fe, Cu, Zn, Br, Rb and Sr. The toxicity of some trace elements, like Br, Rb, and Sr, which are present in some brands of Jordanian cigarettes, is discussed. (Author's) 24 refs., 1 tab., 1 fig

  9. Decentralized control using compositional analysis techniques

    NARCIS (Netherlands)

    Kerber, F.; van der Schaft, A. J.

    2011-01-01

    Decentralized control strategies aim at achieving a global control target by means of distributed local controllers acting on individual subsystems of the overall plant. In this sense, decentralized control is a dual problem to compositional analysis where a global verification task is decomposed

  10. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan

    2016-01-01

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social

  11. Evaluating Dynamic Analysis Techniques for Program Comprehension

    NARCIS (Netherlands)

    Cornelissen, S.G.M.

    2009-01-01

    Program comprehension is an essential part of software development and software maintenance, as software must be sufficiently understood before it can be properly modified. One of the common approaches in getting to understand a program is the study of its execution, also known as dynamic analysis.

  12. 10th Australian conference on nuclear techniques of analysis. Proceedings

    International Nuclear Information System (INIS)

    1998-01-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic material formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume

  13. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic material formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  14. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
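
    As a rough sketch of how the two protocols compared above are commonly computed, the example below scores a few hypothetical survey attributes: gap score analysis takes performance minus importance for each attribute, while importance-performance analysis places each attribute in a quadrant relative to the scale midpoints. The attribute names and ratings are invented for illustration, and the exact conventions of the cited study may differ.

        attributes = {            # attribute: (mean importance, mean performance), 1-5 scale
            "clean restrooms": (4.6, 2.8),
            "trail signage":   (4.1, 4.3),
            "parking":         (3.0, 4.5),
        }
        midpoint = 3.0
        for name, (imp, perf) in attributes.items():
            gap = perf - imp                                   # gap score analysis
            quadrant = ("concentrate here" if imp >= midpoint and perf < midpoint else
                        "keep up the good work" if imp >= midpoint else
                        "possible overkill" if perf >= midpoint else
                        "low priority")                        # importance-performance quadrant
            print(f"{name:16s} gap = {gap:+.1f}  IP quadrant: {quadrant}")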

  15. Nuclear techniques for analysis of environmental samples

    International Nuclear Information System (INIS)

    1986-12-01

    The main purposes of this meeting were to establish the state-of-the-art in the field, to identify new research and development that is required to provide an adequate framework for analysis of environmental samples and to assess needs and possibilities for international cooperation in problem areas. This technical report was prepared on the subject based on the contributions made by the participants. A separate abstract was prepared for each of the 9 papers

  16. Application of activation techniques to biological analysis

    International Nuclear Information System (INIS)

    Bowen, H.J.M.

    1981-01-01

    Applications of activation analysis in the biological sciences are reviewed for the period of 1970 to 1979. The stages and characteristics of activation analysis are described, and its advantages and disadvantages enumerated. Most applications involve activation by thermal neutrons followed by either radiochemical or instrumental determination. Relatively little use has been made of activation by fast neutrons, photons, or charged particles. In vivo analyses are included, but those based on prompt gamma or x-ray emission are not. Major applications include studies of reference materials, and the elemental analysis of plants, marine biota, animal and human tissues, diets, and excreta. Relatively little use of it has been made in biochemistry, microbiology, and entomology, but it has become important in toxicology and environmental science. The elements most often determined are Ag, As, Au, Br, Ca, Cd, Cl, Co, Cr, Cs, Cu, Fe, Hg, I, K, Mn, Mo, Na, Rb, Sb, Sc, Se, and Zn, while few or no determinations of B, Be, Bi, Ga, Gd, Ge, H, In, Ir, Li, Nd, Os, Pd, Pr, Pt, Re, Rh, Ru, Te, Tl, or Y have been made in biological materials

  17. Work in progress. Flashing tomosynthesis: a tomographic technique for quantitative coronary angiography

    International Nuclear Information System (INIS)

    Woelke, H.; Hanrath, P.; Schlueter, M.; Bleifeld, W.; Klotz, E.; Weiss, H.; Waller, D.; von Weltzien, J.

    1982-01-01

    Flashing tomosynthesis, a procedure that consists of a recording step and a reconstruction step, facilitates the tomographic imaging of coronary arteries. In a comparative study 10 postmortem coronary arteriograms were examined with 35-mm cine technique and with flashing tomosynthesis. The degrees of stenosis found with both of these techniques were compared with morphometrically obtained values. A higher correlation coefficient existed for the degrees of stenosis obtained with tomosynthesis and morphometry (r=0.92, p<0.001, SEE=9%) than for those obtained with cine technique and morphometry (r=0.82, p<0.001, SEE=16%). The technique has also been successfully carried out in 5 patients with coronary artery disease

  18. New analytical techniques for cuticle chemical analysis

    International Nuclear Information System (INIS)

    Schulten, H.R.

    1994-01-01

    1) The analytical methodology of pyrolysis-gas chromatography/mass spectrometry (Py-GC/MS) and direct pyrolysis-mass spectrometry (Py-MS) using soft ionization techniques by high electric fields (FL) are briefly described. Recent advances of Py-GC/MS and Py-FIMS for the analyses of complex organic matter such as plant materials, humic substances, dissolved organic matter in water (DOM) and soil organic matter (SOM) in agricultural and forest soils are given to illustrate the potential and limitations of the applied methods. 2) Novel applications of Py-GC/MS and Py-MS in combination with conventional analytical data in an integrated, chemometric approach to investigate the dynamics of plant lipids are reported. This includes multivariate statistical investigations on maturation, senescence, humus genesis, and environmental damages in spruce ecosystems. 3) The focal point is the author's integrated investigations on emission-induced changes of selected conifer plant constituents. Pattern recognition of Py-MS data of desiccated spruce needles provides a method for distinguishing needles damaged in different ways and determining the cause. Spruce needles were collected from both controls and trees treated with sulphur dioxide (acid rain), nitrogen dioxide, and ozone under controlled conditions. Py-MS and chemometric data evaluation are employed to characterize and classify leaves and their epicuticular waxes. Preliminary mass spectrometric evaluations of isolated cuticles of different plants such as spruce, ivy, holly, and philodendron, as well as ivy cuticles treated in vivo with air pollutants such as surfactants and pesticides are given. (orig.)

  19. A Portfolio Analysis Tool for Measuring NASA's Aeronautics Research Progress toward Planned Strategic Outcomes

    Science.gov (United States)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  20. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  1. A technique for human error analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W.

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions

  2. Recent progress on HYSPEC, and its polarization analysis capabilities

    Directory of Open Access Journals (Sweden)

    Winn Barry

    2015-01-01

    HYSPEC is a high-intensity, direct-geometry time-of-flight spectrometer at the Spallation Neutron Source, optimized for measurement of excitations in small single-crystal specimens with optional polarization analysis capabilities. The incident neutron beam is monochromated using a Fermi chopper with short, straight blades, and is then vertically focused by Bragg scattering onto the sample position by either a highly oriented pyrolytic graphite (unpolarized) or a Heusler (polarized) crystal array. Neutrons are detected by a bank of 3He tubes that can be positioned over a wide range of scattering angles about the sample axis. HYSPEC entered the user program in February 2013 for unpolarized experiments, and is already experiencing a vibrant research program. Polarization analysis will be accomplished by using the Heusler crystal array to polarize the incident beam, and either a 3He spin filter or a supermirror wide-angle polarization analyser to analyse the scattered beam. The 3He spin filter employs the spin-exchange optical pumping technique. A 60∘ wide angle 3He cell that matches the detector coverage will be used for polarization analysis. The polarized gas in the post-sample wide angle cell is designed to be periodically and automatically refreshed with an adjustable pressure of polarized gas, optically pumped in a separate cell and then transferred to the wide angle cell. The supermirror analyser has 960 supermirror polarizers distributed over 60∘, and has been characterized at the Swiss Spallation Neutron Source. The current status of the instrument and the development of its polarization analysis capabilities are presented.

  3. [Research progress on the technique and materials for three-dimensional bio-printing].

    Science.gov (United States)

    Yang, Runhuai; Chen, Yueming; Ma, Changwang; Wang, Huiqin; Wang, Shuyue

    2017-04-01

    Three-dimensional (3D) bio-printing is a novel engineering technique by which cells and support materials can be manufactured into a complex 3D structure. Compared with other 3D printing methods, 3D bio-printing must pay more attention to the biocompatibility of the printing process and the materials. To characterize the features of 3D bio-printing, this paper focuses on the current state of research, with particular emphasis on the techniques and materials of bio-printing. To introduce current printing methods, the inkjet method, extrusion method, stereolithography and laser-assisted techniques are described. The printing precision, process, requirements and influence of all the techniques on cell status are compared. For the printing materials, the cross-linking, biocompatibility and applications of common bio-printing materials are reviewed and compared. Most 3D bio-printing studies have so far remained at the experimental stage, so this review could help improve the technique for practical use and contribute to the further development of 3D bio-printing.

  4. Multivariate techniques of analysis for ToF-E recoil spectrometry data

    Energy Technology Data Exchange (ETDEWEB)

    Whitlow, H J; Bouanani, M E; Persson, L; Hult, M; Jonsson, P; Johnston, P N [Lund Institute of Technology, Solvegatan, (Sweden), Department of Nuclear Physics; Andersson, M [Uppsala Univ. (Sweden). Dept. of Organic Chemistry; Ostling, M; Zaring, C [Royal institute of Technology, Electrum, Kista, (Sweden), Department of Electronics; Johnston, P N; Bubb, I F; Walker, B R; Stannard, W B [Royal Melbourne Inst. of Tech., VIC (Australia); Cohen, D D; Dytlewski, N [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    Multivariate statistical methods are being developed by the Australian-Swedish Recoil Spectrometry Collaboration for quantitative analysis of the wealth of information in Time of Flight (ToF) and energy dispersive Recoil Spectrometry. An overview is presented of progress made in the use of multivariate techniques for energy calibration, separation of mass-overlapped signals and simulation of ToF-E data. 6 refs., 5 figs.

  5. Multivariate techniques of analysis for ToF-E recoil spectrometry data

    Energy Technology Data Exchange (ETDEWEB)

    Whitlow, H.J.; Bouanani, M.E.; Persson, L.; Hult, M.; Jonsson, P.; Johnston, P.N. [Lund Institute of Technology, Solvegatan, (Sweden), Department of Nuclear Physics; Andersson, M. [Uppsala Univ. (Sweden). Dept. of Organic Chemistry; Ostling, M.; Zaring, C. [Royal institute of Technology, Electrum, Kista, (Sweden), Department of Electronics; Johnston, P.N.; Bubb, I.F.; Walker, B.R.; Stannard, W.B. [Royal Melbourne Inst. of Tech., VIC (Australia); Cohen, D.D.; Dytlewski, N. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Multivariate statistical methods are being developed by the Australian-Swedish Recoil Spectrometry Collaboration for quantitative analysis of the wealth of information in Time of Flight (ToF) and energy dispersive Recoil Spectrometry. An overview is presented of progress made in the use of multivariate techniques for energy calibration, separation of mass-overlapped signals and simulation of ToF-E data. 6 refs., 5 figs.

  6. Securing safe and informative thoracic CT examinations—Progress of radiation dose reduction techniques

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, Takeshi, E-mail: tkubo@kuhp.kyoto-u.ac.jp [Department of Diagnostic Imaging and Nuclear Medicine, Kyoto University Graduate School of Medicine, 54 Shogoin Kawahara-cho, Sakyo-ku, Kyoto 606-8507 (Japan); Ohno, Yoshiharu [Division of Functional and Diagnostic Imaging Research, Department of Radiology, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017 (Japan); Advanced Biomedical Imaging Research Center, Kobe University Graduate School of Medicine, 7-5-2 Kusunoki-cho, Chuo-ku, Kobe 650-0017 (Japan); Seo, Joon Beom [Department of Radiology, University of Ulsan College of Medicine, Asan Medical Center, 88 Olympic-ro 43-gil, Songpa-gu, Seoul 05505 (Korea, Republic of); Yamashiro, Tsuneo [Department of Radiology, Graduate School of Medical Science, University of the Ryukyus, 207 Uehara, Nishinara, Okinawa 903-0215 (Japan); Kalender, Willi A. [Institute of Medical Physics, Friedrich-Alexander-University Erlangen-Nürnberg, Henkestr. 91, 91052 Erlangen (Germany); Lee, Chang Hyun [Department of Radiology, Seoul National University Hospital, 28 Yeongeon-dong, Jongno-gu, Seoul (Korea, Republic of); Lynch, David A. [Department of Radiology, National Jewish Health, 1400 Jackson St, A330 Denver, Colorado 80206 (United States); Kauczor, Hans-Ulrich [Diagnostic and Interventional Radiology, University Hospital Heidelberg, Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Translational Lung Research Center Heidelberg (TLRC), Member of the German Center for Lung Research (DZL), Im Neuenheimer Feld 400, 69120 Heidelberg (Germany); Hatabu, Hiroto, E-mail: hhatabu@partners.org [Center for Pulmonary Functional Imaging, Department of Radiology, Brigham and Women' s Hospital, 75 Francis Street, Boston, MA 02115 (United States)

    2017-01-15

    Highlights: • Various techniques have led to substantial radiation dose reduction of chest CT. • Automatic modulation of tube current has been shown to reduce radiation dose. • Iterative reconstruction makes significant radiation dose reduction possible. • Processing time is a limitation for full iterative reconstruction, currently. • Validation of diagnostic accuracy is desirable for routine use of low dose protocols. - Abstract: The increase in radiation exposure from CT examinations prompted investigation of various dose-reduction techniques. Significant dose reduction has been achieved, and the radiation exposure of thoracic CT is expected to reach a level equivalent to that of several chest X-ray examinations. With more scanners with advanced dose reduction capability deployed, knowledge of radiation dose reduction methods has become essential to clinical practice as well as academic research. This article reviews the history of dose reduction techniques, ongoing changes brought by newer technologies and areas of further investigation.

  7. Development of chemical analysis techniques: pt. 3

    International Nuclear Information System (INIS)

    Kim, K.J.; Chi, K.Y.; Choi, G.C.

    1981-01-01

    For the purpose of determining trace rare earths, a spectrofluorimetric method has been studied. Except for Ce and Tb, the fluorescence intensities are not high enough to allow satisfactory analysis. Complexing agents such as tungstate and hexafluoroacetylacetone should be employed to increase the fluorescence intensities. As a preliminary experiment for the separation of individual rare earth elements and uranium, the distribution coefficients, denoted % S here, are obtained on Dowex 50 W as a function of HCl concentration by a batch method. These % S data are utilized to obtain elution curves. The % S data showed a minimum at around 4 M HCl. To understand this previously known phenomenon, the adsorption of Cl- on Dowex 50 W is examined as a function of HCl concentration and found to be decreasing while the % S of the rare earths is increasing. It is interpreted that Cl- and rare earth ions are moved into the resin phase separately and that the charge and the charge densities of these ions are responsible for the different % S curves. Dehydration appears to play an important role in the upturn of the % S curves at higher HCl concentrations

  8. Contributions to fuzzy polynomial techniques for stability analysis and control

    OpenAIRE

    Pitarch Pérez, José Luis

    2014-01-01

    The present thesis employs fuzzy-polynomial control techniques in order to improve the stability analysis and control of nonlinear systems. Initially, it reviews the most widespread techniques in the field of Takagi-Sugeno fuzzy systems, as well as the most relevant results on polynomial and fuzzy polynomial systems. The basic framework uses fuzzy polynomial models by Taylor series and sum-of-squares techniques (semidefinite programming) in order to obtain stability guarantees...

  9. Progress in emerging techniques for characterization of immobilized viable whole-cell biocatalysts

    Czech Academy of Sciences Publication Activity Database

    Bučko, M.; Vikartovská, A.; Schenkmayerová, A.; Tkáč, J.; Filip, J.; Chorvát Jr., D.; Neděla, Vilém; Ansorge-Schumacher, M.B.; Gemeiner, P.

    2017-01-01

    Roč. 71, č. 11 (2017), s. 2309-2324 ISSN 0366-6352 Institutional support: RVO:68081731 Keywords : bioelectrocatalysis * imaging techniques * immobilized whole-cell biocatalyst * multienzyme cascade reactions * online kinetics Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering OBOR OECD: Bioprocessing technologies (industrial processes relying on biological agents to drive the process) biocatalysis, fermentation Impact factor: 1.258, year: 2016

  10. No evidence of real progress in treatment of acute pain, 1993–2012: scientometric analysis

    Directory of Open Access Journals (Sweden)

    Correll DJ

    2014-04-01

    Darin J Correll, Kamen V Vlassakov, Igor Kissin, Department of Anesthesiology, Perioperative and Pain Medicine, Brigham and Women's Hospital, Harvard Medical School, Boston, MA, USA. Abstract: Over the past 2 decades, many new techniques and drugs for the treatment of acute pain have achieved widespread use. The main aim of this study was to assess the progress in their implementation using scientometric analysis. The following scientometric indices were used: (1) popularity index, representing the share of articles on a specific technique (or a drug) relative to all articles in the field of acute pain; (2) index of change, representing the degree of growth in publications on a topic compared to the previous period; and (3) index of expectations, representing the ratio of the number of articles on a topic in the top 20 journals relative to the number of articles in all (>5,000) biomedical journals covered by PubMed. Publications on specific topics (ten techniques and 21 drugs) were assessed during four time periods (1993–1997, 1998–2002, 2003–2007, and 2008–2012). In addition, to determine whether the status of routine acute pain management has improved over the past 20 years, we analyzed surveys designed to be representative of the national population that reflected direct responses of patients reporting pain scores. By the 2008–2012 period, the popularity index had reached a substantial level (≥5%) only with techniques or drugs that were introduced 30–50 years ago or more (epidural analgesia, patient-controlled analgesia, nerve blocks, epidural analgesia for labor or delivery, bupivacaine, and acetaminophen). In 2008–2012, promising (although modest) changes of the index of change and index of expectations were found only with dexamethasone. Six national surveys conducted over the past 20 years demonstrated an unacceptably high percentage of patients experiencing moderate or severe pain, with not even a trend toward outcome improvement. Thus
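
    The three indices defined in the abstract can be computed directly from publication counts; the sketch below uses hypothetical counts for a single technique over two successive periods and is not a reproduction of the study's data.

        articles_topic_prev = 120     # articles on the technique, previous 5-year period
        articles_topic_now = 180      # articles on the technique, current period
        articles_field_now = 5200     # all acute-pain articles, current period
        articles_topic_top20 = 9      # of the current articles, those in the top-20 journals

        popularity_index = 100.0 * articles_topic_now / articles_field_now
        index_of_change = 100.0 * (articles_topic_now - articles_topic_prev) / articles_topic_prev
        index_of_expectations = 100.0 * articles_topic_top20 / articles_topic_now

        print(f"popularity index: {popularity_index:.1f}%  "
              f"index of change: {index_of_change:+.0f}%  "
              f"index of expectations: {index_of_expectations:.1f}%")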

  11. An operator expansion technique for path integral analysis

    International Nuclear Information System (INIS)

    Tsvetkov, I.V.

    1995-01-01

    A new method of path integral analysis in the framework of a power series technique is presented. The method is based on the operator expansion of an exponential. A regular procedure to calculate the correction terms is found. (orig.)

  12. Search for the top quark using multivariate analysis techniques

    International Nuclear Information System (INIS)

    Bhat, P.C.

    1994-08-01

    The D0 collaboration is developing top search strategies using multivariate analysis techniques. We report here on applications of the H-matrix method to the eμ channel and neural networks to the e+jets channel

  13. Reduced Incidence of Slowly Progressive Heymann Nephritis in Rats Immunized With a Modified Vaccination Technique

    Directory of Open Access Journals (Sweden)

    Arpad Z. Barabas

    2006-01-01

    A slowly progressive Heymann nephritis (SPHN) was induced in three groups of rats by weekly injections of a chemically modified renal tubular antigen in an aqueous medium. A control group of rats received the chemically unmodified version of the antigen in an aqueous solution. One group of SPHN rats was pre- and post-treated with weekly injections of MIC (immune complexes [ICs] containing sonicated ultracentrifuged [u/c] rat kidney fraction 3 [rKF3] antigen and rarKF3 IgM antibodies specific against the antigen, at slight antigen excess). One group of SPHN rats was post-treated with MIC 3 weeks after the induction of the disease and one group of SPHN animals received no treatment. The control group of rats received pre- and post-treatment with sonicated u/c rKF3.

  14. Research progress on the brewing techniques of new-type rice wine.

    Science.gov (United States)

    Jiao, Aiquan; Xu, Xueming; Jin, Zhengyu

    2017-01-15

    As a traditional alcoholic beverage, Chinese rice wine (CRW) with high nutritional value and unique flavor has been popular in China for thousands of years. Although traditional production methods had been used without change for centuries, numerous technological innovations in the last decades have greatly impacted on the CRW industry. However, reviews related to the technology research progress in this field are relatively few. This article aimed at providing a brief summary of the recent developments in the new brewing technologies for making CRW. Based on the comparison between the conventional methods and the innovative technologies of CRW brewing, three principal aspects were summarized and sorted, including the innovation of raw material pretreatment, the optimization of fermentation and the reform of sterilization technology. Furthermore, by comparing the advantages and disadvantages of these methods, various issues are addressed related to the prospect of the CRW industry. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Application of accident progression event tree technology to the Savannah River Site Defense Waste Processing Facility SAR analysis

    International Nuclear Information System (INIS)

    Brandyberry, M.D.; Baker, W.H.; Wittman, R.S.; Amos, C.N.

    1993-01-01

    The Accident Analysis in the Safety Analysis Report (SAR) for the Savannah River Site (SRS) Defense Waste Processing Facility (DWPF) has recently undergone an upgrade. Non-reactor SARs at SRS (and other Department of Energy (DOE) sites) use probabilistic techniques to assess the frequency of accidents at their facilities. This paper describes the application of an extension of the Accident Progression Event Tree (APET) approach to accidents at the SRS DWPF. The APET technique allows an integrated model of the facility risk to be developed, where previous probabilistic accident analyses have been limited to the quantification of the frequency and consequences of individual accident scenarios treated independently. Use of an APET allows a more structured approach, incorporating both the treatment of initiators that are common to more than one accident, and of accident progression at the facility

  16. Laser desorption ionization mass spectrometry: Recent progress in matrix-free and label-assisted techniques.

    Science.gov (United States)

    Mandal, Arundhoti; Singha, Monisha; Addy, Partha Sarathi; Basak, Amit

    2017-10-13

    The MALDI-based mass spectrometry, over the last three decades, has become an important analytical tool. It is a gentle ionization technique, usually applicable to detect and characterize analytes with high molecular weights like proteins and other macromolecules. The earlier difficulty of detection of analytes with low molecular weights like small organic molecules and metal ion complexes with this technique arose due to the cluster of peaks in the low molecular weight region generated from the matrix. To detect such molecules and metal ion complexes, a four-prong strategy has been developed. These include use of alternate matrix materials, employment of new surface materials that require no matrix, use of metabolites that directly absorb the laser light, and the laser-absorbing label-assisted LDI-MS (popularly known as LALDI-MS). This review will highlight the developments with all these strategies with a special emphasis on LALDI-MS. © 2017 Wiley Periodicals, Inc.

  17. Neutron activation analysis: an emerging technique for conservation/preservation

    International Nuclear Information System (INIS)

    Sayre, E.V.

    1976-01-01

    The diverse applications of neutron activation in analysis, preservation, and documentation of art works and artifacts are described with illustrations for each application. The uses of this technique to solve problems of attribution and authentication, to reveal the inner structure and composition of art objects, and, in some instances to recreate details of the objects are described. A brief discussion of the theory and techniques of neutron activation analysis is also included

  18. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and then to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination would be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plan. By this proposed method, the analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness and complexity
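
    One way the indexes listed above might be used in practice is a simple weighted scoring of candidate SSA combinations; the sketch below is purely hypothetical (the scores, weights and candidate combinations are invented) and is not the scoring scheme of the cited work.

        # Hypothetical 1-5 scores per criterion; complexity and cost are inverted so
        # that a higher score is always better.
        weights = [0.2, 0.2, 0.15, 0.15, 0.1, 0.1, 0.1]    # dyn., compl., achiev., detail,
                                                           # S/N, complexity(inv.), cost(inv.)
        combinations = {
            "PHA+FMEA+FTA+Markov": [2, 4, 2, 4, 3, 2, 3],
            "DFM":                 [4, 3, 4, 4, 4, 3, 3],
            "Simulation-based":    [5, 3, 4, 4, 4, 2, 2],
        }

        ranked = sorted(combinations.items(),
                        key=lambda kv: sum(w * s for w, s in zip(weights, kv[1])),
                        reverse=True)
        for name, scores in ranked:
            total = sum(w * s for w, s in zip(weights, scores))
            print(f"{name:22s} weighted score = {total:.2f}")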

  19. Total RNA Sequencing Analysis of DCIS Progressing to Invasive Breast Cancer

    Science.gov (United States)

    2017-09-01

    AWARD NUMBER: W81XWH-14-1-0080. TITLE: Total RNA Sequencing Analysis of DCIS Progressing to Invasive Breast Cancer. The award supports an ...institutional, NIH-funded study of genetic and epigenetic alterations of pre-invasive DCIS that did or did not progress to invasive breast cancer.

  20. Research on digital multi-channel pulse height analysis techniques

    International Nuclear Information System (INIS)

    Xiao Wuyun; Wei Yixiang; Ai Xianyun; Ao Qi

    2005-01-01

    Multi-channel pulse height analysis techniques are developing in the direction of digitalization. Based on digital signal processing techniques, digital multi-channel analyzers are characterized by powerful pulse processing ability, high throughput, improved stability and flexibility. This paper analyzes key techniques of digital nuclear pulse processing. With MATLAB software, main algorithms are simulated, such as trapezoidal shaping, digital baseline estimation, digital pole-zero/zero-pole compensation, poles and zeros identification. The preliminary general scheme of digital MCA is discussed, as well as some other important techniques about its engineering design. All these lay the foundation of developing homemade digital nuclear spectrometers. (authors)
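
    As a rough illustration of the trapezoidal shaping mentioned above (a sketch, not the authors' MATLAB code), the following Python fragment applies a Jordanov-style recursive trapezoidal shaper to a synthetic exponential pulse; the rise length k, flat-top parameter l, and pole-zero constant M ~ tau/dt are illustrative values.

        import numpy as np

        def trapezoidal_filter(v, k, l, M):
            # Jordanov-style recursive trapezoidal shaper: delayed differences,
            # accumulation, and pole-zero weighting for an exponential pulse.
            n = len(v)
            d = np.zeros(n)
            for i in range(n):
                d[i] = (v[i]
                        - (v[i - k] if i >= k else 0.0)
                        - (v[i - l] if i >= l else 0.0)
                        + (v[i - k - l] if i >= k + l else 0.0))
            p = np.cumsum(d)             # first accumulator
            return np.cumsum(p + M * d)  # trapezoid output (unnormalised)

        dt, tau = 0.01, 5.0              # sample spacing and assumed decay constant
        t = np.arange(0.0, 50.0, dt)
        pulse = np.where(t > 10.0, np.exp(-(t - 10.0) / tau), 0.0)
        shaped = trapezoidal_filter(pulse, k=200, l=300, M=tau / dt)
        print(round(float(shaped.max()), 1))  # flat-top height tracks pulse amplitude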

  1. Visual and statistical analysis of {sup 18}F-FDG PET in primary progressive aphasia

    Energy Technology Data Exchange (ETDEWEB)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge [Hospital Clinico San Carlos, Department of Neurology, Madrid (Spain); Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis [San Carlos Health Research Institute (IdISSC) Complutense University of Madrid, Department of Nuclear Medicine, Hospital Clinico San Carlos, Madrid (Spain)

    2015-05-01

    Diagnosing progressive primary aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA was high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. In cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)
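
    Fleiss' kappa, the inter-rater concordance statistic quoted above, can be computed directly from a table of rating counts. The sketch below uses made-up ratings (five scans, five raters, three diagnostic categories), not the study's data.

        import numpy as np

        def fleiss_kappa(counts):
            # counts[i, j] = number of raters assigning subject i to category j,
            # with the same number of raters for every subject
            counts = np.asarray(counts, dtype=float)
            N = counts.shape[0]
            n = counts[0].sum()                     # raters per subject
            p_j = counts.sum(axis=0) / (N * n)      # overall category proportions
            P_i = (np.sum(counts**2, axis=1) - n) / (n * (n - 1))
            P_bar, P_e = P_i.mean(), np.sum(p_j**2)
            return (P_bar - P_e) / (1 - P_e)

        # made-up ratings: 5 scans, 5 raters, 3 categories
        ratings = [[5, 0, 0],
                   [4, 1, 0],
                   [0, 5, 0],
                   [1, 1, 3],
                   [0, 2, 3]]
        print(round(fleiss_kappa(ratings), 3))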

  2. Visual and statistical analysis of 18F-FDG PET in primary progressive aphasia

    International Nuclear Information System (INIS)

    Matias-Guiu, Jordi A.; Moreno-Ramos, Teresa; Garcia-Ramos, Rocio; Fernandez-Matarrubia, Marta; Oreja-Guevara, Celia; Matias-Guiu, Jorge; Cabrera-Martin, Maria Nieves; Perez-Castejon, Maria Jesus; Rodriguez-Rey, Cristina; Ortega-Candil, Aida; Carreras, Jose Luis

    2015-01-01

    Diagnosing progressive primary aphasia (PPA) and its variants is of great clinical importance, and fluorodeoxyglucose (FDG) positron emission tomography (PET) may be a useful diagnostic technique. The purpose of this study was to evaluate interobserver variability in the interpretation of FDG PET images in PPA as well as the diagnostic sensitivity and specificity of the technique. We also aimed to compare visual and statistical analyses of these images. There were 10 raters who analysed 44 FDG PET scans from 33 PPA patients and 11 controls. Five raters analysed the images visually, while the other five used maps created using Statistical Parametric Mapping software. Two spatial normalization procedures were performed: global mean normalization and cerebellar normalization. Clinical diagnosis was considered the gold standard. Inter-rater concordance was moderate for visual analysis (Fleiss' kappa 0.568) and substantial for statistical analysis (kappa 0.756-0.881). Agreement was good for all three variants of PPA except for the nonfluent/agrammatic variant studied with visual analysis. The sensitivity and specificity of each rater's diagnosis of PPA was high, averaging 87.8 and 89.9 % for visual analysis and 96.9 and 90.9 % for statistical analysis using global mean normalization, respectively. In cerebellar normalization, sensitivity was 88.9 % and specificity 100 %. FDG PET demonstrated high diagnostic accuracy for the diagnosis of PPA and its variants. Inter-rater concordance was higher for statistical analysis, especially for the nonfluent/agrammatic variant. These data support the use of FDG PET to evaluate patients with PPA and show that statistical analysis methods are particularly useful for identifying the nonfluent/agrammatic variant of PPA. (orig.)

  3. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  4. Development of environmental sample analysis techniques for safeguards

    International Nuclear Information System (INIS)

    Magara, Masaaki; Hanzawa, Yukiko; Esaka, Fumitaka

    1999-01-01

    JAERI has been developing environmental sample analysis techniques for safeguards and preparing a clean chemistry laboratory with clean rooms. Methods to be developed are a bulk analysis and a particle analysis. In the bulk analysis, Inductively-Coupled Plasma Mass Spectrometer or Thermal Ionization Mass Spectrometer are used to measure nuclear materials after chemical treatment of sample. In the particle analysis, Electron Probe Micro Analyzer and Secondary Ion Mass Spectrometer are used for elemental analysis and isotopic analysis, respectively. The design of the clean chemistry laboratory has been carried out and construction will be completed by the end of March, 2001. (author)

  5. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
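
    The double random phase scheme itself is compact: one random phase mask multiplies the image in the input plane and a second multiplies its Fourier transform. The NumPy sketch below, on an arbitrary random test image, illustrates why decryption requires the correct Fourier-plane key; it is a toy version of the scheme, not the authors' key-space analysis code.

        import numpy as np

        rng = np.random.default_rng(0)

        def drpe_encrypt(img, phase1, phase2):
            # one random phase mask in the input plane, one in the Fourier plane
            return np.fft.ifft2(np.fft.fft2(img * np.exp(2j * np.pi * phase1))
                                * np.exp(2j * np.pi * phase2))

        def drpe_decrypt(cipher, phase2):
            # undo the Fourier-plane mask; for a real, non-negative image the
            # modulus removes the remaining input-plane phase
            return np.abs(np.fft.ifft2(np.fft.fft2(cipher)
                                       * np.exp(-2j * np.pi * phase2)))

        img = rng.random((64, 64))                       # stand-in plaintext image
        key1, key2 = rng.random((64, 64)), rng.random((64, 64))
        cipher = drpe_encrypt(img, key1, key2)

        print(np.allclose(drpe_decrypt(cipher, key2), img))             # correct key
        wrong = rng.random((64, 64))
        print(float(np.abs(drpe_decrypt(cipher, wrong) - img).mean()))  # large error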

  6. Nuclear techniques for bulk and surface analysis of materials

    International Nuclear Information System (INIS)

    D'Agostino, M.D.; Kamykowski, E.A.; Kuehne, F.J.; Padawer, G.M.; Schneid, E.J.; Schulte, R.L.; Stauber, M.C.; Swanson, F.R.

    1978-01-01

    A review is presented summarizing several nondestructive bulk and surface analysis nuclear techniques developed in the Grumman Research Laboratories. Bulk analysis techniques include 14-MeV-neutron activation analysis and accelerator-based neutron radiography. The surface analysis techniques include resonant and non-resonant nuclear microprobes for the depth profile analysis of light elements (H, He, Li, Be, C, N, O and F) in the surface of materials. Emphasis is placed on the description and discussion of the unique nuclear microprobe analytical capacibilities of immediate importance to a number of current problems facing materials specialists. The resolution and contrast of neutron radiography was illustrated with an operating heat pipe system. The figure shows that the neutron radiograph has a resolution of better than 0.04 cm with sufficient contrast to indicate Freon 21 on the inner capillaries of the heat pipe and pooling of the liquid at the bottom. (T.G.)

  7. The expanding techniques of progress: Agricultural biotechnology and UN-REDD+

    NARCIS (Netherlands)

    Dunlap, A.D.

    2015-01-01

    This paper provides a comparative analysis of agricultural biotechnology and the United Nations program for reduced emissions from deforestation and forest degradation (REDD). Despite the existing differences between the technical manipulation of biological systems and a conservation program aimed

  8. Development and application of the electrochemical etching technique. Annual progress report

    International Nuclear Information System (INIS)

    1979-08-01

    This report documents advances in the development and application of the electrochemical etching technique for thermal and epithermal neutron dosimetry as well as track geometry determinations. The bulk and track etching rates were studied by evaluating the track geometry during electrochemical etching. The foil surface removed versus etching time for two different etchants at 1000 V, 2 kHz, and 22°C were studied. Results indicated that the bulk etching rates were constant for the two etchants, i.e. 45% KOH and 45% KOH mixed with an equal volume of C2H5OH, and were equal to 0.20 ± 0.14 μm/hr and 2.7 ± 0.27 μm/hr from each side of the foil. The track etching rate (as contrasted with the bulk etching rate) can be determined from the microscope focus at various depths. The increase of track depth values as a function of etching time for the two etchants is plotted. The track cone angles were determined and found to be much larger for electrochemically etched polycarbonate foils than for most plastics etched with passive chemical techniques

  9. Progress of new label-free techniques for biosensors: a review.

    Science.gov (United States)

    Sang, Shengbo; Wang, Yajun; Feng, Qiliang; Wei, Ye; Ji, Jianlong; Zhang, Wendong

    2016-01-01

    The detection techniques used in biosensors can be broadly classified into label-based and label-free. Label-based detection relies on the specific properties of labels for detecting a particular target. In contrast, label-free detection is suitable for the target molecules that are not labeled or the screening of analytes which are not easy to tag. Also, more types of label-free biosensors have emerged with developments in biotechnology. The latest developed techniques in label-free biosensors, such as field-effect transistors-based biosensors including carbon nanotube field-effect transistor biosensors, graphene field-effect transistor biosensors and silicon nanowire field-effect transistor biosensors, magnetoelastic biosensors, optical-based biosensors, surface stress-based biosensors and other type of biosensors based on the nanotechnology are discussed. The sensing principles, configurations, sensing performance, applications, advantages and restriction of different label-free based biosensors are considered and discussed in this review. Most concepts included in this survey could certainly be applied to the development of this kind of biosensor in the future.

  10. Assessment of Intralaminar Progressive Damage and Failure Analysis Using an Efficient Evaluation Framework

    Science.gov (United States)

    Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl

    2017-01-01

    Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further extended when attempting to integrate new fiber-reinforced composite materials, owing to the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry since material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework which uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.

  11. Recent Progress of Fabrication of Cell Scaffold by Electrospinning Technique for Articular Cartilage Tissue Engineering

    Directory of Open Access Journals (Sweden)

    Yingge Zhou

    2018-01-01

    Full Text Available As a versatile nanofiber manufacturing technique, electrospinning has been widely employed for the fabrication of tissue engineering scaffolds. Since the structure of natural extracellular matrices varies substantially in different tissues, there has been growing awareness of the fact that the hierarchical 3D structure of scaffolds may affect intercellular interactions, material transportation, fluid flow, environmental stimulation, and so forth. Physical blending of the synthetic and natural polymers to form composite materials better mimics the composition and mechanical properties of natural tissues. Scaffolds with element gradient, such as growth factor gradient, have demonstrated good potentials to promote heterogeneous cell growth and differentiation. Compared to 2D scaffolds with limited thicknesses, 3D scaffolds have superior cell differentiation and development rate. The objective of this review paper is to review and discuss the recent trends of electrospinning strategies for cartilage tissue engineering, particularly the biomimetic, gradient, and 3D scaffolds, along with future prospects of potential clinical applications.

  12. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...
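
    As a concrete illustration of the quantitative core of such a survey, the sketch below pools made-up estimates of a single parameter with inverse-variance (fixed-effect) weights; the funnel mentioned above plots each estimate against its precision.

        import numpy as np

        # made-up estimates of the same parameter and their standard errors
        est = np.array([0.42, 0.35, 0.58, 0.29, 0.47])
        se = np.array([0.10, 0.15, 0.20, 0.08, 0.12])

        w = 1.0 / se**2                      # inverse-variance (fixed-effect) weights
        pooled = np.sum(w * est) / np.sum(w)
        pooled_se = np.sqrt(1.0 / np.sum(w))
        print(f"pooled estimate {pooled:.3f} +/- {pooled_se:.3f}")

        # funnel-plot coordinates: estimate on the x-axis, precision 1/SE on the y-axis
        funnel = list(zip(est, 1.0 / se))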

  13. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Proposal analysis techniques. 15.404-1 Section 15.404-1 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION... assistance of other experts to ensure that an appropriate analysis is performed. (6) Recommendations or...

  14. NMR and modelling techniques in structural and conformation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, R J [Liverpool Univ. (United Kingdom)

    1994-12-31

    The use of Lanthanide Induced Shifts (L.I.S.) and modelling techniques in conformational analysis is presented. The use of Co{sup III} porphyrins as shift reagents is discussed, with examples of their use in the conformational analysis of some heterocyclic amines. (author) 13 refs., 9 figs.

  15. Application of nuclear analysis techniques in ancient chinese porcelain

    International Nuclear Information System (INIS)

    Feng Songlin; Xu Qing; Feng Xiangqian; Lei Yong; Cheng Lin; Wang Yanqing

    2005-01-01

    Ancient ceramics were fired from porcelain clay and therefore carry information on provenance and age. Analyzing ancient ceramics with modern analytical methods provides the scientific foundation for the study of Chinese porcelain. The functions and applications of nuclear analysis techniques in this field are discussed in light of their properties. (authors)

  16. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  17. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is one of the most important management techniques for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can benefit the organization by adapting its strengths to opportunities, minimizing risks and eliminating weaknesses.

  18. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more demanding the technique of its execution. The fouetté at 720° is one of the most difficult types of fouetté, and its execution depends on a highly refined rotation technique. Performing this element requires not only good physical condition but also mastery of correct technique by the dancer. On the basis of the corresponding kinematic theory, this study provides a qualitative analysis and quantitative assessment of fouettés at 720° performed by leading Chinese dancers, using stereoscopic imaging and theoretical analysis.

  19. Thermodynamic Activity-Based Progress Curve Analysis in Enzyme Kinetics.

    Science.gov (United States)

    Pleiss, Jürgen

    2018-03-01

    Macrokinetic Michaelis-Menten models based on thermodynamic activity provide insights into enzyme kinetics because they separate substrate-enzyme from substrate-solvent interactions. Kinetic parameters are estimated from experimental progress curves of enzyme-catalyzed reactions. Three pitfalls are discussed: deviations between thermodynamic and concentration-based models, product effects on the substrate activity coefficient, and product inhibition. Copyright © 2017 Elsevier Ltd. All rights reserved.
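
    A minimal sketch of progress-curve fitting in the conventional concentration-based form (the activity-based variant discussed above would replace concentrations with activities): kinetic parameters are estimated by fitting the integrated rate law to a simulated noisy substrate-depletion curve. The initial substrate concentration and noise level are assumed values.

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.optimize import curve_fit

        def progress_curve(t, vmax, km, s0=10.0):
            # substrate depletion from the rate law dS/dt = -vmax*S/(km+S)
            sol = solve_ivp(lambda _, s: [-vmax * s[0] / (km + s[0])],
                            (0.0, t[-1]), [s0], t_eval=t, rtol=1e-8)
            return sol.y[0]

        t = np.linspace(0.0, 30.0, 40)
        rng = np.random.default_rng(1)
        data = progress_curve(t, 1.2, 3.0) + rng.normal(0.0, 0.05, t.size)

        popt, _ = curve_fit(progress_curve, t, data, p0=[1.0, 1.0],
                            bounds=(0.0, np.inf))
        print("vmax, km estimates:", np.round(popt, 2))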

  20. Analysis of interventional therapy for progressing stage gastric cancer

    International Nuclear Information System (INIS)

    Zhu Mingde; Zhang Zijing; Ji Hongsheng; Ge Chenlin; Hao Gang; Wei Kongming; Yuan Yuhou; Zhao Xiuping

    2008-01-01

    Objective: To investigate interventional therapy and its curative effect in progressing stage gastric cancer. Methods: Two hundred and twelve patients with progressing stage gastric cancer were treated with arterial perfusion and arterial embolization. Gastric cardia cancer was treated through the left gastric artery and the left inferior phrenic artery or splenic artery. Cancers of the lesser and greater gastric curvature were treated either through the left and right gastric arteries or common hepatic artery, or through the gastroduodenal artery, right gastroomental artery or splenic artery. Gastric antrum cancers were perfused through the gastroduodenal artery or after middle segmental embolization of the right gastroomental artery. Results: One hundred and ninety-three cases that underwent interventional management were followed up. The CR + PR rate of gastric cardia cancer was 53.13%; gastric body cancer 44.44%; gastric antrum cancer 10%; recurrent cancer and remnant gastric cancer 0. There was no significant difference in outcome between gastric cardia cancer and gastric body cancer (P>0.05), but significant differences were shown both between gastric cardia cancer and gastric antrum cancer, and between gastric body cancer and gastric antrum cancer (P<0.05), with 1-year and 2-year survival rates of 81% and 56%, respectively. Conclusion: The interventional therapeutic effect in progressing stage gastric cancer differs with the site of the lesion in the gastric tissue. The curative effect for gastric cardia cancer and gastric body cancer is better than that for gastric antrum cancer, recurrent cancer and remnant gastric cancer. (authors)

  1. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  2. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.
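
    In code form the dating relation described above is a one-liner; the dose and dose-rate values below are hypothetical.

        # hypothetical values illustrating age = accumulated dose / annual dose rate
        equivalent_dose = 25.4          # Gy, dose accumulated since the dated event
        annual_dose_rate = 3.2e-3       # Gy per year, from the trace element analyses
        age_years = equivalent_dose / annual_dose_rate
        print(f"luminescence age ~ {age_years:,.0f} years")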

  3. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J R; Hutton, J T; Habermehl, M A [Adelaide Univ., SA (Australia); Van Moort, J [Tasmania Univ., Sandy Bay, TAS (Australia)

    1997-12-31

    In luminescence dating, an age is found by first measuring dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example of a site where radioactive disequilibrium is significant and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  4. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  5. Applications of Electromigration Techniques: Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques, such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE to the determination of high-molecular-weight compounds such as polyphenols (including flavonoids), pigments, vitamins and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also covered. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants including pesticides and antibiotics are discussed. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used for the control of technological processes in the food industry and for the identification of numerous compounds present in food. Owing to its numerous advantages, the CE technique is successfully used in routine food analysis.

  6. Critical analysis of procurement techniques in construction management sectors

    Science.gov (United States)

    Tiwari, Suman Tiwari Suresh; Chan, Shiau Wei; Faraz Mubarak, Muhammad

    2018-04-01

    Over the last three decades, numerous procurement techniques have been among the highlights of Construction Management (CM) for ventures, administration contracting, venture management, as well as design and construct. With the development and utilization of these techniques, various researchers have explored the criteria for their choice and their execution in terms of time, cost and quality. Nevertheless, there is a lack of work relating the procurement techniques to more advanced related issues such as supply chain, sustainability, innovation and technology development, lean construction, constructability, value management, Building Information Modelling (BIM) and e-procurement. Through papers chosen from reputable CM-related academic journals, the specified scope of these issues is methodically assessed with the objective of exploring the status and trends in procurement-related research. The results of this paper contribute theoretically as well as practically, helping researchers and industrialists to recognize and appreciate the development of procurement techniques.

  7. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.
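
    To make the statistical domain concrete, the sketch below computes three common time-domain variability measures on a synthetic R-R interval series; the chosen measures and signal are illustrative and do not reproduce the paper's full taxonomy.

        import numpy as np

        def time_domain_variability(x):
            # a few statistical-domain measures for a time series (e.g. R-R intervals)
            x = np.asarray(x, dtype=float)
            diffs = np.diff(x)
            return {"sd": x.std(ddof=1),                  # overall spread
                    "rmssd": np.sqrt(np.mean(diffs**2)),  # short-term variation
                    "cv": x.std(ddof=1) / x.mean()}       # scale-free spread

        rng = np.random.default_rng(2)
        rr = (800.0 + 50.0 * np.sin(np.linspace(0, 6 * np.pi, 300))
              + rng.normal(0.0, 20.0, 300))               # synthetic R-R series, ms
        print({k: round(v, 2) for k, v in time_domain_variability(rr).items()})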

  8. Review and classification of variability analysis techniques with clinical applications

    Science.gov (United States)

    2011-01-01

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis. PMID:21985357

  9. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  10. Using Machine Learning Techniques in the Analysis of Oceanographic Data

    Science.gov (United States)

    Falcinelli, K. E.; Abuomar, S.

    2017-12-01

    Acoustic Doppler Current Profilers (ADCPs) are oceanographic tools capable of collecting large amounts of current profile data. Using unsupervised machine learning techniques such as principal component analysis, fuzzy c-means clustering, and self-organizing maps, patterns and trends in an ADCP dataset are found. Cluster validity algorithms such as visual assessment of cluster tendency and clustering index are used to determine the optimal number of clusters in the ADCP dataset. These techniques prove to be useful in analysis of ADCP data and demonstrate potential for future use in other oceanographic applications.
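
    A minimal sketch of the unsupervised workflow described above, using principal component analysis followed by k-means on synthetic stand-in profiles (scikit-learn has no fuzzy c-means or self-organizing map, so k-means stands in for the clustering step).

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # synthetic stand-in for ADCP current profiles: one row per profile,
        # one column per depth bin
        rng = np.random.default_rng(3)
        profiles = np.vstack([rng.normal(0.3, 0.05, (100, 20)),
                              rng.normal(-0.2, 0.05, (120, 20))])

        z = StandardScaler().fit_transform(profiles)
        scores = PCA(n_components=3).fit_transform(z)        # dominant modes
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
        print(np.bincount(labels))                           # sizes of the two regimes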

  11. Progress on radiochemical analysis for nuclear waste management in decommissioning

    Energy Technology Data Exchange (ETDEWEB)

    Hou, X. (Technical Univ. of Denmark. Center for Nuclear Technologies (NuTech), Roskilde (Denmark))

    2012-01-15

    This report summarizes the progress in the development and improvement of radioanalytical methods for decommissioning and waste management completed in the NKS-B RadWaste 2011 project. Based on the overview of the analytical methods in Nordic laboratories and the requirements from the nuclear industry provided in the first phase of the RadWaste project (2010), some methods were improved and developed. A method for efficient separation of Nb from nuclear waste, especially metals, for measurement of long-lived 94Nb by gamma spectrometry was developed. Through systematic investigation of the behaviour of technetium during sample treatment and chromatographic separation, an effective method was developed for the determination of low-level 99Tc in waste samples. An AMS approach to measuring ultra-low-level 237Np, using 242Pu for normalization, was investigated; the preliminary results show the high potential of this method. Some progress on characterization of waste for the decommissioning of the Danish DR3 reactor is also presented. (Author)

  12. Progress on radiochemical analysis for nuclear waste management in decommissioning

    International Nuclear Information System (INIS)

    Hou, X.

    2012-01-01

    This report summarizes the progress in the development and improvement of radioanalytical methods for decommissioning and waste management completed in the NKS-B RadWaste 2011 project. Based on the overview of the analytical methods in Nordic laboratories and the requirements from the nuclear industry provided in the first phase of the RadWaste project (2010), some methods were improved and developed. A method for efficient separation of Nb from nuclear waste, especially metals, for measurement of long-lived 94Nb by gamma spectrometry was developed. Through systematic investigation of the behaviour of technetium during sample treatment and chromatographic separation, an effective method was developed for the determination of low-level 99Tc in waste samples. An AMS approach to measuring ultra-low-level 237Np, using 242Pu for normalization, was investigated; the preliminary results show the high potential of this method. Some progress on characterization of waste for the decommissioning of the Danish DR3 reactor is also presented. (Author)

  13. Supracapsular glued intraocular lens in progressive subluxated cataracts: Technique to retain an intact vitreous face.

    Science.gov (United States)

    Jacob, Soosan; Narasimhan, Smita; Agarwal, Amar; Mazzotta, Cosimo; Rechichi, Miguel; Agarwal, Athiya

    2017-03-01

    We describe a technique to prevent late intraocular lens (IOL) subluxation and dislocation that can be associated with progressive zonulopathy. Supracapsular glued IOL fixation is done to retain an intact anterior hyaloid face and avoid vitreous disturbance while providing stable long-term IOL fixation. Phacoemulsification is followed by glued IOL implantation above intact anterior and posterior capsules. Sclerotomies are created ab interno in a supracapsular plane under diametrically opposite lamellar scleral flaps without entering the vitreous cavity. Haptics are externalized in the supracapsular plane and tucked into intrascleral tunnels. Intraoperative or postoperative posterior capsulorhexis or capsulotomy and anterior capsule relaxing cuts can prevent capsule phimosis. Copyright © 2017 ASCRS and ESCRS. Published by Elsevier Inc. All rights reserved.

  14. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  15. Conference on Techniques of Nuclear and Conventional Analysis and Applications

    International Nuclear Information System (INIS)

    2012-01-01

    Full text: With their wide scope, particularly in the areas of environment, geology, mining, industry and life sciences, analysis techniques are of great importance in both fundamental and applied research. The Conference on Techniques for Nuclear and Conventional Analysis and Applications (TANCA) is registered in the national strategy of opening the University and national research centres to their local, national and international environments. This conference aims to: promote nuclear and conventional analytical techniques; contribute to the creation of synergy between the different players involved in these techniques, including universities, research organizations, regulatory authorities, economic operators, NGOs and others; inform and educate potential users about the performance of these techniques; strengthen exchanges and links between researchers, industry and policy makers; implement a programme of inter-laboratory comparison between Moroccan laboratories on the one hand and their foreign counterparts on the other; and contribute to the research training of doctoral students and postdoctoral scholars. Given the relevance and importance of issues related to the environment and its impact on cultural heritage, this fourth edition of TANCA is devoted to the application of conventional and nuclear analytical techniques to questions tied to the environment and its impact on cultural heritage.

  16. The application of value analysis techniques for complex problems

    International Nuclear Information System (INIS)

    Chiquelin, W.R.; Cossel, S.C.; De Jong, V.J.; Halverson, T.W.

    1986-01-01

    This paper discusses the application of the Value Analysis technique to the transuranic package transporter (TRUPACT). A team representing five different companies or organizations with diverse technical backgrounds was formed to analyze and recommend improvements. The results were a 38% systems-wide savings, if incorporated, and a shipping container which is volumetrically and payload efficient as well as user friendly. The Value Analysis technique is a proven tool widely used in many diverse areas both in the government and the private sector. Value Analysis uses functional diagramming of a piece of equipment or process to discretely identify every facet of the item being analyzed. A standard set of questions is then asked: What is it?, What does it do?, What does it cost?, What else will do the task?, and What would that cost? Using logic and a disciplined approach, the result of the Value Analysis performs the necessary functions at a high quality and the lowest overall cost

  17. A comparative analysis of soft computing techniques for gene prediction.

    Science.gov (United States)

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important and is currently the focus of many research efforts. Beside its scientific interest in the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally some limitations of the current research activities and future research directions are provided. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Hierarchical cluster analysis of progression patterns in open-angle glaucoma patients with medical treatment.

    Science.gov (United States)

    Bae, Hyoung Won; Rho, Seungsoo; Lee, Hye Sun; Lee, Naeun; Hong, Samin; Seong, Gong Je; Sung, Kyung Rim; Kim, Chan Yun

    2014-04-29

    To classify medically treated open-angle glaucoma (OAG) by the pattern of progression using hierarchical cluster analysis, and to determine OAG progression characteristics by comparing clusters. Ninety-five eyes of 95 OAG patients who received medical treatment, and who had undergone visual field (VF) testing at least once per year for 5 or more years. OAG was classified into subgroups using hierarchical cluster analysis based on the following five variables: baseline mean deviation (MD), baseline visual field index (VFI), MD slope, VFI slope, and Glaucoma Progression Analysis (GPA) printout. After that, other parameters were compared between clusters. Two clusters were made after a hierarchical cluster analysis. Cluster 1 showed -4.06 ± 2.43 dB baseline MD, 92.58% ± 6.27% baseline VFI, -0.28 ± 0.38 dB per year MD slope, -0.52% ± 0.81% per year VFI slope, and all "no progression" cases in GPA printout, whereas cluster 2 showed -8.68 ± 3.81 baseline MD, 77.54 ± 12.98 baseline VFI, -0.72 ± 0.55 MD slope, -2.22 ± 1.89 VFI slope, and seven "possible" and four "likely" progression cases in GPA printout. There were no significant differences in age, sex, mean IOP, central corneal thickness, and axial length between clusters. However, cluster 2 included more high-tension glaucoma patients and used a greater number of antiglaucoma eye drops significantly compared with cluster 1. Hierarchical cluster analysis of progression patterns divided OAG into slow and fast progression groups, evidenced by assessing the parameters of glaucomatous progression in VF testing. In the fast progression group, the prevalence of high-tension glaucoma was greater and the number of antiglaucoma medications administered was increased versus the slow progression group. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
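
    A sketch of the clustering step on synthetic data whose two groups loosely mimic the reported cluster means; Ward linkage and z-scoring of the five variables are assumptions, not necessarily the authors' exact settings.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage
        from scipy.stats import zscore

        # hypothetical patients x [baseline MD, baseline VFI, MD slope, VFI slope, GPA flag]
        rng = np.random.default_rng(4)
        slow = np.column_stack([rng.normal(-4.1, 2.4, 60), rng.normal(92.6, 6.3, 60),
                                rng.normal(-0.3, 0.4, 60), rng.normal(-0.5, 0.8, 60),
                                np.zeros(60)])
        fast = np.column_stack([rng.normal(-8.7, 3.8, 35), rng.normal(77.5, 13.0, 35),
                                rng.normal(-0.7, 0.6, 35), rng.normal(-2.2, 1.9, 35),
                                rng.integers(1, 3, 35)])
        X = zscore(np.vstack([slow, fast]), axis=0)     # put the variables on one scale

        Z = linkage(X, method="ward")                   # agglomerative hierarchy
        clusters = fcluster(Z, t=2, criterion="maxclust")
        print(np.bincount(clusters)[1:])                # sizes of the two clusters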

  19. Comparing dynamical systems concepts and techniques for biomechanical analysis

    OpenAIRE

    van Emmerik, Richard E.A.; Ducharme, Scott W.; Amado, Avelino C.; Hamill, Joseph

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new stat...

  20. Reliability Analysis Techniques for Communication Networks in Nuclear Power Plant

    International Nuclear Information System (INIS)

    Lim, T. J.; Jang, S. C.; Kang, H. G.; Kim, M. C.; Eom, H. S.; Lee, H. J.

    2006-09-01

    The objective of this project is to investigate and study existing reliability analysis techniques for communication networks in order to develop reliability analysis models for nuclear power plant safety-critical networks. It is necessary to make a comprehensive survey of current methodologies for communication network reliability. Major outputs of this study are design characteristics of safety-critical communication networks, efficient algorithms for quantifying reliability of communication networks, and preliminary models for assessing reliability of safety-critical communication networks

  1. Techniques for getting the most from an evaluation: Review of methods and results for attributing progress, non-energy benefits, net to gross, and cost-benefit

    International Nuclear Information System (INIS)

    Skumatz, Lisa A.

    2005-01-01

    As background for several evaluation and attribution projects, the authors conducted research on best practices in a few key areas of evaluation. We focused on techniques used in measuring market progress, enhanced techniques in attributing net energy impacts, and examining omitted program effects, particularly net non-energy benefits. The research involved a detailed literature review, interviews with program managers and evaluators across the US, and refinements of techniques used by the authors in conducting evaluation work. The object of the research was to uncover successful (and unsuccessful) approaches being used for key aspects of evaluation work. The research uncovered areas of tracking that are becoming more commonly used by agencies to assess progress in the market. In addition, detailed research by the authors on a number of impact and attribution evaluations have also led to recommendations on key practices that we believe comprise elements of best practices for assessments of attributable program effects. Specifically, we have identified a number of useful steps to improve the attribution of impacts to program interventions. Information on techniques for both attribution/causality work for a number of programs are presented - including market transformation programs that rely on marketing, advertising, training, and mid-stream incentives and work primarily with a network of participating mid-market actors. The project methods and results are presented and include: Theory-based evaluation, indicators, and hypothesis testing; Enhanced measurement of free riders, spillover, and other effects, and attribution of impacts using distribution and ranges of measure and intervention impacts, rather than less reliable point estimates; Attribution of program-induced non-energy benefits; Net to gross, benefit cost analysis, and incorporation of scenario/risk analysis of results; Comparison of net to gross results across program types to explore patterns and

  2. Techniques for getting the most from an evaluation: Review of methods and results for attributing progress, non-energy benefits, net to gross, and cost-benefit

    Energy Technology Data Exchange (ETDEWEB)

    Skumatz, Lisa A. [Skumatz Economic Research Associates, Inc., Superior, CO (United States)

    2005-07-01

    As background for several evaluation and attribution projects, the authors conducted research on best practices in a few key areas of evaluation. We focused on techniques used in measuring market progress, enhanced techniques in attributing net energy impacts, and examining omitted program effects, particularly net non-energy benefits. The research involved a detailed literature review, interviews with program managers and evaluators across the US, and refinements of techniques used by the authors in conducting evaluation work. The object of the research was to uncover successful (and unsuccessful) approaches being used for key aspects of evaluation work. The research uncovered areas of tracking that are becoming more commonly used by agencies to assess progress in the market. In addition, detailed research by the authors on a number of impact and attribution evaluations have also led to recommendations on key practices that we believe comprise elements of best practices for assessments of attributable program effects. Specifically, we have identified a number of useful steps to improve the attribution of impacts to program interventions. Information on techniques for both attribution/causality work for a number of programs are presented - including market transformation programs that rely on marketing, advertising, training, and mid-stream incentives and work primarily with a network of participating mid-market actors. The project methods and results are presented and include: Theory-based evaluation, indicators, and hypothesis testing; Enhanced measurement of free riders, spillover, and other effects, and attribution of impacts using distribution and ranges of measure and intervention impacts, rather than less reliable point estimates; Attribution of program-induced non-energy benefits; Net to gross, benefit cost analysis, and incorporation of scenario/risk analysis of results; Comparison of net to gross results across program types to explore patterns and

  3. Fukushima Daiichi Unit 1 Uncertainty Analysis-Exploration of Core Melt Progression Uncertain Parameters-Volume II.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brooks, Dusty Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    Sandia National Laboratories (SNL) has conducted an uncertainty analysis (UA) on the Fukushima Daiichi unit 1 (1F1) accident progression with the MELCOR code. Volume I of the 1F1 UA discusses the physical modeling details and time history results of the UA. Volume II of the 1F1 UA discusses the statistical viewpoint. The model used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). The goal of this work was to perform a focused evaluation of uncertainty in core damage progression behavior and its effect on key figures of merit (e.g., hydrogen production, fraction of intact fuel, vessel lower head failure) and in doing so assess the applicability of traditional sensitivity analysis techniques.
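
    A toy illustration of the statistical side of such an uncertainty analysis: uncertain inputs are sampled and propagated to a figure of merit, whose spread is then summarized. The surrogate model and input distributions below are placeholders, not MELCOR models or the study's parameters.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 300                                        # sampled realisations

        # placeholder uncertain inputs (not the study's actual parameters)
        oxidation_factor = rng.lognormal(mean=0.0, sigma=0.3, size=n)
        relocation_temp = rng.uniform(2100.0, 2500.0, size=n)        # K

        # toy surrogate for a figure of merit such as in-vessel hydrogen production (kg)
        hydrogen = 300.0 * oxidation_factor * (2500.0 / relocation_temp)

        p5, p50, p95 = np.percentile(hydrogen, [5, 50, 95])
        print(f"median {p50:.0f} kg, 5th to 95th percentile {p5:.0f}-{p95:.0f} kg")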

  4. Analytical techniques for wine analysis: An African perspective; a review

    International Nuclear Information System (INIS)

    Villiers, André de; Alberts, Phillipus; Tredoux, Andreas G.J.; Nieuwoudt, Hélène H.

    2012-01-01

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  5. Analytical techniques for wine analysis: An African perspective; a review

    Energy Technology Data Exchange (ETDEWEB)

    Villiers, Andre de, E-mail: ajdevill@sun.ac.za [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Alberts, Phillipus [Department of Chemistry and Polymer Science, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa); Tredoux, Andreas G.J.; Nieuwoudt, Helene H. [Institute for Wine Biotechnology, Department of Viticulture and Oenology, Stellenbosch University, Private Bag X1, Matieland 7602, Stellenbosch (South Africa)

    2012-06-12

    Highlights: ► Analytical techniques developed for grape and wine analysis in Africa are reviewed. ► The utility of infrared spectroscopic methods is demonstrated. ► An overview of separation of wine constituents by GC, HPLC, CE is presented. ► Novel LC and GC sample preparation methods for LC and GC are presented. ► Emerging methods for grape and wine analysis in Africa are discussed. - Abstract: Analytical chemistry is playing an ever-increasingly important role in the global wine industry. Chemical analysis of wine is essential in ensuring product safety and conformity to regulatory laws governing the international market, as well as understanding the fundamental aspects of grape and wine production to improve manufacturing processes. Within this field, advanced instrumental analysis methods have been exploited more extensively in recent years. Important advances in instrumental analytical techniques have also found application in the wine industry. This review aims to highlight the most important developments in the field of instrumental wine and grape analysis in the African context. The focus of this overview is specifically on the application of advanced instrumental techniques, including spectroscopic and chromatographic methods. Recent developments in wine and grape analysis and their application in the African context are highlighted, and future trends are discussed in terms of their potential contribution to the industry.

  6. Evolution of the sedimentation technique for particle size distribution analysis

    International Nuclear Information System (INIS)

    Maley, R.

    1998-01-01

    After an introduction on the significance of particle size measurements, sedimentation methods are described, with emphasis on the evolution of the gravitational approach. The gravitational technique based on mass determination by X-ray absorption allows fast analysis by automation and easy data handling, in addition to providing the accuracy required by quality control and research applications.
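
    The gravitational approach rests on Stokes' law, which converts a measured settling velocity into an equivalent spherical diameter. The sketch below uses hypothetical values for a quartz particle settling in water.

        import math

        def stokes_diameter(settling_velocity, rho_particle, rho_fluid,
                            viscosity, g=9.81):
            # equivalent spherical diameter (m) from Stokes' law, assuming laminar settling
            return math.sqrt(18.0 * viscosity * settling_velocity
                             / ((rho_particle - rho_fluid) * g))

        # hypothetical quartz particle settling in water at about 20 degrees C
        d = stokes_diameter(settling_velocity=2.0e-4,    # m/s, measured
                            rho_particle=2650.0, rho_fluid=998.0, viscosity=1.0e-3)
        print(f"Stokes diameter ~ {d * 1e6:.1f} um")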

  7. Comparative Analysis of Some Techniques in the Biological ...

    African Journals Online (AJOL)

    The experiments involved the simulation of conditions of a major spill by pouring crude oil on the cells from perforated cans and the in-situ bioremediation of the polluted soils using the techniques that consisted in the manipulation of different variables within the soil environment. The analysis of soil characteristics after a ...

  8. Tailored Cloze: Improved with Classical Item Analysis Techniques.

    Science.gov (United States)

    Brown, James Dean

    1988-01-01

    The reliability and validity of a cloze procedure used as an English-as-a-second-language (ESL) test in China were improved by applying traditional item analysis and selection techniques. The 'best' test items were chosen on the basis of item facility and discrimination indices, and were administered as a 'tailored cloze.' 29 references listed.
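
    A sketch of the classical item analysis used above: item facility is the proportion of correct answers, and one common discrimination index contrasts high- and low-scoring groups. The response matrix and cut-off values are made up for illustration.

        import numpy as np

        def item_stats(responses):
            # responses[i, j] = 1 if examinee i answered item j correctly.
            # Facility = proportion correct; discrimination = facility difference
            # between the top and bottom thirds ranked by total score.
            responses = np.asarray(responses, dtype=float)
            order = np.argsort(responses.sum(axis=1))
            n = len(order) // 3
            low, high = responses[order[:n]], responses[order[-n:]]
            return responses.mean(axis=0), high.mean(axis=0) - low.mean(axis=0)

        rng = np.random.default_rng(5)
        data = (rng.random((90, 30)) < rng.uniform(0.3, 0.9, 30)).astype(int)
        facility, discrimination = item_stats(data)
        keep = (facility > 0.3) & (facility < 0.7) & (discrimination > 0.2)
        print(int(keep.sum()), "items retained for the tailored cloze")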

  9. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  10. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that the analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing an automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding using fiber-optic capture of light and a low-cost CCD-based spectrometer show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
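    A common way to turn line intensities into an electronic (excitation) temperature is a Boltzmann plot. The sketch below is a generic illustration of that step only, assuming optically thin lines of a single species in local thermodynamic equilibrium; the wavelengths, gA values and level energies are placeholders, and the paper's LPO sub-pixel peak location and spline fitting are not reproduced.

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def boltzmann_temperature(intensity, wavelength_nm, g_upper, a_ul, e_upper_ev):
    """Boltzmann-plot estimate of the excitation temperature from relative
    intensities of emission lines of a single species:
        ln(I * lambda / (g * A)) = -E_upper / (k_B * T) + const
    A least-squares fit against E_upper gives the slope -1/(k_B * T).
    """
    y = np.log(np.asarray(intensity) * np.asarray(wavelength_nm)
               / (np.asarray(g_upper) * np.asarray(a_ul)))
    slope, _ = np.polyfit(np.asarray(e_upper_ev), y, 1)
    return -1.0 / (K_B_EV * slope)

# Hypothetical line set (wavelengths, gA values and level energies are placeholders,
# not real spectroscopic constants).
wavelength = np.array([690.0, 705.0, 720.0, 735.0])      # nm
g_upper    = np.array([3, 5, 3, 5])
a_ul       = np.array([6.4e6, 4.8e6, 3.6e6, 2.5e6])      # s^-1
e_upper    = np.array([13.0, 13.5, 14.2, 15.1])          # eV

# Synthesize intensities from an assumed temperature, then recover it.
t_true = 9000.0  # K
intensity = g_upper * a_ul / wavelength * np.exp(-e_upper / (K_B_EV * t_true))

t_est = boltzmann_temperature(intensity, wavelength, g_upper, a_ul, e_upper)
print(f"recovered excitation temperature: {t_est:.0f} K")
```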

  11. Study of analysis techniques of thermoluminescent dosimeters response

    International Nuclear Information System (INIS)

    Castro, Walber Amorim

    2002-01-01

    The Personal Monitoring Service of the Centro Regional de Ciencias Nucleares uses TLD-700 material in its dosemeters. The TLD analysis is carried out using a Harshaw-Bicron model 6600 automatic reading system. This system uses dry air instead of the traditional gaseous nitrogen. This innovation brought advantages to the service but introduced uncertainties in the reference of the detectors; one of these was observed for doses below 0.5 mSv. In this work, different techniques for analysing the TLD response were investigated and compared for dose values in this interval. These techniques included thermal pre-treatment, and different glow curve analysis methods were also investigated. The results showed the need to develop specific software that performs automatic background subtraction of the glow curve for each dosemeter. This software was developed and has been tested. Preliminary results show that the software increases the response reproducibility. (author)
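    As a rough illustration of per-dosemeter background subtraction of a glow curve, here is a minimal sketch; the linear-baseline model estimated from the first and last channels and the synthetic curve are assumptions for illustration, not the software described in the work.

```python
import numpy as np

def net_glow_signal(glow_curve, baseline_channels=10):
    """Subtract a per-dosemeter background from a TLD glow curve and integrate.

    A straight-line background is estimated from the first and last
    `baseline_channels` readings (where no dosimetric peak is expected)
    and removed before summing the curve.
    """
    curve = np.asarray(glow_curve, dtype=float)
    channels = np.arange(curve.size)
    idx = np.r_[channels[:baseline_channels], channels[-baseline_channels:]]
    slope, intercept = np.polyfit(idx, curve[idx], 1)
    background = slope * channels + intercept
    net = curve - background
    return net, net.sum()

# Synthetic glow curve: Gaussian dosimetric peak on a linear background
x = np.arange(200)
curve = 5.0 + 0.02 * x + 120.0 * np.exp(-0.5 * ((x - 110) / 12.0) ** 2)
net, integral = net_glow_signal(curve)
print(f"net integrated signal: {integral:.1f}")
```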

  12. Sensitivity analysis technique for application to deterministic models

    International Nuclear Information System (INIS)

    Ishigami, T.; Cazzoli, E.; Khatib-Rahbar, M.; Unwin, S.D.

    1987-01-01

    The characterization of severe accident source terms for light water reactors should include consideration of uncertainties. An important element of any uncertainty analysis is an evaluation of the sensitivity of the output probability distributions reflecting source term uncertainties to assumptions regarding the input probability distributions. Historically, response surface methods (RSMs) were developed to replace physical models with simplified models, derived using, for example, regression techniques, for extensive calculations. The purpose of this paper is to present a new method for sensitivity analysis that does not utilize RSM, but instead relies directly on the results obtained from the original computer code calculations. The merits of this approach are demonstrated by application of the proposed method to the suppression pool aerosol removal code (SPARC), and the results are compared with those obtained by sensitivity analysis with (a) the code itself, (b) a regression model, and (c) Iman's method
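    The specific procedure of the paper is not reproduced here, but the idea of computing sensitivity indicators directly from (input sample, code output) pairs, without fitting a response surface, can be sketched with a simple rank-correlation measure; the toy "code output" function below is an assumption for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

def rank_correlation_sensitivity(inputs, output):
    """Sampling-based sensitivity indicator that works directly on the
    (input sample, code output) pairs, with no surrogate response surface:
    the Spearman rank correlation of each input with the output.
    """
    return np.array([spearmanr(inputs[:, j], output).correlation
                     for j in range(inputs.shape[1])])

rng = np.random.default_rng(0)
x = rng.uniform(size=(500, 3))                              # three uncertain inputs
y = 4.0 * x[:, 0] ** 2 + 0.5 * x[:, 1] + 0.01 * x[:, 2]     # stand-in for a code run
print(rank_correlation_sensitivity(x, y).round(2))
```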

  13. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature is described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, dairy, fruits...

  14. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
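    A minimal sketch of one ingredient of such an approach, a one-predictor-at-a-time LOESS smooth used to gauge how much output variance each input explains, is shown below; it is not the stepwise procedure of the paper, and the test function and smoothing fraction are assumptions for illustration.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def loess_r2(x, y, frac=0.4):
    """Variance of y explained by a LOESS smooth on a single predictor."""
    fitted = lowess(y, x, frac=frac, return_sorted=False)
    return 1.0 - np.var(y - fitted) / np.var(y)

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(-1.0, 1.0, size=(n, 3))
# Nonlinear, non-monotonic dependence on x0 that linear or rank regression would miss
y = np.sin(3.0 * x[:, 0]) ** 2 + 0.3 * x[:, 1] + 0.05 * rng.standard_normal(n)

for j in range(x.shape[1]):
    print(f"input {j}: LOESS R^2 = {loess_r2(x[:, j], y):.2f}")
```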

  15. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as to the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  16. Development of neural network techniques for the analysis of JET ECE data

    International Nuclear Information System (INIS)

    Bartlett, D.V.; Bishop, C.M.

    1993-01-01

    This paper reports on a project currently in progress to develop neural network techniques for the conversion of JET ECE spectra to electron temperature profiles. The aim is to obtain profiles with reduced measurement uncertainties by incorporating data from the LIDAR Thomson scattering diagnostic in the analysis, while retaining the faster time resolution of the ECE measurements. The properties of neural networks are briefly reviewed, and the reasons for using them in this application are explained. Some preliminary results are presented and the direction of future work is outlined. (orig.)
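    A minimal sketch of the kind of regression mapping involved (spectrum in, temperature profile out) is given below using a small multilayer perceptron; the synthetic forward model, channel counts and network size are assumptions for illustration and are unrelated to the actual JET ECE or LIDAR data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the problem: map an ECE "spectrum" (32 channels) to a
# temperature "profile" (8 radial points). Both are generated from a toy forward
# model purely for illustration.
rng = np.random.default_rng(42)
n_samples, n_channels, n_profile = 2000, 32, 8

profiles = rng.uniform(1.0, 5.0, size=(n_samples, n_profile))            # keV
forward = rng.normal(size=(n_profile, n_channels)) / np.sqrt(n_profile)  # toy response matrix
spectra = profiles @ forward + 0.05 * rng.normal(size=(n_samples, n_channels))

x_train, x_test, y_train, y_test = train_test_split(spectra, profiles, random_state=0)
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(x_train, y_train)
print(f"test R^2 of profile reconstruction: {net.score(x_test, y_test):.3f}")
```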

  17. Practical applications of activation analysis and other nuclear techniques

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity is measured. This measurement of γ rays emitted from specific radionuclides makes possible the quantitative determination of the elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced x-ray emission and synchrotron-produced x-ray fluorescence, are also briefly discussed

  18. Maximum entropy technique in the doublet structure analysis

    International Nuclear Information System (INIS)

    Belashev, B.Z.; Panebrattsev, Yu.A.; Shakhaliev, Eh.I.; Soroko, L.M.

    1998-01-01

    The Maximum Entropy Technique (MENT) for the solution of inverse problems is explained. An effective computer program for solving the system of nonlinear equations encountered in the MENT has been developed and tested. The capabilities of the MENT are demonstrated on the example of doublet structure analysis of noisy experimental data. A comparison of the MENT results with the results of the Fourier algorithm technique without regularization is presented. The tolerable noise level is 30% for the MENT and only 0.1% for the Fourier algorithm

  19. Progress in structural analysis of glycosaminoglycans and their ...

    African Journals Online (AJOL)

    Yomi

    2012-03-06

    Mar 6, 2012 ... The rising interest in the application of glycosaminoglycans (GAGs) is ... in functional food, clinical medicine, cosmetics and biomaterial. .... labor leads to incomplete hydrolysis and lower results. In ... GAG are spectrum analysis combined with enzymolysis ... spectrometry (ESI-FTMS), and nuclear magnetic.

  20. Progress of the DUPIC fuel compatibility analysis (I) - reactor physics

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hang Bok; Jeong, Chang Joon; Roh, Gyu Hong; Rhee, Bo Wook; Park, Jee Won

    2003-12-01

    Since 1992, the direct use of spent pressurized water reactor fuel in CANada Deuterium Uranium (CANDU) reactors (DUPIC) has been studied as an alternative to the once-through fuel cycle. The DUPIC fuel cycle study is focused on the technical feasibility analysis, the fabrication of DUPIC fuels for irradiation tests and the demonstration of the DUPIC fuel performance. The feasibility analysis was conducted for the compatibility of the DUPIC fuel with existing CANDU-6 reactors from the viewpoints of reactor physics, reactor safety, fuel cycle economics, etc. This study has summarized the intermediate results of the DUPIC fuel compatibility analysis, which includes the CANDU reactor physics design requirements, DUPIC fuel core physics design method, performance of the DUPIC fuel core, regional overpower trip setpoint, and the CANDU primary shielding. The physics analysis showed that the CANDU-6 reactor can accommodate the DUPIC fuel without deteriorating the physics design requirements by adjusting the fuel management scheme if the fissile content of the DUPIC fuel is tightly controlled.

  1. Progressive-Ratio Schedules and Applied Behavior Analysis

    Science.gov (United States)

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  2. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  3. Reliability analysis of large scaled structures by optimization technique

    International Nuclear Information System (INIS)

    Ishikawa, N.; Mihara, T.; Iizuka, M.

    1987-01-01

    This paper presents a reliability analysis based on the optimization technique using PNET (Probabilistic Network Evaluation Technique) method for the highly redundant structures having a large number of collapse modes. This approach makes the best use of the merit of the optimization technique in which the idea of PNET method is used. The analytical process involves the minimization of safety index of the representative mode, subjected to satisfaction of the mechanism condition and of the positive external work. The procedure entails the sequential performance of a series of the NLP (Nonlinear Programming) problems, where the correlation condition as the idea of PNET method pertaining to the representative mode is taken as an additional constraint to the next analysis. Upon succeeding iterations, the final analysis is achieved when a collapse probability at the subsequent mode is extremely less than the value at the 1st mode. The approximate collapse probability of the structure is defined as the sum of the collapse probabilities of the representative modes classified by the extent of correlation. Then, in order to confirm the validity of the proposed method, the conventional Monte Carlo simulation is also revised by using the collapse load analysis. Finally, two fairly large structures were analyzed to illustrate the scope and application of the approach. (orig./HP)

  4. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D N; Prawer, S; Gonon, P; Walker, R; Dooley, S; Bettiol, A; Pearce, J [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1997-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  5. Development of fault diagnostic technique using reactor noise analysis

    International Nuclear Information System (INIS)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B.

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and the noise databases corresponding to each plant (both Korean and foreign) were constructed and compared. Also, the changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system to diagnose abnormal conditions in an NPP. The portable reactor noise analysis system may also be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  6. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    International Nuclear Information System (INIS)

    William S. Charlton

    1999-01-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels
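    The Bayesian comparison of a measured isotopic ratio against a library of reactor-physics predictions can be sketched with a simple grid posterior; the Gaussian measurement model, the made-up ratio-versus-burnup library and the uniform prior below are assumptions for illustration, not the NOVA code or its Monteburns database.

```python
import numpy as np

def burnup_posterior(measured_ratio, sigma, predicted_ratio):
    """Grid-based Bayesian estimate of burnup from one measured isotopic ratio.

    A Gaussian measurement model is assumed; `predicted_ratio` plays the role
    of a reactor-physics library giving the expected ratio at each candidate
    burnup. The prior over the grid is uniform.
    """
    log_like = -0.5 * ((measured_ratio - predicted_ratio) / sigma) ** 2
    posterior = np.exp(log_like - log_like.max())
    return posterior / posterior.sum()

# Hypothetical library: ratio grows roughly linearly with burnup (illustrative only)
burnup = np.linspace(1.0, 45.0, 441)            # GWd/tU
library = 0.02 * burnup + 0.1

post = burnup_posterior(measured_ratio=0.62, sigma=0.03, predicted_ratio=library)
mean = np.sum(burnup * post)
std = np.sqrt(np.sum((burnup - mean) ** 2 * post))
print(f"inferred burnup: {mean:.1f} +/- {std:.1f} GWd/tU")
```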

  7. Nuclear techniques of analysis in diamond synthesis and annealing

    International Nuclear Information System (INIS)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J.

    1996-01-01

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs

  8. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish an analysis technique to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU NPPs (Nuclear Power Plants) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and the noise databases corresponding to each plant (both Korean and foreign) were constructed and compared. Also, the changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis can be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulation can be used to establish a knowledge-based expert system to diagnose abnormal conditions in an NPP. The portable reactor noise analysis system may also be utilized as a substitute for a plant IVMS (Internal Vibration Monitoring System). (author)

  9. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires. Such failures generally do not represent a great risk to personnel. Repairs needed to maintain the reliability of these vessels might require extensive interruption to operation, which in turn considerably impacts the profitability of the unit. Therefore the condition, progression and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D Laser Scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspecting the equipment to generate maintenance or inspection recommendations, and comparison with previous results and baseline data. Until recently, coke drum structural analysis has traditionally been performed by analyzing Stress Concentration Factors (SCF) through Finite Element Analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, the new strain analysis technique PSI (Plastic Strain Index) was developed. This method, which is based on the API 579/ASME FFS standard failure limit, represents the state of the art in coke drum bulging severity assessment and has an excellent correlation with failure history. (author)

  10. New trends in sample preparation techniques for environmental analysis.

    Science.gov (United States)

    Ribeiro, Cláudia; Ribeiro, Ana Rita; Maia, Alexandra S; Gonçalves, Virgínia M F; Tiritan, Maria Elizabeth

    2014-01-01

    Environmental samples include a wide variety of complex matrices, with low concentrations of analytes and presence of several interferences. Sample preparation is a critical step and the main source of uncertainties in the analysis of environmental samples, and it is usually laborious, high cost, time consuming, and polluting. In this context, there is increasing interest in developing faster, cost-effective, and environmentally friendly sample preparation techniques. Recently, new methods have been developed and optimized in order to miniaturize extraction steps, to reduce solvent consumption or become solventless, and to automate systems. This review attempts to present an overview of the fundamentals, procedure, and application of the most recently developed sample preparation techniques for the extraction, cleanup, and concentration of organic pollutants from environmental samples. These techniques include: solid phase microextraction, on-line solid phase extraction, microextraction by packed sorbent, dispersive liquid-liquid microextraction, and QuEChERS (Quick, Easy, Cheap, Effective, Rugged and Safe).

  11. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...

  12. Novel technique for coal pyrolysis and hydrogenation production analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.

    1990-01-01

    The overall objective of this study is to establish vacuum ultraviolet photoionization-MS and VUV pulsed EI-MS as useful tools for a simpler and more accurate direct mass spectrometric measurement of a broad range of hydrocarbon compounds in complex mixtures for ultimate application to the study of the kinetics of coal hydrogenation and pyrolysis processes. The VUV-MS technique allows ionization of a broad range of species with minimal fragmentation. Many compounds of interest can be detected with the 118 nm wavelength, but additional compound selectivity is achievable by tuning the wavelength of the photo-ionization source in the VUV. Resonant four wave mixing techniques in Hg vapor will allow near continuous tuning from about 126 to 106 nm. This technique would facilitate the scientific investigation of coal upgrading processes such as pyrolysis and hydrogenation by allowing accurate direct analysis of both stable and intermediate reaction products.

  13. [Progress on Determination and Analysis of Zopiclone in Biological Samples].

    Science.gov (United States)

    Shu, C X; Gong, D; Zhang, L P; Zhao, J X

    2017-12-01

    As a new hypnotic, zopiclone is widely used in clinical treatment. There are many methods for the determination of zopiclone, including spectrophotometry, chromatography and chromatography-mass spectrometry. This paper reviews the different kinds of biological samples associated with zopiclone, extraction and purification methods, and determination and analysis methods, with the aim of providing references for related research and practice. Copyright© by the Editorial Department of Journal of Forensic Medicine.

  14. PROGRESS IN SIFT-MS: BREATH ANALYSIS AND OTHER APPLICATIONS

    Czech Academy of Sciences Publication Activity Database

    Španěl, Patrik; Smith, D.

    2011-01-01

    Vol. 30, No. 2 (2011), pp. 236-267. ISSN 0277-7037. R&D Projects: GA MPO FT-TA4/124; GA ČR GA202/09/0800; GA ČR GA203/09/0256. Institutional research plan: CEZ:AV0Z40400503. Keywords: SIFT-MS * breath analysis * ion flow tube. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 10.461, year: 2011

  15. Small area analysis using micro-diffraction techniques

    International Nuclear Information System (INIS)

    Goehner, Raymond P.; Tissot, Ralph G. Jr.; Michael, Joseph R.

    2000-01-01

    An overall trend toward smaller electronic packages and devices makes it increasingly important and difficult to obtain meaningful diffraction information from small areas. X-ray micro-diffraction, electron back-scattered diffraction (EBSD) and Kossel are micro-diffraction techniques used for crystallographic analysis, including texture, phase identification and strain measurements. X-ray micro-diffraction is primarily used for phase analysis and residual strain measurements of areas between 10 μm and 100 μm. For areas this small, glass capillary optics are used to produce a usable collimated x-ray beam. These optics are designed to reflect x-rays below the critical angle, therefore allowing a larger solid acceptance angle at the x-ray source and resulting in brighter, smaller x-ray beams. The determination of residual strain using micro-diffraction techniques is very important to the semiconductor industry. Residual stresses have caused voiding of the interconnect metal which then destroys electrical continuity. Being able to determine the residual stress helps industry to predict failures from the aging effects of interconnects due to this stress voiding. Stress measurements would be impossible using a conventional x-ray diffractometer; however, utilizing a 30 μm glass capillary, these small areas are readily accessible for analysis. Kossel produces a wide angle diffraction pattern from fluorescent x-rays generated in the sample by an e-beam in a SEM. This technique can yield very precise lattice parameters for determining strain. Fig. 2 shows a Kossel pattern from a Ni specimen. Phase analysis on small areas is also possible using electron back-scattered diffraction (EBSD) and x-ray micro-diffraction techniques. EBSD has the advantage of allowing the user to observe the area of interest using the excellent imaging capabilities of the SEM. An EDS detector has been

  16. ENEA initiatives in Southern Italy: Progress report, analysis, prospects

    International Nuclear Information System (INIS)

    Santandrea, E.

    1991-01-01

    In the past, technological development in Italy was concentrated in the country's heavily industrialized northern regions. The motive for this choice was the conception that, to be successful in a highly competitive market, research investment had necessarily to favour those developed areas with an already proven capacity for guaranteed fast and high returns. Unfortunately this policy has created a technologically and economically depressed area, known as the Mezzogiorno, in southern Italy. Within the framework of new national energy and economic policies calling for balanced economic and technological development, ENEA (Italian Commission for New Technologies, Energy and the Environment) has been entrusted with the planning and managing of research, commercialization and technology transfer programs designed to stimulate high-technology industrial activity in Italy's southern regions so as to allow them to become more competitive in the upcoming European free trade market. Small business concerns shall be favoured in this new development scheme, which shall respect the existing local socio-economic framework. Emphasis shall be placed on privileging such elements as quality, flexibility and versatility, as opposed to low cost mass production. Priority is to be given to the development of renewable energy sources, energy conservation techniques and environmentally compatible technologies

  17. Modular techniques for dynamic fault-tree analysis

    Science.gov (United States)

    Patterson-Hine, F. A.; Dugan, Joanne B.

    1992-01-01

    It is noted that current approaches used to assess the dependability of complex systems such as Space Station Freedom and the Air Traffic Control System are incapable of handling the size and complexity of these highly integrated designs. A novel technique for modeling such systems which is built upon current techniques in Markov theory and combinatorial analysis is described. It enables the development of a hierarchical representation of system behavior which is more flexible than either technique alone. A solution strategy which is based on an object-oriented approach to model representation and evaluation is discussed. The technique is virtually transparent to the user since the fault tree models can be built graphically and the objects defined automatically. The tree modularization procedure allows the two model types, Markov and combinatoric, to coexist and does not require that the entire fault tree be translated to a Markov chain for evaluation. This effectively reduces the size of the Markov chain required and enables solutions with less truncation, making analysis of longer mission times possible. Using the fault-tolerant parallel processor as an example, a model is built and solved for a specific mission scenario and the solution approach is illustrated in detail.

  18. A review of residual stress analysis using thermoelastic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S [University of Southampton, School of Engineering Sciences, Highfield, Southampton, SO17 1BJ (United Kingdom); Burguete, R L [Airbus UK Ltd., New Filton House, Filton, Bristol, BS99 7AR (United Kingdom)

    2009-08-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  19. A review of residual stress analysis using thermoelastic techniques

    International Nuclear Information System (INIS)

    Robinson, A F; Dulieu-Barton, J M; Quinn, S; Burguete, R L

    2009-01-01

    Thermoelastic Stress Analysis (TSA) is a full-field technique for experimental stress analysis that is based on infra-red thermography. The technique has proved to be extremely effective for studying elastic stress fields and is now well established. It is based on the measurement of the temperature change that occurs as a result of a stress change. As residual stress is essentially a mean stress it is accepted that the linear form of the TSA relationship cannot be used to evaluate residual stresses. However, there are situations where this linear relationship is not valid or departures in material properties due to manufacturing procedures have enabled evaluations of residual stresses. The purpose of this paper is to review the current status of using a TSA based approach for the evaluation of residual stresses and to provide some examples of where promising results have been obtained.

  20. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  1. A BWR 24-month cycle analysis using multicycle techniques

    International Nuclear Information System (INIS)

    Hartley, K.D.

    1993-01-01

    Boiling water reactor (BWR) fuel cycle design analyses have become increasingly challenging in the past several years. As utilities continue to seek improved capacity factors, reduced power generation costs, and reduced outage costs, longer cycle lengths and fuel design optimization become important considerations. Accurate multicycle analysis techniques are necessary to determine the viability of fuel designs and cycle operating strategies to meet reactor operating requirements, e.g., meeting thermal and reactivity margin constraints, while minimizing overall fuel cycle costs. Siemens Power Corporation (SPC), Nuclear Division, has successfully employed multicycle analysis techniques with realistic rodded cycle depletions to demonstrate equilibrium fuel cycle performance in 24-month cycles. Analyses have been performed for a BWR/5 reactor, at both rated and uprated power conditions

  2. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D D; Bailey, G; Martin, J; Garton, D; Noorman, H; Stelcer, E; Johnson, P [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1994-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  3. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analyse the thousands of filter papers a year that may originate from a large scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  4. Analysis of Cell Phone Usage Using Correlation Techniques

    OpenAIRE

    T S R MURTHY; D. SIVA RAMA KRISHNA

    2011-01-01

    The present paper is a sample survey analysis based on correlation techniques. The usage of mobile phones is clearly almost unavoidable these days, and as such the authors have made a systematic survey, through a well prepared questionnaire, on making use of mobile phones to the maximum extent. The samples cover various economic groups across a population of over one lakh people. The results are scientifically categorized and interpreted to match the ground reality.

  5. Progress of the DUPIC Fuel Compatibility Analysis (IV) - Fuel Performance

    International Nuclear Information System (INIS)

    Choi, Hang Bok; Ryu, Ho Jin; Roh, Gyu Hong; Jeong, Chang Joon; Park, Chang Je; Song, Kee Chan; Lee, Jung Won

    2005-10-01

    This study describes the mechanical compatibility of the direct use of spent pressurized water reactor (PWR) fuel in Canada deuterium uranium (CANDU) reactors (DUPIC) fuel, when it is loaded into a CANDU reactor. The mechanical compatibility can be assessed for the fuel management, primary heat transport system, fuel channel, and fuel handling system in the reactor core by both experimental and analytic methods. Because the physical dimensions of the DUPIC fuel bundle adopt the CANDU flexible (CANFLEX) fuel bundle design, which has already been demonstrated for commercial use in CANDU reactors, the experimental compatibility analyses focused on the generation of material property data and the irradiation tests of the DUPIC fuel, which are used for the computational analysis. The intermediate results of the mechanical compatibility analysis have shown that the integrity of the DUPIC fuel is mostly maintained under high power and high burnup conditions, even though some material properties, such as the thermal conductivity, are somewhat lower than those of uranium fuel. However, it is required to slightly change the current DUPIC fuel design to accommodate the high internal pressure of the fuel element. It is also strongly recommended to perform more irradiation tests of the DUPIC fuel to accumulate a database for the demonstration of the DUPIC fuel performance in the CANDU reactor.

  6. Progress of the DUPIC Fuel Compatibility Analysis (IV) - Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Hang Bok; Ryu, Ho Jin; Roh, Gyu Hong; Jeong, Chang Joon; Park, Chang Je; Song, Kee Chan; Lee, Jung Won

    2005-10-15

    This study describes the mechanical compatibility of the direct use of spent pressurized water reactor (PWR) fuel in Canada deuterium uranium (CANDU) reactors (DUPIC) fuel, when it is loaded into a CANDU reactor. The mechanical compatibility can be assessed for the fuel management, primary heat transport system, fuel channel, and fuel handling system in the reactor core by both experimental and analytic methods. Because the physical dimensions of the DUPIC fuel bundle adopt the CANDU flexible (CANFLEX) fuel bundle design, which has already been demonstrated for commercial use in CANDU reactors, the experimental compatibility analyses focused on the generation of material property data and the irradiation tests of the DUPIC fuel, which are used for the computational analysis. The intermediate results of the mechanical compatibility analysis have shown that the integrity of the DUPIC fuel is mostly maintained under high power and high burnup conditions, even though some material properties, such as the thermal conductivity, are somewhat lower than those of uranium fuel. However, it is required to slightly change the current DUPIC fuel design to accommodate the high internal pressure of the fuel element. It is also strongly recommended to perform more irradiation tests of the DUPIC fuel to accumulate a database for the demonstration of the DUPIC fuel performance in the CANDU reactor.

  7. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
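    A minimal sketch of the frequency-domain step described above, recovering the excitation from the rear-side signal by dividing by a transfer function via the FFT, is shown below; the first-order low-pass transfer function, the regularization term and the square test pulse are assumptions for illustration, not the calorimeter's actual response.

```python
import numpy as np

def reconstruct_flux(rear_signal, transfer_fn, eps=1e-3):
    """Recover the front-side excitation from a rear-side signal by dividing
    in the frequency domain by the system transfer function (with a small
    regularization term to avoid noise blow-up where |H| is small)."""
    spec = np.fft.rfft(rear_signal)
    h = transfer_fn
    estimate = spec * np.conj(h) / (np.abs(h) ** 2 + eps)
    return np.fft.irfft(estimate, n=rear_signal.size)

# Toy demonstration: a first-order low-pass stands in for the calorimeter.
n, dt, tau = 4096, 1e-3, 0.05
freq = np.fft.rfftfreq(n, dt)
h = 1.0 / (1.0 + 2j * np.pi * freq * tau)

t = np.arange(n) * dt
flux = ((t > 0.5) & (t < 0.7)).astype(float)   # square "beam pulse" on the front side
rear = np.fft.irfft(np.fft.rfft(flux) * h, n=n) \
       + 0.01 * np.random.default_rng(3).normal(size=n)

flux_hat = reconstruct_flux(rear, h)
print(f"reconstruction RMS error: {np.sqrt(np.mean((flux_hat - flux) ** 2)):.3f}")
```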

  8. FDTD technique based crosstalk analysis of bundled SWCNT interconnects

    International Nuclear Information System (INIS)

    Duksh, Yograj Singh; Kaushik, Brajesh Kumar; Agarwal, Rajendra P.

    2015-01-01

    The equivalent electrical circuit model of bundled single-walled carbon nanotube based distributed RLC interconnects is employed for the crosstalk analysis. Accurate time-domain analysis of crosstalk effects in VLSI interconnects has emerged as an essential design criterion. This paper presents a brief description of the numerical finite-difference time-domain (FDTD) technique, which is intended for the estimation of voltages and currents on coupled transmission lines. For the FDTD implementation, the stability of the proposed model is strictly restricted by the Courant condition. This method is used for the estimation of crosstalk-induced propagation delay and peak voltage in lossy RLC interconnects. Both functional and dynamic crosstalk effects are analyzed in the coupled transmission line. The effect of line resistance on crosstalk-induced delay and peak voltage under dynamic and functional crosstalk is also evaluated. The FDTD analysis and the SPICE simulations are carried out at the 32 nm technology node for global interconnects. It is observed that the analytical results obtained using the FDTD technique are in good agreement with the SPICE simulation results. The crosstalk-induced delay, propagation delay, and peak voltage obtained using the FDTD technique show average errors of 4.9%, 3.4% and 0.46%, respectively, in comparison to SPICE. (paper)
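    To illustrate the flavour of such an FDTD scheme, here is a minimal leapfrog update for a single lossy transmission line (telegrapher's equations) with the time step chosen from the Courant condition; it is a heavily simplified sketch with placeholder line parameters, not the coupled SWCNT bundle model of the paper.

```python
import numpy as np

# Minimal 1-D FDTD for a single lossy transmission line (telegrapher's equations),
# illustrating the staggered leapfrog update and the Courant limit.
R, L, G, C = 1.0e3, 1.0e-6, 0.0, 1.0e-10      # per-unit-length values (placeholders)
length, nz = 1.0e-3, 200                      # 1 mm line, 200 cells
dz = length / nz
dt = 0.9 * dz * np.sqrt(L * C)                # Courant condition: dt <= dz * sqrt(L*C)

v = np.zeros(nz + 1)                          # node voltages
i = np.zeros(nz)                              # branch currents (staggered grid)

n_steps = 2000
for _ in range(n_steps):
    # current update (semi-implicit loss term): L di/dt + R i = -dv/dz
    i = (i * (L / dt - R / 2) - (v[1:] - v[:-1]) / dz) / (L / dt + R / 2)
    # voltage update on interior nodes: C dv/dt + G v = -di/dz
    v[1:-1] = (v[1:-1] * (C / dt - G / 2) - (i[1:] - i[:-1]) / dz) / (C / dt + G / 2)
    # boundary conditions: unit-step hard source at the near end, crude open far end
    v[0] = 1.0
    v[-1] = v[-2]

print(f"far-end voltage after {n_steps} steps: {v[-1]:.3f}")
```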

  9. Characterization of decommissioned reactor internals: Monte Carlo analysis technique

    International Nuclear Information System (INIS)

    Reid, B.D.; Love, E.F.; Luksic, A.T.

    1993-03-01

    This study discusses computer analysis techniques for determining activation levels of irradiated reactor component hardware to yield data for the Department of Energy's Greater-Than-Class C Low-Level Radioactive Waste Program. The study recommends the Monte Carlo Neutron/Photon (MCNP) computer code as the best analysis tool for this application and compares the technique to direct sampling methodology. To implement the MCNP analysis, a computer model would be developed to reflect the geometry, material composition, and power history of an existing shutdown reactor. MCNP analysis would then be performed using the computer model, and the results would be validated by comparison to laboratory analysis results from samples taken from the shutdown reactor. The report estimates uncertainties for each step of the computational and laboratory analyses; the overall uncertainty of the MCNP results is projected to be ±35%. The primary source of uncertainty is identified as the material composition of the components, and research is suggested to address that uncertainty

  10. Progress on Radiochemical Analysis for Nuclear Waste Management in Decommissioning

    DEFF Research Database (Denmark)

    Hou, Xiaolin; Qiao, Jixin; Shi, Keliang

    With the increased number of nuclear facilities that have been closed and are being or are going to be decommissioned, it is required to characterise the nuclear waste produced, for its treatment, by identification and determination of the radionuclides. Of the radionuclides related...... separation of radionuclides. In order to improve and maintain the Nordic competence in the analysis of radionuclides in waste samples, an NKS-B project on this topic was launched in 2009. During the first phase of the NKS-B RadWaste project (2009-2010), good achievements were reached on the establishment...... of collaboration, identifying the requirements of the Nordic nuclear industries, and optimizing and developing some analytical methods (Hou et al. NKS-222, 2010). In 2011, this project (NKS-B RadWaste2011) continued. The major achievements of this project in 2011 include: (1) development of a method...

  11. Research progress and hotspot analysis of spatial interpolation

    Science.gov (United States)

    Jia, Li-juan; Zheng, Xin-qi; Miao, Jin-li

    2018-02-01

    In this paper, the literature related to spatial interpolation between 1982 and 2017, as indexed in the Web of Science core database, is used as the data source, and a visualization analysis is carried out based on the co-country network, co-category network, co-citation network and keyword co-occurrence network. It is found that spatial interpolation research has experienced three stages: slow development, steady development and rapid development. There are cross effects between 11 clustering groups, and the main research themes are the convergence of spatial interpolation theory, the practical application and case studies of spatial interpolation, and the accuracy and efficiency of spatial interpolation. Finding the optimal spatial interpolation method is the frontier and hot spot of the research. Spatial interpolation research has formed a theoretical basis and a research system framework, is strongly interdisciplinary, and is widely used in various fields.

  12. Occupational exposure to HDI: progress and challenges in biomarker analysis.

    Science.gov (United States)

    Flack, Sheila L; Ball, Louise M; Nylander-French, Leena A

    2010-10-01

    1,6-Hexamethylene diisocyanate (HDI) is extensively used in the automotive repair industry and is a commonly reported cause of occupational asthma in industrialized populations. However, the exact pathological mechanism remains uncertain. Characterization and quantification of biomarkers resulting from HDI exposure can fill important knowledge gaps between exposure, susceptibility, and the rise of immunological reactions and sensitization leading to asthma. Here, we discuss existing challenges in HDI biomarker analysis, including the quantification of N-acetyl-1,6-hexamethylene diamine (monoacetyl-HDA) and N,N'-diacetyl-1,6-hexamethylene diamine (diacetyl-HDA) in urine samples based on previously established methods for HDA analysis. In addition, we describe the optimization of reaction conditions for the synthesis of monoacetyl-HDA and diacetyl-HDA, and utilize these standards for the quantification of these metabolites in the urine of three occupationally exposed workers. Diacetyl-HDA was present in untreated urine at 0.015-0.060 μg/l. Using base hydrolysis, the concentration range of monoacetyl-HDA in urine was 0.19-2.2 μg/l, 60-fold higher than in the untreated samples on average. HDA was detected only in one sample after base hydrolysis (0.026 μg/l). In contrast, acid hydrolysis yielded HDA concentrations ranging from 0.36 to 10.1 μg/l in these three samples. These findings demonstrate HDI metabolism via the N-acetylation metabolic pathway and protein adduct formation resulting from occupational exposure to HDI. Copyright © 2010 Elsevier B.V. All rights reserved.

  13. Progress and challenges in the development and qualification of multi-level multi-physics coupled methodologies for reactor analysis

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.

    2007-01-01

    Current trends in nuclear power generation and regulation, as well as the design of next generation reactor concepts, along with the continuing progress in computer technology, stimulate the development, qualification and application of multi-physics multi-scale coupled code systems. The efforts have been focused on extending the analysis capabilities by coupling models which simulate different phenomena or system components, as well as on refining the scale and level of detail of the coupling. This paper reviews the progress made in this area and outlines the remaining challenges. The discussion is illustrated with examples based on neutronics/thermal-hydraulics coupling in reactor core modeling. In both fields, recent advances and developments are towards more physics-based high-fidelity simulations, which require the implementation of improved and flexible coupling methodologies. First, the progress in coupling different physics codes, along with the advances in multi-level techniques for coupled code simulations, is discussed. Second, the issues related to the consistent qualification of coupled multi-physics and multi-scale code systems for design and safety evaluation are presented. The increased importance of uncertainty and sensitivity analysis is discussed, along with approaches to propagate the uncertainty quantification between the codes. The upcoming OECD LWR Uncertainty Analysis in Modeling (UAM) benchmark is the first international activity to address this issue, and it is described in the paper. Finally, the remaining challenges with multi-physics coupling are outlined. (authors)

  15. Evaluation of Visual Field Progression in Glaucoma: Quasar Regression Program and Event Analysis.

    Science.gov (United States)

    Díaz-Alemán, Valentín T; González-Hernández, Marta; Perera-Sanz, Daniel; Armas-Domínguez, Karintia

    2016-01-01

    To determine the sensitivity, specificity and agreement between the Quasar program, glaucoma progression analysis (GPA II) event analysis and expert opinion in the detection of glaucomatous progression. The Quasar program is based on linear regression analysis of both mean defect (MD) and pattern standard deviation (PSD). Each series of visual fields was evaluated by three methods: Quasar, GPA II and four experts. The sensitivity, specificity and agreement (kappa) for each method were calculated, using expert opinion as the reference standard. The study included 439 SITA Standard visual fields of 56 eyes of 42 patients, with a mean of 7.8 ± 0.8 visual fields per eye. When suspected cases of progression were considered stable, the sensitivity and specificity of Quasar, GPA II and the experts were 86.6% and 70.7%, 26.6% and 95.1%, and 86.6% and 92.6%, respectively. When suspected cases of progression were considered as progressing, the sensitivity and specificity of Quasar, GPA II and the experts were 79.1% and 81.2%, 45.8% and 90.6%, and 85.4% and 90.6%, respectively. The agreement between Quasar and GPA II when suspected cases were considered stable or progressing was 0.03 and 0.28, respectively. The agreement between Quasar and the experts when suspected cases were considered stable or progressing was 0.472 and 0.507. The agreement between GPA II and the experts when suspected cases were considered stable or progressing was 0.262 and 0.342. The combination of MD and PSD regression analysis in the Quasar program showed better agreement with the experts and higher sensitivity than GPA II.
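
    The sketch below is a rough illustration of the trend-based idea behind the Quasar program (not the program itself): ordinary least-squares lines are fitted to a hypothetical series of mean defect (MD) and pattern standard deviation (PSD) values, and progression is flagged when the MD trend is significantly negative or the PSD trend significantly positive. The field values, the significance threshold and the decision rule are all illustrative assumptions.

```python
import numpy as np
from scipy import stats

def flag_progression(years, md, psd, alpha=0.05):
    """Fit linear trends to MD and PSD series and flag suspected progression.

    The rule used here (worsening MD or rising PSD with p < alpha) is
    illustrative only; it is not the published Quasar criterion.
    """
    md_fit = stats.linregress(years, md)
    psd_fit = stats.linregress(years, psd)
    md_progressing = md_fit.slope < 0 and md_fit.pvalue < alpha
    psd_progressing = psd_fit.slope > 0 and psd_fit.pvalue < alpha
    return md_progressing or psd_progressing, md_fit, psd_fit

# Hypothetical series of 8 annual visual fields for one eye (values in dB).
years = np.arange(8)
md = np.array([-2.1, -2.4, -2.2, -3.0, -3.4, -3.3, -3.9, -4.2])
psd = np.array([2.0, 2.2, 2.1, 2.6, 2.9, 3.0, 3.4, 3.6])

progressing, md_fit, psd_fit = flag_progression(years, md, psd)
print(f"MD slope {md_fit.slope:.2f} dB/yr (p={md_fit.pvalue:.3f}), "
      f"PSD slope {psd_fit.slope:.2f} dB/yr (p={psd_fit.pvalue:.3f}), "
      f"progression suspected: {progressing}")
```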

  16. The analysis of composite laminated beams using a 2D interpolating meshless technique

    Science.gov (United States)

    Sadek, S. H. M.; Belinha, J.; Parente, M. P. L.; Natal Jorge, R. M.; de Sá, J. M. A. César; Ferreira, A. J. M.

    2018-02-01

    Laminated composite materials are widely implemented in several engineering constructions. Owing to their relatively light weight, these materials are suitable for aerospace, military, marine, and automotive structural applications. To obtain safe and economical structures, the accuracy of the modelling analysis is highly relevant. Since meshless methods have achieved remarkable progress in computational mechanics in recent years, the present work uses one of the most flexible and stable interpolating meshless techniques available in the literature—the Radial Point Interpolation Method (RPIM). Here, a 2D approach is considered to numerically analyse composite laminated beams. Both the meshless formulation and the equilibrium equations ruling the studied physical phenomenon are presented in detail. Several benchmark beam examples are studied, and the results are compared with exact solutions available in the literature and with results obtained from commercial finite element software. The results show the efficiency and accuracy of the proposed numerical technique.
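
    The RPIM formulation itself is not reproduced here; the sketch below only illustrates the ingredient it shares with other radial-basis approaches, namely interpolating a field known at scattered nodes with radial basis functions, using SciPy's general-purpose RBFInterpolator. The nodal layout and the sampled "displacement" field are invented for illustration.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Scattered nodes over a 2D beam-like domain (length 10, height 1).
rng = np.random.default_rng(0)
nodes = rng.uniform([0.0, 0.0], [10.0, 1.0], size=(200, 2))

# A smooth "displacement" field sampled at the nodes (illustrative only).
u_nodes = 0.01 * nodes[:, 0] ** 2 * (1.0 - nodes[:, 1])

# Build a radial point interpolant (thin-plate spline kernel here).
interp = RBFInterpolator(nodes, u_nodes, kernel="thin_plate_spline")

# Evaluate the interpolated field on a regular grid of query points.
xq, yq = np.meshgrid(np.linspace(0, 10, 21), np.linspace(0, 1, 5))
queries = np.column_stack([xq.ravel(), yq.ravel()])
u_grid = interp(queries).reshape(xq.shape)
print(u_grid.shape)  # (5, 21)
```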

  17. Investigation of energy management strategies for photovoltaic systems - An analysis technique

    Science.gov (United States)

    Cull, R. C.; Eltimsahy, A. H.

    1982-01-01

    Progress is reported in formulating energy management strategies for stand-alone PV systems, developing an analytical tool that can be used to investigate these strategies, applying this tool to determine the proper control algorithms and control variables (controller inputs and outputs) for a range of applications, and quantifying the relative performance and economics compared with systems that do not apply energy management. The analysis technique developed may be broadly applied to a variety of systems to determine the most appropriate energy management strategies, control variables and algorithms. The only inputs required are statistical distributions for the stochastic energy inputs and outputs of the system and the system's device characteristics (efficiency and ratings). Although the formulation was originally driven by the needs of stand-alone PV systems, the techniques are also applicable to hybrid and grid-connected systems.
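
    The analytical tool described above is not publicly documented; the following sketch only illustrates the kind of simulation it implies, sampling stochastic PV generation and load from assumed distributions and evaluating one simple battery dispatch rule as a stand-in for an energy management strategy. All distributions, ratings and efficiencies are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
hours = 24 * 30                                          # one month, hourly steps
pv = np.clip(rng.normal(1.5, 1.0, hours), 0, None)      # kWh generated per hour
load = np.clip(rng.normal(1.2, 0.4, hours), 0.1, None)  # kWh demanded per hour

capacity, eff = 10.0, 0.9     # battery rating (kWh) and charging efficiency
soc, unmet, spilled = 0.5 * capacity, 0.0, 0.0

for g, d in zip(pv, load):
    net = g - d
    if net >= 0:                       # surplus: charge the battery
        charge = min(net * eff, capacity - soc)
        soc += charge
        spilled += net - charge / eff  # energy that could not be stored
    else:                              # deficit: discharge the battery
        discharge = min(-net, soc)
        soc -= discharge
        unmet += -net - discharge      # demand not covered by PV or storage

print(f"unmet load: {unmet:.1f} kWh, spilled energy: {spilled:.1f} kWh")
```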

  18. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Lubna Moin

    2009-04-01

    Full Text Available This research paper explores and compares different modeling and analysis techniques, and then examines the model order reduction approach and its significance. The traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, but the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. They are also used for analyzing linear and non-linear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a unique technique of generating the genetic design from the tree-structured transfer function obtained from the Bond Graph. This research work combines bond graphs for model representation with genetic programming for exploring different ideas in the design space; the tree-structured transfer function results from replacing typical bond graph elements with their impedance equivalents, specifying impedance laws for the Bond Graph multiport. The tree-structured form thus obtained from the Bond Graph is applied for generating the genetic tree. Application studies will identify key issues and their importance for advancing this approach towards becoming an effective and efficient design tool for synthesizing designs for electrical systems. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function obtained with the conventional and Bond Graph methods are analyzed, and then an approach towards model order reduction is pursued. The suggested algorithm and other known modern model order reduction techniques are applied, with different approaches, to an 11th order high pass filter [1]. The model order reduction technique developed in this paper has the least reduction errors and, in addition, the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and by the Bond Graph methods are compared.
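
    The paper's genetic-programming-based reduction algorithm is not available here; as a point of comparison with the "other known modern model order reduction techniques" it mentions, the sketch below implements classical balanced truncation for a stable state-space model with SciPy. The 4th-order example system is arbitrary and is not the 11th-order high pass filter from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, cholesky, svd

def balanced_truncation(A, B, C, r):
    """Reduce a stable LTI system (A, B, C) to order r by balanced truncation."""
    # Controllability and observability Gramians from Lyapunov equations.
    P = solve_continuous_lyapunov(A, -B @ B.T)
    Q = solve_continuous_lyapunov(A.T, -C.T @ C)
    Lp = cholesky(P, lower=True)
    Lq = cholesky(Q, lower=True)
    U, s, Vt = svd(Lq.T @ Lp)              # Hankel singular values in s
    S_inv_sqrt = np.diag(1.0 / np.sqrt(s))
    T = Lp @ Vt.T @ S_inv_sqrt             # balancing transformation
    Tinv = S_inv_sqrt @ U.T @ Lq.T
    Ab, Bb, Cb = Tinv @ A @ T, Tinv @ B, C @ T
    return Ab[:r, :r], Bb[:r, :], Cb[:, :r], s

# Arbitrary stable 4th-order example, reduced to 2 states.
A = np.array([[-1.0, 0.5, 0.0, 0.0], [0.0, -2.0, 1.0, 0.0],
              [0.0, 0.0, -3.0, 0.5], [0.0, 0.0, 0.0, -4.0]])
B = np.array([[1.0], [0.5], [0.2], [0.1]])
C = np.array([[1.0, 0.0, 1.0, 0.0]])
Ar, Br, Cr, hsv = balanced_truncation(A, B, C, r=2)
print("Hankel singular values:", np.round(hsv, 4))
```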

  19. Recent progress in the analysis of iced airfoils and wings

    Science.gov (United States)

    Cebeci, Tuncer; Chen, Hsun H.; Kaups, Kalle; Schimke, Sue

    1992-01-01

    Recent work on the analysis of iced airfoils and wings is described. Ice shapes for multielement airfoils and wings are computed using an extension of the LEWICE code that was developed for single airfoils. The aerodynamic properties of the iced wing are determined with an interactive scheme in which the solutions of the inviscid flow equations are obtained from a panel method and the solutions of the viscous flow equations are obtained from an inverse three-dimensional finite-difference boundary-layer method. A new interaction law is used to couple the inviscid and viscous flow solutions. The newly developed LEWICE multielement code is applied to a high-lift configuration to calculate the ice shapes on the slat and on the main airfoil, and to a four-element airfoil. The application of the LEWICE wing code to the calculation of ice shapes on a MS-317 swept wing shows good agreement with measurements. The interactive boundary-layer method is applied to a tapered iced wing in order to study the effect of icing on the aerodynamic properties of the wing at several angles of attack.

  20. Protein Analysis in Human Cerebrospinal Fluid: Physiological Aspects, Current Progress and Future Challenges

    Directory of Open Access Journals (Sweden)

    Andreas F. Hühmer

    2006-01-01

    Full Text Available The introduction of lumbar puncture into clinical medicine over 100 years ago marks the beginning of the study of central nervous system diseases using human cerebrospinal fluid (CSF). Ever since, CSF has been analyzed extensively to elucidate the physiological and biochemical bases of neurological disease. The proximity of CSF to the brain makes it a good target for studying the pathophysiology of brain functions, but the barrier function of the CSF also impedes its diagnostic value. Today, measurements to determine alterations in the composition of CSF are central in the differential diagnosis of specific diseases of the central nervous system (CNS). In particular, the analysis of the CSF protein composition provides crucial information in the diagnosis of CNS diseases, enabling the assessment of the physiology of the blood-CSF barrier and of the immunology of intrathecal responses. Besides those routine measurements, protein compositional studies of CSF have recently been extended to many other proteins, in the expectation that comprehensive analysis of lower-abundance CSF proteins will lead to the discovery of new disease markers. Disease marker discovery by molecular profiling of CSF has the enormous potential of providing many new disease-relevant molecules. New developments in protein profiling techniques hold promise for the discovery and validation of relevant disease markers. In this review, we summarize the current efforts and progress in CSF protein profiling measurements using conventional and current protein analysis tools. We also discuss necessary developments in methodology in order to have the highest impact on the study of the molecular composition of CSF proteins.

  1. Different techniques of multispectral data analysis for vegetation fraction retrieval

    Science.gov (United States)

    Kancheva, Rumiana; Georgiev, Georgi

    2012-07-01

    Vegetation monitoring is one of the most important applications of remote sensing technologies. With respect to farmlands, the assessment of crop condition constitutes the basis for monitoring growth, development, and yield processes. Plant condition is defined by a set of biometric variables, such as density, height, biomass amount, leaf area index, etc. The canopy cover fraction is closely related to these variables and is indicative of the state of the growth process. At the same time it is a defining factor of the spectral signatures of the soil-vegetation system. That is why spectral mixture decomposition is a primary objective in remotely sensed data processing and interpretation, specifically in agricultural applications. The actual usefulness of the applied methods depends on their prediction reliability. The goal of this paper is to present and compare different techniques for quantitative endmember extraction from the reflectance of soil-crop patterns. These techniques include: linear spectral unmixing, two-dimensional spectra analysis, spectral ratio analysis (vegetation indices), spectral derivative analysis (red edge position), and colorimetric analysis (tristimulus values sum, chromaticity coordinates and dominant wavelength). The objective is to reveal their potential, accuracy and robustness for plant fraction estimation from multispectral data. Regression relationships have been established between crop canopy cover and various spectral estimators.
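
    As a concrete sketch of the first technique listed (linear spectral unmixing), the code below decomposes a mixed-pixel reflectance into soil and vegetation fractions with non-negative least squares, enforcing the sum-to-one condition only softly through an appended equation. The endmember spectra, the pixel spectrum and the constraint weight are invented numbers, not data from the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical endmember reflectance spectra over 5 spectral bands.
soil = np.array([0.20, 0.25, 0.30, 0.35, 0.38])
vegetation = np.array([0.05, 0.08, 0.04, 0.45, 0.50])
E = np.column_stack([soil, vegetation])          # endmember matrix (bands x 2)

# Observed mixed-pixel spectrum: 60 % vegetation, 40 % soil plus noise.
rng = np.random.default_rng(1)
pixel = 0.4 * soil + 0.6 * vegetation + rng.normal(0, 0.005, soil.size)

# Non-negative least squares with a soft sum-to-one constraint appended.
w = 10.0                                          # weight of the constraint row
A = np.vstack([E, w * np.ones((1, 2))])
b = np.concatenate([pixel, [w]])
fractions, residual = nnls(A, b)
print("soil / vegetation fractions:", np.round(fractions, 3))
```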

  2. 1985. Annual progress report

    International Nuclear Information System (INIS)

    1986-01-01

    This annual progress report of the CEA Protection and Nuclear Safety Institute outlines the progress made in each section of the Institute. Research activities of the different departments include: reactor safety analysis; fuel cycle facilities analysis and the associated safety research programs (criticality, sites, transport, etc.); radioecology and environmental radioprotection techniques; data acquisition on radioactive waste storage sites; radiation effects on man and studies on radioprotection techniques; nuclear material security, including security of facilities, security of nuclear material transport, and monitoring of nuclear material management; nuclear facility decommissioning; and, finally, public information [fr

  3. Gas chromatographic isolation technique for compound-specific radiocarbon analysis

    International Nuclear Information System (INIS)

    Uchida, M.; Kumamoto, Y.; Shibata, Y.; Yoneda, M.; Morita, M.; Kawamura, K.

    2002-01-01

    Full text: We present here a gas chromatographic isolation technique for the compound-specific radiocarbon analysis of biomarkers from marine sediments. The biomarkers of fatty acids, hydrocarbons and sterols were isolated in amounts sufficient for radiocarbon analysis using a preparative capillary gas chromatograph (PCGC) system. The PCGC system used here is composed of an HP 6890 GC with FID, a cooled injection system (CIS, Gerstel, Germany), a zero-dead-volume effluent splitter, and a cryogenic preparative collection device (PFC, Gerstel). For AMS analysis, we need to separate and recover a sufficient quantity of each target individual compound (>50 μgC). Yields of target compounds, from C14 to C40 n-alkanes, were sufficient for AMS analysis, with approximately 80% recovery for higher molecular weight compounds above the C30 n-alkanes. Compound-specific radiocarbon analysis of organic compounds, as well as compound-specific stable isotope analysis, provides valuable information on the origins and carbon cycling in marine systems. Under the above PCGC conditions, we applied compound-specific radiocarbon analysis to marine sediments from the western North Pacific; the results showed its potential as a chronology tool for estimating the age of sediment from organic matter in paleoceanographic studies, in areas where sufficient amounts of planktonic foraminifera for radiocarbon analysis by accelerator mass spectrometry (AMS) are difficult to obtain due to dissolution of calcium carbonate. (author)

  4. Image Analysis Technique for Material Behavior Evaluation in Civil Structures

    Science.gov (United States)

    Moretti, Michele; Rossi, Gianluca

    2017-01-01

    The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers that can be removed without damage were used on the surface of the structures, such as the historical masonry. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used in any type of structure but is particularly suitable when it is necessary not to damage the surface of the structure. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129

  5. Fault tree technique: advances in probabilistic and logical analysis

    International Nuclear Information System (INIS)

    Clarotti, C.A.; Amendola, A.; Contini, S.; Squellati, G.

    1982-01-01

    Fault tree reliability analysis is used for assessing the risk associated with systems of increasing complexity (phased mission systems, systems with multistate components, systems with non-monotonic structure functions). Much care must be taken to make sure that the fault tree technique is not used beyond its correct validity range. To this end a critical review of the mathematical foundations of reliability fault tree analysis is carried out. Limitations are highlighted and potential solutions to open problems are suggested. Moreover, an overview is given of the most recent developments in the implementation of an integrated software package (the SALP-MP, SALP-NOT and SALP-CAFT codes) for the analysis of a wide class of systems
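
    The SALP codes are not described in enough detail here to reproduce; the sketch below only illustrates the elementary probabilistic step that fault tree analysis rests on: estimating the top-event probability from minimal cut sets of independent basic events, by the rare-event approximation and by the full inclusion-exclusion expansion. The cut sets and failure probabilities are invented.

```python
from itertools import combinations

# Hypothetical basic-event failure probabilities.
p = {"pump_A": 1e-2, "pump_B": 2e-2, "valve": 5e-3, "power": 1e-3}

# Hypothetical minimal cut sets of the fault tree (any one set failing
# in full causes the top event).
cut_sets = [{"pump_A", "pump_B"}, {"valve"}, {"pump_A", "power"}]

def cut_set_prob(events):
    prob = 1.0
    for event in events:
        prob *= p[event]
    return prob

# Rare-event approximation: sum of the cut-set probabilities.
rare_event = sum(cut_set_prob(cs) for cs in cut_sets)

# Exact top-event probability by inclusion-exclusion (independent events).
exact = 0.0
for k in range(1, len(cut_sets) + 1):
    for combo in combinations(cut_sets, k):
        union = set().union(*combo)     # basic events in the union of k cut sets
        exact += (-1) ** (k + 1) * cut_set_prob(union)

print(f"rare-event approximation: {rare_event:.3e}, inclusion-exclusion: {exact:.3e}")
```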

  6. Temperature analysis of laser ignited metalized material using spectroscopic technique

    Science.gov (United States)

    Bassi, Ishaan; Sharma, Pallavi; Daipuriya, Ritu; Singh, Manpreet

    2018-05-01

    The temperature measurement of a laser-ignited aluminized nano-energetic mixture using spectroscopy has great scope in analysing material characteristics and combustion behaviour. Spectroscopic analysis enables an in-depth study of the combustion of materials, which is difficult with standard pyrometric methods. Laser ignition was used because it consumes less energy than electric ignition, while the ignited material dissipates the same energy, and with the same impact, as with electric ignition. The presented research is primarily focused on the temperature analysis of an energetic material comprising explosive material mixed with nano-material and ignited with the help of a laser. A spectroscopic technique is used here to estimate the temperature during the ignition process. The nano-energetic mixture used in the research does not contain any material that is sensitive to high impact.
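
    The instrumentation details of the study are not given in the abstract. One standard way to estimate temperature from an emission spectrum is to fit a scaled Planck (grey-body) curve to the measured intensities; the sketch below does this with SciPy on a synthetic spectrum, and the wavelength range, noise level and scale factor are illustrative assumptions rather than the authors' procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

H, C, KB = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann

def grey_body(wavelength_m, temperature, scale):
    """Scaled Planck spectral radiance; scale (in units of 1e-12 here to keep
    the fit well conditioned) absorbs emissivity, geometry and detector gain."""
    return scale * 1e-12 * (2 * H * C**2 / wavelength_m**5) / (
        np.exp(H * C / (wavelength_m * KB * temperature)) - 1.0)

# Synthetic emission spectrum of a 2500 K source between 400 and 900 nm.
wl = np.linspace(400e-9, 900e-9, 60)
rng = np.random.default_rng(3)
measured = grey_body(wl, 2500.0, 1.0) * (1 + rng.normal(0, 0.02, wl.size))

# Fit the temperature and the overall scale factor to the spectrum.
popt, _ = curve_fit(grey_body, wl, measured, p0=[2000.0, 1.0])
print(f"estimated temperature: {popt[0]:.0f} K")
```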

  7. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques considered can be grouped in two: those which assume independence between the variables and thus use a diagonal estimate of the within-class covariance matrix, and those which assume dependence between the variables and thus use an estimate of the within-class covariance matrix which also estimates the correlations between variables. The two groups of methods are compared and the pros and cons are exemplified using different cases of simulated data. The results illustrate that the estimate of the covariance matrix is an important factor with respect to choice of method, and the choice of method should thus be driven by the nature of the data.
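
    As a hedged illustration of the two groups of methods the paper contrasts, the sketch below compares a diagonal-covariance classifier (Gaussian naive Bayes, i.e. an independence assumption) with a shrinkage LDA that estimates between-variable correlations, on simulated high-dimensional data with correlated features. The data-generating settings are arbitrary, and the two classifiers are common stand-ins rather than the specific sparse methods compared in the paper.

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n, p, rho = 100, 200, 0.6                              # few samples, many variables
cov = rho * np.ones((p, p)) + (1 - rho) * np.eye(p)    # equicorrelated features
L = np.linalg.cholesky(cov)

X0 = rng.standard_normal((n // 2, p)) @ L.T            # class 0
X1 = rng.standard_normal((n // 2, p)) @ L.T + 0.25     # class 1, shifted mean
X = np.vstack([X0, X1])
y = np.r_[np.zeros(n // 2), np.ones(n // 2)]

diag_model = GaussianNB()                              # independence assumption
full_model = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")

for name, model in [("diagonal (naive Bayes)", diag_model),
                    ("shrinkage LDA", full_model)]:
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: cross-validated accuracy {acc:.2f}")
```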

  8. Some problems of calibration technique in charged particle activation analysis

    International Nuclear Information System (INIS)

    Krasnov, N.N.; Zatolokin, B.V.; Konstantinov, I.O.

    1977-01-01

    It is shown that three different approaches to the calibration technique, based on the use of the average cross-section, the equivalent target thickness and the thick target yield, are adequate. Using the concept of thick target yield, a convenient charged particle activation equation is obtained. The possibility of simultaneous determination of two impurities, from which the same isotope is formed, is pointed out. The use of the thick target yield concept facilitates the derivation of a simple formula for absolute and comparative methods of analysis. The methodological error does not exceed 10%. The calibration technique and the determination of expected sensitivity based on the thick target yield concept are also very convenient, because experimental determination of thick target yield values is a much simpler procedure than obtaining an activation curve or excitation function. (T.G.)

  9. Ion beam analysis and spectrometry techniques for Cultural Heritage studies

    International Nuclear Information System (INIS)

    Beck, L.

    2013-01-01

    The implementation of experimental techniques for the characterisation of cultural heritage materials has to take into account some specific requirements. The complexity of these past materials requires the development of new techniques of examination and analysis, or the transfer of technologies developed for the study of advanced materials. In addition, due to the precious nature of artworks, it is also necessary to use non-destructive methods, respecting the integrity of the objects. It is for this reason that methods using radiation and/or particles have played an important role in the scientific study of art history and archaeology since their discovery. X-ray and γ-ray spectrometry as well as ion beam analysis (IBA) are analytical tools at the service of cultural heritage. This report mainly presents experimental developments for IBA: PIXE, RBS/EBS and NRA. These developments were applied to the study of archaeological composite materials: layered materials or mixtures composed of organic and non-organic phases. Three examples are shown: the evolution of silvering techniques for the production of counterfeit coinage during the Roman Empire and in the 16th century, and the characterization of composite or mixed mineral/organic compounds such as bone and paint. In these last two cases, the combination of techniques gave original results on the proportion of both phases: apatite/collagen in bone, pigment/binder in paintings. Another part of this report is dedicated to the non-invasive/non-destructive characterization of prehistoric pigments, in situ, for rock art studies in caves and in the laboratory. Finally, the perspectives of this work are presented. (author) [fr

  10. Development of flow injection analysis technique for uranium estimation

    International Nuclear Information System (INIS)

    Paranjape, A.H.; Pandit, S.S.; Shinde, S.S.; Ramanujam, A.; Dhumwad, R.K.

    1991-01-01

    Flow injection analysis is increasingly used as a process control analytical technique in many industries. It involves injection of the sample at a constant rate into a steadily flowing stream of reagent and passing this mixture through a suitable detector. This paper describes the development of such a system for the analysis of uranium (VI) and (IV) and its gross gamma activity. It is amenable to on-line or automated off-line monitoring of uranium and its activity in process streams. The sample injection port is suitable for automated injection of radioactive samples. The performance of the system has been tested for the colorimetric response of U(VI) samples at 410 nm in the range of 35 to 360 mg/ml in nitric acid medium, using a Metrohm 662 photometer and a recorder as the detector assembly. The precision of the method is found to be better than +/- 0.5%. This technique, with certain modifications, is used for the analysis of U(VI) in the range 0.1-3 mg/aliquot by the alcoholic thiocyanate procedure within +/- 1.5% precision. Similarly, the precision for the determination of U(IV) in the range 15-120 mg at 650 nm is found to be better than 5%. With a NaI well-type detector in the flow line, the gross gamma counting of the solution under flow is found to be within a precision of +/- 5%. (author). 4 refs., 2 figs., 1 tab

  11. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects are discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creation of a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging can complement the results of high-energy-based techniques.

  12. Symbolic manipulation techniques for vibration analysis of laminated elliptic plates

    Science.gov (United States)

    Andersen, C. M.; Noor, A. K.

    1977-01-01

    A computational scheme is presented for the free vibration analysis of laminated composite elliptic plates. The scheme is based on Hamilton's principle, the Rayleigh-Ritz technique and symmetry considerations, and is implemented with the aid of the MACSYMA symbolic manipulation system. The MACSYMA system, through differentiation, integration, and simplification of analytic expressions, produces highly efficient FORTRAN code for the evaluation of the stiffness and mass coefficients. Multiple use is made of this code to obtain not only the frequencies and mode shapes of the plate, but also the derivatives of the frequencies with respect to various material and geometric parameters.
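
    MACSYMA is not readily available today, but the same workflow (symbolic integration of shape-function energy expressions followed by automatic generation of Fortran code for the stiffness and mass coefficients) can be sketched with SymPy. The one-term, one-dimensional Ritz approximation below is purely illustrative and is not the plate formulation used in the paper.

```python
from sympy import symbols, integrate, diff, fcode

x, a, D, rho, h = symbols("x a D rho h", positive=True)

# One-term Ritz shape function for a clamped strip of half-width a (illustrative).
w = (1 - (x / a) ** 2) ** 2

# Stiffness and mass coefficients from energy integrals (1D analogue of the
# Rayleigh-Ritz procedure used for the laminated elliptic plate).
k11 = integrate(D * diff(w, x, 2) ** 2, (x, -a, a))
m11 = integrate(rho * h * w ** 2, (x, -a, a))

# Emit Fortran statements for the coefficients, mimicking the MACSYMA step.
print(fcode(k11, assign_to="K11", source_format="free"))
print(fcode(m11, assign_to="M11", source_format="free"))
```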

  13. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  14. The application of radiotracer technique for preconcentration neutron activation analysis

    International Nuclear Information System (INIS)

    Wang Xiaolin; Chen Yinliang; Sun Ying; Fu Yibei

    1995-01-01

    The application of the radiotracer technique to preconcentration neutron activation analysis (Pre-NAA) is studied, and a method for the determination of the chemical yield in Pre-NAA is developed. This method has been applied to the determination of gold, iridium and rhenium in steel and rock samples, with noble metal contents in the range of 1-20 ng·g⁻¹ (sample). In addition, the difference in accuracy between RNAA and Pre-NAA caused by the determination of the chemical yield is also discussed.

  15. Nonactivation interaction techniques in the analysis of environmental samples

    International Nuclear Information System (INIS)

    Tolgyessy, J.

    1986-01-01

    Nonactivation interaction analytical methods are based on the interaction processes of nuclear and X-ray radiation with a sample, leading to their absorption and backscattering, to the ionization of gases, or to the excitation of fluorescent X-rays, but not to the activation of the determined elements. From the point of view of environmental analysis, the most useful nonactivation interaction techniques are X-ray fluorescence by photon or charged particle excitation, ionization of gases by nuclear radiation, elastic scattering of charged particles and backscattering of beta radiation. The significant advantage of these methods is that they are nondestructive. (author)

  16. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    International Nuclear Information System (INIS)

    Lindstrom, D.J.; Lindstrom, R.M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately to the ground state with the emission of energetic gamma rays. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably

  17. Performance of confocal scanning laser tomograph Topographic Change Analysis (TCA) for assessing glaucomatous progression.

    Science.gov (United States)

    Bowd, Christopher; Balasubramanian, Madhusudhanan; Weinreb, Robert N; Vizzeri, Gianmarco; Alencar, Luciana M; O'Leary, Neil; Sample, Pamela A; Zangwill, Linda M

    2009-02-01

    To determine the sensitivity and specificity of confocal scanning laser ophthalmoscope's Topographic Change Analysis (TCA; Heidelberg Retina Tomograph [HRT]; Heidelberg Engineering, Heidelberg, Germany) parameters for discriminating between progressing glaucomatous and stable healthy eyes. The 0.90, 0.95, and 0.99 specificity cutoffs for various (n=70) TCA parameters were developed by using 1000 permuted topographic series derived from HRT images of 18 healthy eyes from Moorfields Eye Hospital, imaged at least four times. The cutoffs were then applied to topographic series from 36 eyes with known glaucomatous progression (by optic disc stereophotograph assessment and/or standard automated perimetry guided progression analysis [GPA]) and 21 healthy eyes from the University of California, San Diego (UCSD) Diagnostic Innovations in Glaucoma Study (DIGS), all imaged at least four times, to determine TCA sensitivity and specificity. Cutoffs also were applied to 210 DIGS patients' eyes imaged at least four times with no evidence of progression (nonprogressed) by stereophotography or GPA. The TCA parameter providing the best sensitivity/specificity tradeoff using the 0.90, 0.95, and 0.99 cutoffs was the largest clustered superpixel area within the optic disc margin (CAREAdisc, mm²). Sensitivities/specificities for classifying progressing (by stereophotography and/or GPA) and healthy eyes were 0.778/0.809, 0.639/0.857, and 0.611/1.00, respectively. In nonprogressing eyes, specificities were 0.464, 0.570, and 0.647 (i.e., lower than in the healthy eyes). In addition, TCA parameter measurements of nonprogressing eyes were similar to those of progressing eyes. TCA parameters can discriminate between progressing and longitudinally observed healthy eyes. Low specificity in apparently nonprogressing patients' eyes suggests early progression detection using TCA.

  18. Longitudinal analysis of progression in glaucoma using spectral-domain optical coherence tomography.

    Science.gov (United States)

    Wessel, Julia M; Horn, Folkert K; Tornow, Ralf P; Schmid, Matthias; Mardin, Christian Y; Kruse, Friedrich E; Juenemann, Anselm G; Laemmer, Robert

    2013-05-01

    To compare the longitudinal loss of RNFL thickness measured by SD-OCT in healthy individuals and glaucoma patients with or without progression of optic disc morphology. A total of 62 eyes, comprising 38 glaucomatous eyes with open angle glaucoma and 24 healthy controls, were included in the study (Erlangen Glaucoma Registry, NTC00494923). All patients were investigated annually over a period of 3 years by Spectralis SD-OCT measuring peripapillary RNFL thickness. By masked comparative analysis of photographs, the eyes were classified into nonprogressive and progressive glaucoma cases. Longitudinal loss of RNFL thickness was compared with changes in optic disc morphology. Mixed model analysis of annual OCT scans revealed an estimated annual decrease of the RNFL thickness by 2.12 μm in glaucoma eyes with progression, whereas glaucoma eyes without progression in optic disc morphology lost 1.18 μm per year in RNFL thickness (P = 0.002). The rate of change in healthy eyes was 0.60 μm and thereby also significantly lower than in glaucoma eyes with progression (P < 0.001). The intrasession variability of three successive measurements without head repositioning was 1.5 ± 0.7 μm. The loss of mean RNFL thickness exceeded the intrasession variability in 60% of nonprogressive eyes, and in 85% of progressive eyes after 3 years. Longitudinal measurements of RNFL thickness using SD-OCT show a more pronounced reduction of RNFL thickness in patients with progression compared with patients without progression in glaucomatous optic disc changes. (www.clinicaltrials.gov number, NTC00494923.)
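
    The exact mixed-model specification is not given in the abstract; the sketch below shows one plausible random-intercept model in statsmodels that estimates group-specific annual RNFL slopes, fitted to a small synthetic data set. The column names, group sizes and assumed slopes are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
rows = []
for eye in range(30):
    group = "progressing" if eye < 15 else "stable"
    slope = -2.1 if group == "progressing" else -1.2      # assumed um/year
    baseline = rng.normal(85, 8)
    for year in range(4):                                  # annual visits
        rnfl = baseline + slope * year + rng.normal(0, 1.5)
        rows.append({"eye": eye, "group": group, "year": year, "rnfl": rnfl})
df = pd.DataFrame(rows)

# Random intercept per eye; fixed effects give group-specific annual slopes.
model = smf.mixedlm("rnfl ~ year * group", df, groups=df["eye"])
result = model.fit()
print(result.summary())
```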

  19. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods. Thus, EDXRF spectrometry is applicable to food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological function. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF to food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison of methods was performed to cross-check the analysis results and to overcome the limitations of the three methods. The results showed that Ca contents in food determined by EDXRF and AAS were not significantly different (p-value 0.9687), whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between the results was also examined; the Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative for the determination of Ca and K in food samples.
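
    The statistical comparison described (p-values between paired methods and Pearson correlations) can be sketched as below with SciPy on invented paired concentration values; the numbers are placeholders, not the study's measurements.

```python
import numpy as np
from scipy.stats import ttest_rel, pearsonr

# Hypothetical Ca concentrations (mg/100 g) of the same food samples
# measured by EDXRF and by AAS.
edxrf = np.array([52.1, 88.4, 120.3, 33.7, 64.9, 95.2, 47.8, 150.6])
aas = np.array([51.5, 90.1, 118.9, 34.2, 66.0, 93.8, 48.5, 152.0])

t_stat, p_value = ttest_rel(edxrf, aas)      # paired comparison of the methods
r, _ = pearsonr(edxrf, aas)                  # agreement between the methods
print(f"paired t-test p-value: {p_value:.3f}, Pearson r: {r:.4f}")
```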

  20. Measuring caloric response: comparison of different analysis techniques.

    Science.gov (United States)

    Mallinson, A I; Longridge, N S; Pace-Asciak, P; Ngo, R

    2010-01-01

    Electronystagmography (ENG) testing has been supplanted by newer techniques of measuring eye movement with infrared cameras (VNG). Most techniques for quantifying caloric-induced nystagmus measure the slow phase velocity in some manner. Although our analysis is carried out by very experienced assessors, some systems have computer algorithms that have been "taught" to locate and quantify maximum responses. We wondered what differences in measurement might show up when measuring calorics using different techniques and systems; the relevance of this is that if there were a change in slow phase velocity between ENG and VNG testing when measuring caloric response, then normative data would have to be changed. There are also some subjective but important aspects of ENG interpretation which comment on the nature of the response (e.g. responses which might be "sporadic" or "scant"). Our experiment compared caloric responses in 100 patients analyzed four different ways. Each caloric was analyzed by our old ENG system, our new VNG system, an inexperienced assessor and the computer algorithm, and the data were compared. All four systems made similar measurements, but our inexperienced assessor failed to recognize responses as sporadic or scant; we feel this is a limitation to be kept in mind in the rural setting, as it is an important aspect of assessment in complex patients.

  1. Accident progression event tree analysis for postulated severe accidents at N Reactor

    International Nuclear Information System (INIS)

    Wyss, G.D.; Camp, A.L.; Miller, L.A.; Dingman, S.E.; Kunsman, D.M.; Medford, G.T.

    1990-06-01

    A Level II/III probabilistic risk assessment (PRA) has been performed for N Reactor, a Department of Energy (DOE) production reactor located on the Hanford reservation in Washington. The accident progression analysis documented in this report determines how core damage accidents identified in the Level I PRA progress from fuel damage to confinement response and potential releases to the environment. The objectives of the study are to generate accident progression data for the Level II/III PRA source term model and to identify changes that could improve plant response under accident conditions. The scope of the analysis is comprehensive, excluding only sabotage and operator errors of commission. State-of-the-art methodology is employed, based largely on the methods developed by Sandia for the US Nuclear Regulatory Commission in support of the NUREG-1150 study. The accident progression model allows complex interactions and dependencies between systems to be explicitly considered. Latin Hypercube sampling was used to assess the phenomenological and systemic uncertainties associated with the primary and confinement system responses to the core damage accident. The results of the analysis show that the N Reactor confinement concept provides significant radiological protection for most of the accident progression pathways studied.
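
    The NUREG-1150-style accident progression event tree itself is far too large to reproduce here; the sketch below only illustrates the Latin Hypercube sampling step used to propagate phenomenological uncertainties, drawing a small design over two hypothetical uncertain parameters with SciPy. The parameter names and ranges are invented.

```python
import numpy as np
from scipy.stats import qmc

# Two hypothetical uncertain inputs: a failure pressure (MPa) and a release
# fraction, each described by a uniform range for illustration.
lower = [0.4, 0.01]
upper = [1.2, 0.20]

sampler = qmc.LatinHypercube(d=2, seed=0)
unit_samples = sampler.random(n=10)                 # 10 LHS samples in [0, 1)^2
samples = qmc.scale(unit_samples, lower, upper)     # map onto the input ranges

# Each row is one input vector for an accident-progression calculation.
for pressure, release in samples:
    print(f"failure pressure {pressure:.2f} MPa, release fraction {release:.3f}")
```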

  2. Mechanisms of subsidence for induced damage and techniques for analysis

    International Nuclear Information System (INIS)

    Drumm, E.C.; Bennett, R.M.; Kane, W.F.

    1988-01-01

    Structural damage due to mining-induced subsidence is a function of the nature of the structure and its position on the subsidence profile. A point on the profile may be in the tensile zone, the compressive zone, or the no-deformation zone at the bottom of the profile. Damage to structures in the tension zone is primarily due to a reduction of support during vertical displacement of the ground surface, and to shear stresses between the soil and the structure resulting from horizontal displacements. The damage mechanisms due to tension can be investigated effectively using a two-dimensional plane stress analysis. Structures in the compression zone are subjected to positive moments in the footing and large compressive horizontal stresses in the foundation walls. A plane strain analysis of the foundation wall is utilized to examine compression zone damage mechanisms. The structural aspects affecting each mechanism are identified and potential mitigation techniques are summarized.

  3. Service Interaction Flow Analysis Technique for Service Personalization

    DEFF Research Database (Denmark)

    Korhonen, Olli; Kinnula, Marianne; Syrjanen, Anna-Liisa

    2017-01-01

    Service interaction flows are difficult to capture, analyze, outline, and represent for research and design purposes. We examine how variation of personalized service flows in technology-mediated service interaction can be modeled and analyzed to provide information on how service personalization could support interaction. We have analyzed service interaction cases in the context of a technology-mediated car rental service. With the analysis technique we propose, inspired by the Interaction Analysis method, we were able to capture and model the situational service interaction. Our contribution regarding technology-mediated service interaction design is twofold: first, with the increased understanding of the role of personalization in managing variation in technology-mediated service interaction, our study contributes to designing service management information systems and human-computer interfaces.

  4. Progress of a Cross-Correlation Based Optical Strain Measurement Technique for Detecting Radial Growth on a Rotating Disk

    Science.gov (United States)

    Clem, Michelle M.; Abdul-Aziz, Ali; Woike, Mark R.; Fralick, Gustave C.

    2015-01-01

    The modern turbine engine operates in a harsh environment at high speeds and is repeatedly exposed to combined high mechanical and thermal loads. The cumulative effects of these external forces lead to high stresses and strains on the engine components, such as the rotating turbine disks, which may eventually lead to a catastrophic failure if left undetected. The operating environment makes it difficult to use conventional strain gauges; therefore, non-contact strain measurement techniques are of interest to NASA and the turbine engine community. This presentation describes one such approach: the use of cross-correlation analysis to measure the strain experienced by the engine turbine disk, with the goal of assessing potential faults and damage.
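
    The NASA measurement system is not described at code level in the abstract; the sketch below shows the core cross-correlation idea on synthetic one-dimensional intensity profiles, locating the shift between a reference profile and a displaced one from the peak of their cross-correlation, as a stand-in for detecting radial growth between two images of the disk. The signals and the shift are invented.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

rng = np.random.default_rng(5)
n = 400
x = np.linspace(0, 8 * np.pi, n)
reference = np.sin(x) + 0.5 * np.sin(3.1 * x)         # intensity profile at rest

true_shift = 7                                        # pixels of "radial growth"
displaced = np.roll(reference, true_shift) + rng.normal(0, 0.05, n)

# Cross-correlate the zero-mean signals and locate the peak lag.
corr = correlate(displaced - displaced.mean(), reference - reference.mean(),
                 mode="full")
lags = correlation_lags(n, n, mode="full")
estimated_shift = lags[np.argmax(corr)]
print(f"estimated shift: {estimated_shift} pixels (true: {true_shift})")
```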

  5. Progression Analysis and Stage Discovery in Continuous Physiological Processes Using Image Computing

    Directory of Open Access Journals (Sweden)

    Ferrucci Luigi

    2010-01-01

    Full Text Available We propose an image computing-based method for quantitative analysis of continuous physiological processes that can be sensed by medical imaging and demonstrate its application to the analysis of morphological alterations of the bone structure, which correlate with the progression of osteoarthritis (OA. The purpose of the analysis is to quantitatively estimate OA progression in a fashion that can assist in understanding the pathophysiology of the disease. Ultimately, the texture analysis will be able to provide an alternative OA scoring method, which can potentially reflect the progression of the disease in a more direct fashion compared to the existing clinically utilized classification schemes based on radiology. This method can be useful not just for studying the nature of OA, but also for developing and testing the effect of drugs and treatments. While in this paper we demonstrate the application of the method to osteoarthritis, its generality makes it suitable for the analysis of other progressive clinical conditions that can be diagnosed and prognosed by using medical imaging.

  6. [Applications of spectral analysis technique to monitoring grasshoppers].

    Science.gov (United States)

    Lu, Hui; Han, Jian-guo; Zhang, Lu-da

    2008-12-01

    Grasshopper monitoring is of great significance in protecting the environment and reducing economic losses. However, predicting grasshoppers accurately and effectively has long been a difficult problem. In the present paper, the importance of forecasting grasshoppers and their habitats is expounded, and the development of monitoring grasshopper populations and the common algorithms of spectral analysis techniques are illustrated. Meanwhile, the traditional methods are compared with the spectral technology. Remote sensing has been applied in monitoring the living, growing and breeding habitats of grasshopper populations, and can be used to develop a forecast model combined with GIS. NDVI values can be extracted from the remote sensing data and used in grasshopper forecasting. Hyperspectral remote sensing, which can be used to monitor grasshoppers more precisely, has advantages in measuring the degree of damage and classifying damaged areas, so it can be adopted to monitor the spatial distribution dynamics of rangeland grasshopper populations. Differential smoothing can be used to reflect the relations between hyperspectral characteristic parameters and the leaf area index (LAI), and to indicate the intensity of grasshopper damage. The technology of near infrared reflectance spectroscopy has been employed in judging grasshopper species, examining species occurrences and monitoring hatching places by measuring soil humidity and nutrients, and can be used to investigate and observe grasshoppers in sample research. According to this paper, it is concluded that the spectral analysis technique could be used as a quick and exact tool in monitoring and forecasting the infestation of grasshoppers, and will become an important means in this kind of research for its advantages in spatial positioning, information extraction and processing. With the rapid development of spectral analysis methodology, the goal of sustainable monitoring of grasshoppers can be pursued.

  7. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, i.e. very irregular data, interpolation bias and RMSE increase strongly. We find a 40% lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60%. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate case, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. the Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and is suitable for large-scale application to paleo-data.
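
    As a rough sketch of a kernel-based estimator of the kind discussed (not the authors' exact implementation), the function below estimates the correlation of two irregularly sampled, standardized series at a chosen lag by weighting all observation pairs with a Gaussian kernel on the mismatch between their time difference and that lag. The test series, bandwidth and sampling pattern are synthetic.

```python
import numpy as np

def gaussian_kernel_corr(tx, x, ty, y, lag, h):
    """Kernel-weighted cross-correlation estimate at a given lag.

    tx, ty : irregular sampling times; x, y : observations (standardized inside);
    h      : kernel bandwidth controlling how strictly pairs must match the lag.
    """
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    dt = ty[None, :] - tx[:, None]                  # all pairwise time differences
    w = np.exp(-0.5 * ((dt - lag) / h) ** 2)        # Gaussian weights around the lag
    return np.sum(w * np.outer(x, y)) / np.sum(w)

# Synthetic irregularly sampled series: y lags x by about 5 time units.
rng = np.random.default_rng(11)
tx = np.sort(rng.uniform(0, 100, 150))
ty = np.sort(rng.uniform(0, 100, 130))
x = np.sin(2 * np.pi * tx / 20) + rng.normal(0, 0.2, tx.size)
y = np.sin(2 * np.pi * (ty - 5) / 20) + rng.normal(0, 0.2, ty.size)

for lag in (0.0, 5.0, 10.0):
    est = gaussian_kernel_corr(tx, x, ty, y, lag, h=1.0)
    print(f"lag {lag:4.1f}: correlation estimate {est:.2f}")
```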

  8. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Full Text Available Data mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information which may be essential to an organization; new information is predicted from the existing datasets. Many approaches for analysis and prediction in data mining have been developed, but few efforts have been made in the criminology field, and fewer still have compared the information all these approaches produce. Police stations and other criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and involvement in criminal activity in society. Criminals can also be predicted based on the crime data. The main aim of this work is to perform a survey of the supervised and unsupervised learning techniques that have been applied to criminal identification. This paper presents a survey on crime analysis and crime prediction using several data mining techniques.

  9. Image-analysis techniques for investigating localized corrosion processes

    International Nuclear Information System (INIS)

    Quinn, M.J.; Bailey, M.G.; Ikeda, B.M.; Shoesmith, D.W.

    1993-12-01

    We have developed a procedure for determining the mode and depth of penetration of localized corrosion by combining metallography and image analysis of corroded coupons. Two techniques, involving either a face-profiling or an edge-profiling procedure, have been developed. In the face-profiling procedure, successive surface grindings and image analyses were performed until corrosion was no longer visible. In this manner, the distribution of corroded sites on the surface and the total area of the surface corroded were determined as a function of depth into the specimen. In the edge-profiling procedure, surface grinding exposed successive cross sections of the corroded region. Image analysis of the cross section quantified the distribution of depths across the corroded section, and a three-dimensional distribution of penetration depths was obtained. To develop these procedures, we used artificially creviced Grade-2 titanium specimens that were corroded in saline solutions containing various amounts of chloride maintained at various fixed temperatures (105 to 150 degrees C) using a previously developed galvanic-coupling technique. We discuss some results from these experiments to illustrate how the procedures developed can be applied to a real corroded system. (author). 6 refs., 4 tabs., 21 figs

  10. Hospitals Productivity Measurement Using Data Envelopment Analysis Technique.

    Science.gov (United States)

    Torabipour, Amin; Najarzadeh, Maryam; Arab, Mohammad; Farzianpour, Freshteh; Ghasemzadeh, Roya

    2014-11-01

    This study aimed to measure hospital productivity using the data envelopment analysis (DEA) technique and Malmquist indices. This is a cross-sectional study in which panel data were used over a 4-year period from 2007 to 2010. The research was implemented in 12 teaching and non-teaching hospitals of Ahvaz County. The data envelopment analysis technique and the Malmquist indices with an input-orientation approach were used to analyze the data and estimate productivity. Data were analyzed using the SPSS.18 and DEAP.2 software. Six hospitals (50%) had a value lower than 1, which represents an increase in total productivity; the other hospitals were non-productive. The average total productivity factor (TPF) was 1.024 for all hospitals, which represents a decrease in efficiency of 2.4% from 2007 to 2010. The average technical, technological, scale and managerial efficiency changes were 0.989, 1.008, 1.028, and 0.996, respectively. There was no significant difference in mean productivity changes between teaching and non-teaching hospitals (P>0.05) (except in 2009). The productivity of the hospitals generally showed an increasing trend; however, the overall average productivity decreased. Among the several components of total productivity, variation in technological efficiency had the highest impact on the reduction of the total average productivity.
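
    The hospital data are not reproduced; the sketch below only illustrates the input-oriented CCR DEA model that underlies such an analysis, solving one small linear program per decision-making unit with SciPy. The five hypothetical hospitals, their inputs and outputs are invented, and the Malmquist decomposition over time is not shown.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data for 5 hospitals: 2 inputs (beds, staff) and
# 2 outputs (inpatient days, outpatient visits), one column per hospital.
X = np.array([[120, 200, 150,  90, 300],        # beds
              [300, 500, 400, 250, 700]])       # staff
Y = np.array([[30000, 44000, 38000, 21000, 55000],   # inpatient days
              [12000, 20000, 15000,  9000, 26000]])   # outpatient visits
n = X.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of hospital o (1.0 = on the frontier)."""
    c = np.r_[1.0, np.zeros(n)]                          # minimize theta
    A_in = np.hstack([-X[:, [o]], X])                    # sum(lam*x) <= theta*x_o
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # sum(lam*y) >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(X.shape[0]), -Y[:, o]]
    bounds = [(None, None)] + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[0]

for o in range(n):
    print(f"hospital {o + 1}: CCR efficiency {ccr_efficiency(o):.3f}")
```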

  11. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Jondhale, K C; Waghmare, L M

    2010-01-01

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest neighbour classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on the normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variations is not satisfactory, so the performance of LDA is also checked on the normalized images. Experimental results on the Yale B and ORL databases show that applying PCA and LDA to the normalized dataset improves the performance significantly for face images with large illumination variations.
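
    A minimal sketch of the first technique (DCT-based illumination compensation in the logarithm domain) is given below for a grayscale image array; the number of discarded low-frequency coefficients and the synthetic test image are arbitrary choices, and the subsequent PCA/LDA and nearest-neighbour classification stages are omitted.

```python
import numpy as np
from scipy.fft import dctn, idctn

def normalize_illumination(image, n_truncate=3):
    """Suppress low-frequency illumination variation in the log domain.

    A block of n_truncate x n_truncate low-frequency DCT coefficients is
    zeroed; the DC term is preserved to keep the overall brightness.
    """
    log_img = np.log1p(image.astype(float))
    coeffs = dctn(log_img, norm="ortho")
    dc_term = coeffs[0, 0]
    coeffs[:n_truncate, :n_truncate] = 0.0        # discard illumination band
    coeffs[0, 0] = dc_term                        # restore overall brightness
    return idctn(coeffs, norm="ortho")

# Synthetic "face" with a strong left-to-right illumination gradient.
rng = np.random.default_rng(2)
face = rng.uniform(80, 120, (64, 64))
gradient = np.tile(np.linspace(0.3, 1.7, 64), (64, 1))
lit_face = face * gradient

normalized = normalize_illumination(lit_face)
before = lit_face.mean(axis=0).std()
after = np.expm1(normalized).mean(axis=0).std()
print(f"column-mean spread before: {before:.2f}, after: {after:.2f}")
```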

  12. Analysis of severe core damage accident progression for the heavy water reactor

    International Nuclear Information System (INIS)

    Tong Lili; Yuan Kai; Yuan Jingtian; Cao Xuewu

    2010-01-01

    In this study, a preliminary severe accident progression analysis of the generic Canadian deuterium-uranium reactor 6 (CANDU 6) was performed using an integrated severe accident analysis code. The selected accident sequences were multiple steam generator tube rupture and large break loss-of-coolant accidents, because these lead to severe core damage when several critical safety systems are assumed to be unavailable. The severe accident progressions included a set of failed safety systems with the plant initially operating at full power, and initiating events leading to primary heat transport system inventory blow-down or boil-off. The core heat-up and melting, steam generator response, and fuel channel and calandria vessel failure were analyzed. The results showed that the progression of a severe core damage accident induced by steam generator tube rupture or large break loss-of-coolant accidents in a CANDU reactor is slow due to heat sinks in the calandria vessel and vault. (authors)

  13. Missing data and censoring in the analysis of progression-free survival in oncology clinical trials.

    Science.gov (United States)

    Denne, J S; Stone, A M; Bailey-Iacona, R; Chen, T-T

    2013-01-01

    Progression-free survival (PFS) is increasingly used as a primary endpoint in oncology clinical trials. However, trial conduct is often such that PFS data on some patients may be partially missing either due to incomplete follow-up for progression, or due to data that may be collected but confounded by patients stopping randomized therapy or starting alternative therapy prior to progression. Regulatory guidance on how to handle these patients in the analysis and whether to censor these patients differs between agencies. We present results of a reanalysis of 28 Phase III trials from 12 companies or institutions performed by the Pharmaceutical Research and Manufacturers Association-sponsored PFS Expert Team. We show that analyses not adhering to the intention-to-treat principle tend to give hazard ratio estimates further from unity and describe several factors associated with this shift. We present illustrative simulations to support these findings and provide recommendations for the analysis of PFS.
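
    The reanalyzed trial data are confidential; the sketch below only illustrates, on simulated data, how the censoring rule can shift a Kaplan-Meier PFS estimate, comparing an analysis that censors patients when they switch to alternative therapy with an intention-to-treat-style analysis that follows them to documented progression. The simulated event times and the use of the lifelines package are illustrative choices.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(4)
n = 200
progression = rng.exponential(12.0, n)    # true time to progression (months)
new_therapy = rng.exponential(18.0, n)    # time at which some patients start
                                          # alternative therapy

# Rule A: censor at the start of alternative therapy if it precedes progression.
time_a = np.minimum(progression, new_therapy)
event_a = progression <= new_therapy

# Rule B (ITT-like): follow every patient to documented progression.
time_b = progression
event_b = np.ones(n, dtype=bool)

for label, t, e in [("censor at switch", time_a, event_a),
                    ("follow to progression", time_b, event_b)]:
    km = KaplanMeierFitter().fit(t, event_observed=e)
    print(f"{label}: median PFS {km.median_survival_time_:.1f} months")
```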

  14. Limited vs extended face-lift techniques: objective analysis of intraoperative results.

    Science.gov (United States)

    Litner, Jason A; Adamson, Peter A

    2006-01-01

    To compare the intraoperative outcomes of superficial musculoaponeurotic system plication, imbrication, and deep-plane rhytidectomy techniques. Thirty-two patients undergoing primary deep-plane rhytidectomy participated. Each hemiface in all patients was submitted sequentially to 3 progressively more extensive lifts, while other variables were standardized. Four major outcome measures were studied, including the extent of skin redundancy and the repositioning of soft tissues along the malar, mandibular, and cervical vectors of lift. The amount of skin excess was measured without tension from the free edge to a point over the intertragal incisure, along a plane overlying the jawline. Using a soft tissue caliper, repositioning was examined by measurement of preintervention and immediate postintervention distances from dependent points to fixed anthropometric reference points. The mean skin excesses were 10.4, 12.8, and 19.4 mm for the plication, imbrication, and deep-plane lifts, respectively. The greatest absolute soft tissue repositioning was noted along the jawline, with the least in the midface. Analysis revealed significant differences from baseline and between lift types for each of the studied techniques in each of the variables tested. These data support the use of the deep-plane rhytidectomy technique to achieve a superior intraoperative lift relative to comparator techniques.

  15. Trabecular morphometry by fractal signature analysis is a novel marker of osteoarthritis progression.

    Science.gov (United States)

    Kraus, Virginia Byers; Feng, Sheng; Wang, ShengChu; White, Scott; Ainslie, Maureen; Brett, Alan; Holmes, Anthony; Charles, H Cecil

    2009-12-01

    To evaluate the effectiveness of using subchondral bone texture observed on a radiograph taken at baseline to predict progression of knee osteoarthritis (OA) over a 3-year period. A total of 138 participants in the Prediction of Osteoarthritis Progression study were evaluated at baseline and after 3 years. Fractal signature analysis (FSA) of the medial subchondral tibial plateau was performed on fixed flexion radiographs of 248 nonreplaced knees, using a commercially available software tool. OA progression was defined as a change in joint space narrowing (JSN) or osteophyte formation of 1 grade according to a standardized knee atlas. Statistical analysis of fractal signatures was performed using a new model based on correlating the overall shape of a fractal dimension curve with radius. Fractal signature of the medial tibial plateau at baseline was predictive of medial knee JSN progression (area under the curve [AUC] 0.75, of a receiver operating characteristic curve) but was not predictive of osteophyte formation or progression of JSN in the lateral compartment. Traditional covariates (age, sex, body mass index, knee pain), general bone mineral content, and joint space width at baseline were no more effective than random variables for predicting OA progression (AUC 0.52-0.58). The predictive model with maximum effectiveness combined fractal signature at baseline, knee alignment, traditional covariates, and bone mineral content (AUC 0.79). We identified a prognostic marker of OA that is readily extracted from a plain radiograph using FSA. Although the method needs to be validated in a second cohort, our results indicate that the global shape approach to analyzing these data is a potentially efficient means of identifying individuals at risk of knee OA progression.

  16. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2017-04-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical data set containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real integral-field spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in the presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of 2. Our analysis reveals that the algorithm prioritizes conservation of all the statistically significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BaTMAn is not to be used as a 'black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.
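
    The core idea, merging neighbouring elements while their signals agree within the quoted errors, can be illustrated with a much simplified one-dimensional analogue; the real BaTMAn algorithm works on 2D tessellations and uses a full Bayesian criterion, whereas the threshold, merging rule and data below are invented:

```python
import numpy as np

def merge_segments(signal, error, k=2.0):
    """Greedy 1D analogue of error-aware segmentation: adjacent segments are merged
    while their inverse-variance weighted means differ by less than k combined sigmas."""
    # each segment stores [sum of weights, weighted sum, member indices], weight = 1/error^2
    segs = [[1.0 / e**2, s / e**2, [i]] for i, (s, e) in enumerate(zip(signal, error))]
    merged = True
    while merged:
        merged = False
        i = 0
        while i < len(segs) - 1:
            w1, s1, idx1 = segs[i]
            w2, s2, idx2 = segs[i + 1]
            m1, m2 = s1 / w1, s2 / w2
            sigma = np.sqrt(1.0 / w1 + 1.0 / w2)
            if abs(m1 - m2) < k * sigma:           # statistically consistent -> merge
                segs[i] = [w1 + w2, s1 + s2, idx1 + idx2]
                del segs[i + 1]
                merged = True
            else:
                i += 1
    return [(idx, s / w) for w, s, idx in segs]

rng = np.random.default_rng(3)
true = np.repeat([1.0, 4.0, 2.0], [20, 15, 25])        # piecewise-constant "image row"
noise = 0.5 * np.ones_like(true)
data = true + rng.normal(0, noise)
for pixels, mean in merge_segments(data, noise):
    print(f"pixels {pixels[0]:2d}-{pixels[-1]:2d}: mean = {mean:.2f}")
```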

  17. Alcohol consumption and the neoplastic progression in Barrett's esophagus: a systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Zhifeng Lou

    Full Text Available PURPOSE: In developed countries, the incidence of esophageal adenocarcinoma (EAC) has been increasing over recent decades. The purpose of this meta-analysis was to arrive at quantitative conclusions about the contribution of alcohol intake to the progression of Barrett's esophagus. METHODS: A comprehensive, systematic bibliographic search of the medical literature published up to Oct 2013 was conducted to identify relevant studies. A meta-analysis of alcohol consumption and progression of Barrett's esophagus was conducted. RESULTS: A total of 882 cases in 6,867 individuals from 14 observational studies were identified in this meta-analysis. The results, based on 10 case-control and 4 cohort studies, indicated that alcohol consumption was not associated with neoplastic progression in Barrett's esophagus (RR, 1.17; 95% CI, 0.93-1.48). When stratified by study design, no significant association was detected in either the high vs low or the ever vs never comparison. CONCLUSIONS: Alcohol drinking is not associated with risk of neoplastic progression in Barrett's esophagus. Further well-designed studies are needed in this area.
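
    A pooled relative risk of this kind is typically obtained with an inverse-variance random-effects model; the sketch below implements the DerSimonian-Laird estimator on invented study-level numbers (they are not the review's data):

```python
import numpy as np
from scipy.stats import norm

# invented per-study relative risks and 95% confidence intervals
rr = np.array([1.30, 0.95, 1.10, 1.45, 0.80])
ci_lo = np.array([0.80, 0.60, 0.70, 0.90, 0.50])
ci_hi = np.array([2.10, 1.50, 1.75, 2.35, 1.30])

y = np.log(rr)                                    # work on the log scale
se = (np.log(ci_hi) - np.log(ci_lo)) / (2 * 1.96)

w = 1.0 / se**2                                   # fixed-effect weights
y_fixed = np.sum(w * y) / np.sum(w)
q = np.sum(w * (y - y_fixed) ** 2)                # Cochran's Q heterogeneity statistic
c = np.sum(w) - np.sum(w**2) / np.sum(w)
tau2 = max(0.0, (q - (len(y) - 1)) / c)           # DerSimonian-Laird between-study variance

w_star = 1.0 / (se**2 + tau2)                     # random-effects weights
y_pooled = np.sum(w_star * y) / np.sum(w_star)
se_pooled = 1.0 / np.sqrt(np.sum(w_star))
z = norm.ppf(0.975)

print(f"pooled RR = {np.exp(y_pooled):.2f} "
      f"(95% CI {np.exp(y_pooled - z * se_pooled):.2f}-{np.exp(y_pooled + z * se_pooled):.2f}), "
      f"tau^2 = {tau2:.3f}")
```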

  18. A meta-analysis on progressive atrophy in intractable temporal lobe epilepsy

    Science.gov (United States)

    Caciagli, Lorenzo; Bernasconi, Andrea; Wiebe, Samuel; Koepp, Matthias J.; Bernasconi, Neda

    2017-01-01

    Objective: It remains unclear whether drug-resistant temporal lobe epilepsy (TLE) is associated with cumulative brain damage, with no expert consensus and no quantitative syntheses of the available evidence. Methods: We conducted a systematic review and meta-analysis of MRI studies on progressive atrophy, searching PubMed and Ovid MEDLINE databases for cross-sectional and longitudinal quantitative MRI studies on drug-resistant TLE. Results: We screened 2,976 records and assessed eligibility of 248 full-text articles. Forty-two articles met the inclusion criteria for quantitative evaluation. We observed a predominance of cross-sectional studies, use of different clinical indices of progression, and high heterogeneity in age-control procedures. Meta-analysis of the 18 cross-sectional and 1 longitudinal studies on hippocampal atrophy (n = 979 patients) yielded a pooled effect size of r = −0.42 for ipsilateral atrophy related to epilepsy duration (95% confidence interval [CI] −0.51 to −0.32); more than 80% of articles reported duration-related progression in extratemporal cortical and subcortical regions. Detailed analysis of study design features yielded low to moderate levels of evidence for progressive atrophy across studies, mainly due to the dominance of cross-sectional over longitudinal investigations, the use of diverse measures of seizure estimates, and the absence of consistent age-control procedures. Conclusions: While the neuroimaging literature is overall suggestive of progressive atrophy in drug-resistant TLE, published studies have employed rather weak designs to directly demonstrate it. Longitudinal multicohort studies are needed to unequivocally differentiate aging from disease progression. PMID:28687722

  19. In situ analytical techniques for battery interface analysis.

    Science.gov (United States)

    Tripathi, Alok M; Su, Wei-Nien; Hwang, Bing Joe

    2018-02-05

    Lithium-ion batteries, simply known as lithium batteries, are distinct among high energy density charge-storage devices. The power delivery of batteries depends upon the electrochemical performances and the stability of the electrode, electrolytes and their interface. Interfacial phenomena of the electrode/electrolyte involve lithium dendrite formation, electrolyte degradation and gas evolution, and a semi-solid protective layer formation at the electrode-electrolyte interface, also known as the solid-electrolyte interface (SEI). The SEI protects electrodes from further exfoliation or corrosion and suppresses lithium dendrite formation, which are crucial needs for enhancing the cell performance. This review covers the compositional, structural and morphological aspects of SEI, both artificially and naturally formed, and metallic dendrites using in situ/in operando cells and various in situ analytical tools. Critical challenges and the historical legacy in the development of in situ/in operando electrochemical cells with some reports on state-of-the-art progress are particularly highlighted. The present compilation pinpoints the emerging research opportunities in advancing this field and concludes on the future directions and strategies for in situ/in operando analysis.

  20. Use of decision analysis techniques to determine Hanford cleanup priorities

    International Nuclear Information System (INIS)

    Fassbender, L.; Gregory, R.; Winterfeldt, D. von; John, R.

    1992-01-01

    In January 1991, the U.S. Department of Energy (DOE) Richland Field Office, Westinghouse Hanford Company, and the Pacific Northwest Laboratory initiated the Hanford Integrated Planning Process (HIPP) to ensure that technically sound and publicly acceptable decisions are made that support the environmental cleanup mission at Hanford. One of the HIPP's key roles is to develop an understanding of the science and technology (S and T) requirements to support the cleanup mission. This includes conducting an annual systematic assessment of the S and T needs at Hanford to support a comprehensive technology development program and a complementary scientific research program. Basic to success is a planning and assessment methodology that is defensible from a technical perspective and acceptable to the various Hanford stakeholders. Decision analysis techniques were used to help identify and prioritize problems and S and T needs at Hanford. The approach used structured elicitations to bring many Hanford stakeholders into the process. Decision analysis, which is based on the axioms and methods of utility and probability theory, is especially useful in problems characterized by uncertainties and multiple objectives. Decision analysis addresses uncertainties by laying out a logical sequence of decisions, events, and consequences and by quantifying event and consequence probabilities on the basis of expert judgments

  1. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Directory of Open Access Journals (Sweden)

    Richard E.A. van Emmerik

    2016-03-01

    Full Text Available Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.

  2. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short-and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
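
    One of the dynamic stability measures mentioned above, the finite maximal Lyapunov exponent, is commonly estimated from movement time series with the Rosenstein algorithm; the sketch below is a minimal version of that algorithm, in which the embedding dimension, delay, Theiler window and fitting range are illustrative choices and the input is a synthetic signal rather than gait data:

```python
import numpy as np
from scipy.spatial.distance import cdist

def max_lyapunov_rosenstein(x, dim=5, tau=10, theiler=50, fit_range=(0, 30), dt=1.0):
    """Estimate the maximal Lyapunov exponent of a scalar series (Rosenstein et al. 1993)."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])   # delay embedding
    d = cdist(emb, emb)                                                    # pairwise distances
    idx = np.arange(n)
    d[np.abs(idx[:, None] - idx[None, :]) <= theiler] = np.inf             # Theiler window
    nn = np.argmin(d, axis=1)                                              # nearest neighbours
    k_max = fit_range[1]
    div = []
    for k in range(k_max + 1):                                             # mean log divergence
        valid = (idx + k < n) & (nn + k < n)
        dist = np.linalg.norm(emb[idx[valid] + k] - emb[nn[valid] + k], axis=1)
        div.append(np.mean(np.log(dist[dist > 0])))
    t = np.arange(fit_range[0], k_max + 1) * dt
    return np.polyfit(t, div[fit_range[0]:], 1)[0]                         # slope = lambda_max

# synthetic quasi-periodic signal with noise, standing in for a gait marker trajectory
rng = np.random.default_rng(0)
t = np.arange(0, 30, 0.02)
signal = np.sin(2 * np.pi * t) + 0.3 * np.sin(2 * np.pi * 2.7 * t) + 0.05 * rng.normal(size=t.size)
print("lambda_max estimate:", round(max_lyapunov_rosenstein(signal, dt=0.02), 4))
```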

  3. Computational techniques for inelastic analysis and numerical experiments

    International Nuclear Information System (INIS)

    Yamada, Y.

    1977-01-01

    A number of formulations have been proposed for inelastic analysis, particularly for the thermal elastic-plastic creep analysis of nuclear reactor components. In the elastic-plastic regime, which principally concerns time-independent behavior, numerical techniques based on the finite element method have been well exploited and such computations have become routine work. For problems in which time-dependent behavior is significant, it is desirable to incorporate a procedure that can accommodate both the mechanical-model formulations and the equation-of-state methods proposed so far. A computer program should also take into account the strain-dependent and/or time-dependent micro-structural changes which often occur during the operation of structural components at increasingly high temperatures over long periods of time. Special considerations are crucial if the analysis is to be extended to the large-strain regime where geometric nonlinearities predominate. The present paper introduces a rational updated formulation and a computer program under development that take into account the various requisites stated above. (Auth.)

  4. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques.

    Science.gov (United States)

    Rosebrock, Adrian; Caban, Jesus J; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2013-03-29

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment, rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank and compared results to those of a pathologist, demonstrating 70% agreement. Secondly, in order to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute's Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant.

  5. Techniques of production and analysis of polarized synchrotron radiation

    International Nuclear Information System (INIS)

    Mills, D.M.

    1992-01-01

    The use of the unique polarization properties of synchrotron radiation in the hard x-ray spectral region (E>3 KeV) is becoming increasingly important to many synchrotron radiation researchers. The radiation emitted from bending magnets and conventional (planar) insertion devices (IDs) is highly linearly polarized in the plane of the particle's orbit. Elliptically polarized x-rays can also be obtained by going off axis on a bending magnet source, albeit with considerable loss of flux. The polarization properties of synchrotron radiation can be further tailored to the researcher's specific needs through the use of specialized insertion devices such as helical and crossed undulators and asymmetrical wigglers. Even with the possibility of producing a specific polarization, there is still the need to develop x-ray optical components which can manipulate the polarization for both analysis and further modification of the polarization state. A survey of techniques for producing and analyzing both linear and circular polarized x-rays will be presented with emphasis on those techniques which rely on single crystal optical components

  6. Novel technique for coal pyrolysis and hydrogenation product analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pfefferle, L.D.; Boyle, J.

    1993-03-15

    A microjet reactor coupled to a VUV photoionization time-of-flight mass spectrometer has been used to obtain species measurements during high temperature pyrolysis and oxidation of a wide range of hydrocarbon compounds ranging from allene and acetylene to cyclohexane, benzene and toluene. Initial work focused on calibration of the technique, optimization of ion collection and detection and characterization of limitations. Using the optimized technique with 118 nm photoionization, intermediate species profiles were obtained for analysis of the hydrocarbon pyrolysis and oxidation mechanisms. The "soft" ionization, yielding predominantly molecular ions, allowed the study of reaction pathways in these high temperature systems where both sampling and detection challenges are severe. Work has focused on the pyrolysis and oxidative pyrolysis of aliphatic and aromatic hydrocarbon mixtures representative of coal pyrolysis and hydropyrolysis products. The detailed mass spectra obtained during pyrolysis and oxidation of hydrocarbon mixtures are especially important because of the complex nature of the product mixture even at short residence times and low primary reactant conversions. The combustion community has advanced detailed modeling of pyrolysis and oxidation to the C4 hydrocarbon level, but in general above that size uncertainties in rate constant and thermodynamic data do not allow us to a priori predict products from mixed hydrocarbon pyrolyses using a detailed chemistry model. For pyrolysis of mixtures of coal-derived liquid fractions with a large range of compound structures and molecular weights in the hundreds of amu, the modeling challenge is severe. Lumped models are possible from stable product data.

  7. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  8. Advances in zymography techniques and patents regarding protease analysis.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2012-08-01

    Detection of enzymatic activity on gel electrophoresis, namely zymography, is a technique that has received increasing attention in the last 10 years, judging by the number of articles published. A growing number of enzymes, mainly proteases, are now routinely detected by zymography. Detailed analytical studies are beginning to be published, and new patents have been developed. This article updates the information covered in our last review, condensing the recent publications dealing with the identification of proteolytic enzymes in electrophoretic gel supports and its variations. The new advances of this method are mainly focused on two-dimensional zymography and transfer zymography. Though comparatively fewer patents have been published, they basically coincide in the study of matrix metalloproteases. The field is foreseen to be very productive in the area of zymoproteomics, combining electrophoresis and mass spectrometry for the analysis of proteases.

  9. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  10. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers seeking to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases the security risk. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. We first study the Android architecture and analyze the existing threats and security weaknesses. We then identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis helps to identify the existing loopholes and gives strategic direction for making the Android operating system more secure.

  11. SHOT PUT O’BRIAN TECHNIQUE, EXTENDING THE ANALYSIS OF TECHNIQUE FROM FOUR TO SIX PHASES WITH THE DESCRIPTION

    Directory of Open Access Journals (Sweden)

    Zlatan Saračević

    2011-09-01

    Full Text Available Due to the complexity of the motion, the shot put technique is described in phases for easier analysis, easier learning of the technique and error correction. The motion itself is continuous, so that in its execution the transition from phase to phase is not noticeable. In the previously described phases of the O'Brian back-facing shot put technique, a large gap and disconnection appear between the initial-position phase and the phase of overtaking the device. In teaching and training the technique in primary and secondary education, as well as for students and beginner shot putters, this represents a major problem for connecting the phases, training and advancing the technique. Therefore, this work aims to facilitate the methods of training the shot put technique by extending the analysis from four to six phases, which are described and cover the complete O'Brian technique.

  12. The Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project

    Science.gov (United States)

    Barnes, D.; Harrison, R. A.; Davies, J. A.; Perry, C. H.; Moestl, C.; Rouillard, A.; Bothmer, V.; Rodriguez, L.; Eastwood, J. P.; Kilpua, E.; Gallagher, P.; Odstrcil, D.

    2017-12-01

    Understanding solar wind evolution is fundamental to advancing our knowledge of energy and mass transport in the solar system, whilst also being crucial to space weather and its prediction. The advent of truly wide-angle heliospheric imaging has revolutionised the study of solar wind evolution, by enabling direct and continuous observation of both transient and background components of the solar wind as they propagate from the Sun to 1 AU and beyond. The recently completed, EU-funded FP7 Heliospheric Cataloguing, Analysis and Techniques Service (HELCATS) project (1st May 2014 - 30th April 2017) combined European expertise in heliospheric imaging, built up over the last decade in particular through leadership of the Heliospheric Imager (HI) instruments aboard NASA's STEREO mission, with expertise in solar and coronal imaging as well as the interpretation of in-situ and radio diagnostic measurements of solar wind phenomena. HELCATS involved: (1) the cataloguing of transient (coronal mass ejections) and background (stream/corotating interaction regions) solar wind structures observed by the STEREO/HI instruments, including estimates of their kinematic properties based on a variety of modelling techniques; (2) the verification of these kinematic properties through comparison with solar source observations and in-situ measurements at multiple points throughout the heliosphere; (3) the assessment of the potential for initialising numerical models based on the derived kinematic properties of transient and background solar wind components; and (4) the assessment of the complementarity of radio observations (Type II radio bursts and interplanetary scintillation) in the detection and analysis of heliospheric structure in combination with heliospheric imaging observations. In this presentation, we provide an overview of the HELCATS project emphasising, in particular, the principal achievements and legacy of this unprecedented project.

  13. Video x-ray progressive scanning: new technique for decreasing x-ray exposure without decreasing image quality during cardiac catheterization

    International Nuclear Information System (INIS)

    Holmes, D.R. Jr.; Bove, A.A.; Wondrow, M.A.; Gray, J.E.

    1986-01-01

    A newly developed video x-ray progressive scanning system improves image quality, decreases radiation exposure, and can be added to any pulsed fluoroscopic x-ray system using a video display without major system modifications. With use of progressive video scanning, the radiation entrance exposure rate measured with a vascular phantom was decreased by 32 to 53% in comparison with a conventional fluoroscopic x-ray system. In addition to this substantial decrease in radiation exposure, the quality of the image was improved because of less motion blur and artifact. Progressive video scanning has the potential for widespread application to all pulsed fluoroscopic x-ray systems. Use of this technique should make cardiac catheterization procedures and all other fluoroscopic procedures safer for the patient and the involved medical and paramedical staff

  14. Impact of advanced microstructural characterization techniques on modeling and analysis of radiation damage

    International Nuclear Information System (INIS)

    Garner, F.A.; Odette, G.R.

    1980-01-01

    The evolution of radiation-induced alterations of dimensional and mechanical properties has been shown to be a direct and often predictable consequence of radiation-induced microstructural changes. Recent advances in understanding of the nature and role of each microstructural component in determining the property of interest has led to a reappraisal of the type and priority of data needed for further model development. This paper presents an overview of the types of modeling and analysis activities in progress, the insights that prompted these activities, and specific examples of successful and ongoing efforts. A review is presented of some problem areas that in the authors' opinion are not yet receiving sufficient attention and which may benefit from the application of advanced techniques of microstructural characterization. Guidelines based on experience gained in previous studies are also provided for acquisition of data in a form most applicable to modeling needs

  15. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, and this can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper a new method, the cokriging method, which is an extension of kriging, is proposed to calculate the structural reliability. The cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.
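
    The basic surrogate-based reliability workflow, fit an inexpensive approximation of the limit state function on a small design of experiments and estimate the failure probability by Monte Carlo sampling of the surrogate, can be sketched with an ordinary-kriging-style Gaussian process (plain kriging rather than the gradient-enhanced cokriging of the paper, and an invented limit state function):

```python
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

def g(x):
    """Illustrative limit state function: failure when g(x) < 0."""
    return 3.0 - x[:, 0] ** 2 - 0.5 * x[:, 1]

# small design of experiments in standard-normal space
doe = qmc.LatinHypercube(d=2, seed=0).random(40)
x_train = qmc.scale(doe, [-4, -4], [4, 4])
y_train = g(x_train)

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=1.0),
                              normalize_y=True).fit(x_train, y_train)

# Monte Carlo on the cheap surrogate (and on the true function for comparison)
rng = np.random.default_rng(1)
x_mc = rng.standard_normal((200_000, 2))
pf_surrogate = np.mean(gp.predict(x_mc) < 0)
pf_reference = np.mean(g(x_mc) < 0)
print(f"Pf (surrogate) = {pf_surrogate:.4f}, Pf (direct MC) = {pf_reference:.4f}")
```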

  16. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

    Full Text Available A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short term (3 and 6 months) and the long term (12 and 24 months) SPI were estimated, and then, possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
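
    Both steps of that workflow are easy to prototype: the SPI transforms accumulated precipitation through a fitted gamma distribution into standard-normal quantiles, and the ITA compares the sorted first and second halves of the resulting series. The sketch below uses synthetic monthly rainfall and omits the zero-precipitation correction that operational SPI implementations include:

```python
import numpy as np
from scipy.stats import gamma, norm

def spi(precip, scale_months=3):
    """Standardized Precipitation Index from a monthly precipitation series."""
    acc = np.convolve(precip, np.ones(scale_months), mode="valid")    # k-month accumulations
    a, loc, b = gamma.fit(acc, floc=0)                                 # fit gamma (no zero handling)
    return norm.ppf(gamma.cdf(acc, a, loc=loc, scale=b))               # equiprobability transform

def innovative_trend_analysis(series):
    """Sen's ITA: sort each half of the series and compare; positive differences
    indicate an increasing trend in that part of the distribution."""
    half = len(series) // 2
    first = np.sort(series[:half])
    second = np.sort(series[half:2 * half])
    return second - first            # one value per rank (low, medium, high values)

rng = np.random.default_rng(0)
months = 480
rain = rng.gamma(shape=2.0, scale=40.0, size=months) * np.linspace(1.0, 0.85, months)  # drying trend
spi3 = spi(rain, scale_months=3)
diff = innovative_trend_analysis(spi3)
print(f"mean ITA difference: {diff.mean():.3f}  (negative -> tendency toward drier conditions)")
```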

  17. Optimized inspection techniques and structural analysis in lifetime management

    International Nuclear Information System (INIS)

    Aguado, M.T.; Marcelles, I.

    1993-01-01

    Preservation of the option of extending the service lifetime of a nuclear power plant beyond its normal design lifetime requires correct remaining lifetime management from the very beginning of plant operation. The methodology used in plant remaining lifetime management is essentially based on the use of standard inspections, surveillance and monitoring programs and calculations, such as thermal-stress and fracture mechanics analysis. The inspection techniques should be continuously optimized, in order to be able to detect and dimension existing defects with the highest possible degree of accuracy. The information obtained during the inspection is combined with the historical data of the components: design, quality, operation, maintenance, and transients, and with the results of destructive testing, fracture mechanics and thermal fatigue analysis. These data are used to estimate the remaining lifetime of nuclear power plant components, systems and structures with the highest degree possible of accuracy. The use of this methodology allows component repairs and replacements to be reduced or avoided and increases the safety levels and availability of the nuclear power plant. Use of this strategy avoids the need for heavy investments at the end of the licensing period

  18. Analysis of Muscle Fatigue Progression using Cyclostationary Property of Surface Electromyography Signals.

    Science.gov (United States)

    Karthick, P A; Venugopal, G; Ramakrishnan, S

    2016-01-01

    Analysis of neuromuscular fatigue finds various applications ranging from clinical studies to biomechanics. Surface electromyography (sEMG) signals are widely used for these studies due to their non-invasiveness. During cyclic dynamic contractions, these signals are nonstationary and cyclostationary. In recent years, several nonstationary methods have been employed for muscle fatigue analysis. However, the cyclostationary-based approach is not well established for the assessment of muscle fatigue. In this work, the cyclostationarity associated with biceps brachii muscle fatigue progression is analyzed using sEMG signals and Spectral Correlation Density (SCD) functions. Signals are recorded from fifty healthy adult volunteers during dynamic contractions under a prescribed protocol. These signals are preprocessed and are divided into three segments, namely, non-fatigue, first muscle discomfort and fatigue zones. The SCD is then estimated using the fast Fourier transform accumulation method. Further, the Cyclic Frequency Spectral Density (CFSD) is calculated from the SCD spectrum. Two features, namely, cyclic frequency spectral area (CFSA) and cyclic frequency spectral entropy (CFSE), are proposed to study the progression of muscle fatigue. Additionally, the degree of cyclostationarity (DCS) is computed to quantify the amount of cyclostationarity present in the signals. Results show that there is a progressive increase in cyclostationarity during the progression of muscle fatigue. CFSA shows an increasing trend during muscle-fatiguing contractions, whereas CFSE shows a decreasing trend. It is observed that when the muscle progresses from the non-fatigue to the fatigue condition, the mean DCS of the fifty subjects increases from 0.016 to 0.99. All the extracted features were found to be distinct and statistically significant in the three zones of muscle contraction (p < 0.05). It appears that these SCD features could be useful in the automated analysis of sEMG signals for different neuromuscular conditions.
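
    The quantities involved can be illustrated with a small numerical sketch: estimate the cyclic autocorrelation of a signal at candidate cyclic frequencies and form a degree-of-cyclostationarity ratio. The signal here is synthetic amplitude-modulated noise rather than sEMG, and this simple time-domain estimator stands in for the FFT accumulation method used in the paper:

```python
import numpy as np

def cyclic_autocorrelation(x, alpha, fs, max_lag=64):
    """Estimate R_x^alpha(tau) = <x[n+tau] x[n] e^{-j 2 pi alpha n / fs}> for tau = 0..max_lag."""
    n = np.arange(len(x))
    rot = np.exp(-2j * np.pi * alpha * n / fs)
    return np.array([np.mean(x[tau:] * x[:len(x) - tau] * rot[:len(x) - tau])
                     for tau in range(max_lag + 1)])

def degree_of_cyclostationarity(x, alphas, fs):
    """DCS-like ratio: cyclic (alpha != 0) autocorrelation energy relative to alpha = 0."""
    r0 = cyclic_autocorrelation(x, 0.0, fs)
    num = sum(np.sum(np.abs(cyclic_autocorrelation(x, a, fs)) ** 2) for a in alphas if a != 0)
    return num / np.sum(np.abs(r0) ** 2)

fs = 1000.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
carrier = rng.normal(size=t.size)                            # broadband "EMG-like" noise
stationary = carrier
cyclic = (1.0 + 0.8 * np.cos(2 * np.pi * 1.0 * t)) * carrier  # 1 Hz cyclic modulation

alphas = [1.0, 2.0, 3.0]                                      # candidate cyclic frequencies (Hz)
print("DCS stationary :", round(degree_of_cyclostationarity(stationary, alphas, fs), 4))
print("DCS cyclic     :", round(degree_of_cyclostationarity(cyclic, alphas, fs), 4))
```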

  19. Machine Learning Techniques for Arterial Pressure Waveform Analysis

    Directory of Open Access Journals (Sweden)

    João Cardoso

    2013-05-01

    Full Text Available The Arterial Pressure Waveform (APW) can provide essential information about arterial wall integrity and arterial stiffness. Most APW analysis frameworks individually process each hemodynamic parameter and do not evaluate inter-dependencies in the overall pulse morphology. The key contribution of this work is the use of machine learning algorithms to deal with vectorized features extracted from the APW. For this purpose, we follow a five-step evaluation methodology: (1) a custom-designed, non-invasive, electromechanical device was used for data collection from 50 subjects; (2) the acquired position and amplitude of the onset, Systolic Peak (SP), Point of Inflection (Pi) and Dicrotic Wave (DW) were used for the computation of morphological attributes; (3) pre-processing was performed on the datasets in order to reduce the number of input features and increase model accuracy by selecting the most relevant ones; (4) classification of the dataset was carried out using four different machine learning algorithms: Random Forest, BayesNet (probabilistic), J48 (decision tree) and RIPPER (rule-based induction); and (5) the trained models were evaluated, using a majority-voting system, against the respective calculated Augmentation Index (AIx). The classification algorithms proved to be efficient; in particular, Random Forest showed good accuracy (96.95%) and a high area under the Receiver Operating Characteristic (ROC) curve (AUC = 0.961). Finally, during validation tests, a correlation between high-risk labels retrieved from the multi-parametric approach and positive AIx values was verified. This approach allows the design of new hemodynamic morphology vectors and techniques for multiple APW analysis, thus improving the understanding of the arterial pulse, especially when compared to traditional single-parameter analysis, where failure in one parameter measurement component, such as Pi, can jeopardize the whole evaluation.
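
    A minimal version of step (4), training a Random Forest on vectorized waveform features and reporting accuracy and ROC AUC, might look like the sketch below; the features and labels are simulated placeholders, and the original study used WEKA-style classifiers combined by majority voting:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import accuracy_score, roc_auc_score

rng = np.random.default_rng(0)
n = 200
# simulated morphological attributes: onset/SP/Pi/DW positions and amplitudes, etc.
X = rng.normal(size=(n, 8))
# simulated "high risk" label loosely driven by two of the features, plus noise
y = ((0.9 * X[:, 1] - 0.7 * X[:, 4] + 0.5 * rng.normal(size=n)) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=300, random_state=0)
proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
pred = (proba >= 0.5).astype(int)

print(f"accuracy = {accuracy_score(y, pred):.3f}, ROC AUC = {roc_auc_score(y, proba):.3f}")
```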

  20. The analysis of gastric function using computational techniques

    International Nuclear Information System (INIS)

    Young, Paul

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of the study was (i) to assess the feasibility of using the motility program in a volunteer study and (ii) to determine the effects of the meals on motility. The results showed that the parameters were remarkably consistent between the 4 meals. However, for each meal, velocity and percentage occlusion were found to increase as contractions propagated along the antrum. The first clinical application of the motility program was carried out in Study 2. Motility from three patients was measured, after they had been referred to the Magnetic Resonance Centre with gastric problems. The results showed that one of the patients displayed an irregular motility, compared to the results of the volunteer study. This result had not been observed using other investigative techniques. In Study 3, motility was measured in Low Viscosity and High Viscosity liquid/solid meals, with the solid particulate consisting of agar beads of varying breakdown strength. The results showed that

  1. Tracking progress towards global drinking water and sanitation targets: A within and among country analysis.

    Science.gov (United States)

    Fuller, James A; Goldstick, Jason; Bartram, Jamie; Eisenberg, Joseph N S

    2016-01-15

    Global access to safe drinking water and sanitation has improved dramatically during the Millennium Development Goal (MDG) period. However, there is substantial heterogeneity in progress between countries and inequality within countries. We assessed countries' temporal patterns in access to drinking water and sanitation using publicly available data. We then classified countries using non-linear modeling techniques as having one of the following trajectories: 100% coverage, linear growth, linear decline, no change, saturation, acceleration, deceleration, negative acceleration, or negative deceleration. We further assessed the degree to which temporal profiles follow a sigmoidal pattern and how these patterns might vary within a given country between rural and urban settings. Among countries with more than 10 data points, between 15% and 38% showed a non-linear trajectory, depending on the indicator. Overall, countries' progress followed a sigmoidal trend, but some countries are making better progress and some worse progress than would be expected. We highlight several countries that are not on track to meet the MDG for water or sanitation, but whose access is accelerating, suggesting better performance during the coming years. Conversely, we also highlight several countries that have made sufficient progress to meet the MDG target, but in which access is decelerating. Patterns were heterogeneous and non-linearity was common. Characterization of these heterogeneous patterns will help policy makers allocate resources more effectively. For example, policy makers can identify countries that could make use of additional resources or might be in need of additional institutional capacity development to properly manage resources; this will be essential to meet the forthcoming Sustainable Development Goals. Copyright © 2015 Elsevier B.V. All rights reserved.
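
    One simple way to operationalize this kind of trajectory classification is to fit a logistic (sigmoidal) curve to a country's coverage series and inspect where the latest observation sits on that curve; the sketch below does this for invented coverage data, and the classification rule is illustrative rather than the paper's:

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, lower, upper, rate, midpoint):
    return lower + (upper - lower) / (1.0 + np.exp(-rate * (t - midpoint)))

# invented national coverage series (% of population with access), 1990-2015
years = np.arange(1990, 2016)
coverage = logistic(years, 20, 95, 0.25, 2008) + np.random.default_rng(0).normal(0, 1.0, years.size)

p0 = [coverage.min(), coverage.max(), 0.1, years.mean()]
params, _ = curve_fit(logistic, years, coverage, p0=p0, maxfev=10_000)
lower, upper, rate, midpoint = params

# classify the current phase from the fitted curve: before the inflection point the
# trajectory is still accelerating, after it the growth is decelerating/saturating
latest = years[-1]
phase = "accelerating" if latest < midpoint else "decelerating / approaching saturation"
print(f"fitted inflection year = {midpoint:.1f}; trajectory in {latest} is {phase}")
```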

  2. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Won, Il Ko; Jong, Won Choi; Chul, Hyung Kang; Jae, Sol Lee; Kun, Jai Lee

    1998-01-01

    A simple approach was described to incorporate the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e. changes in the distribution type of the input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used for converting cost inputs into the statistical parameters of the assumed probabilistic distributions. It was indicated that Monte Carlo simulation with a Latin Hypercube Sampling technique and subsequent sensitivity analyses were useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. It was shown, by changing the distribution types of the input parameters, that the values calculated by the deterministic method lay around the 40th to 50th percentile of the output distribution function calculated by probabilistic simulation. Assuming a lognormal distribution of the inputs, however, the values calculated by the deterministic method lay around the 85th percentile of the output distribution. The sensitivity analysis also indicated that the front-end components were generally more influential than the back-end components, with the uranium purchase cost being the most important factor of all. The discount rate also contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which has high cost uncertainty
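
    The basic mechanics, map three-point cost estimates onto probability distributions, propagate them with Latin Hypercube samples, and rank the inputs by correlation with the output, can be sketched as follows; the cost components, unit values and triangular distributions are invented placeholders, not the study's data:

```python
import numpy as np
from scipy.stats import qmc, triang, spearmanr

# three-point (low, mode, high) estimates for illustrative unit costs
components = {
    "uranium_purchase": (40, 80, 160),
    "conversion":       (5, 8, 14),
    "enrichment":       (80, 110, 160),
    "fabrication":      (200, 275, 400),
    "back_end":         (300, 450, 700),
}

n = 20_000
u = qmc.LatinHypercube(d=len(components), seed=0).random(n)    # stratified uniforms

samples = {}
for j, (name, (lo, mode, hi)) in enumerate(components.items()):
    c = (mode - lo) / (hi - lo)                                 # triangular shape parameter
    samples[name] = triang.ppf(u[:, j], c, loc=lo, scale=hi - lo)

total = sum(samples.values())                                   # simplistic total unit cost
deterministic = sum(mode for _, mode, _ in components.values())

print(f"deterministic (modes) = {deterministic:.0f}, "
      f"median = {np.median(total):.0f}, "
      f"percentile of deterministic value = {np.mean(total <= deterministic) * 100:.1f}%")

# simple sensitivity ranking: rank correlation of each input with the total cost
for name, s in samples.items():
    rho = spearmanr(s, total)[0]
    print(f"{name:18s} rank correlation = {rho:+.2f}")
```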

  3. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    International Nuclear Information System (INIS)

    Lungaroni, M.; Peluso, E.; Gelfusa, M.; Malizia, A.; Talebzadeh, S.; Gaudio, P.; Murari, A.; Vega, J.

    2016-01-01

    In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Initial data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression (SVR) is a method devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.
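
    As a minimal illustration of SVR used as a filter, the sketch below smooths a noisy synthetic signal with an RBF-kernel SVR and compares the residual error against a standard Savitzky-Golay filter; the hyperparameters are illustrative rather than tuned, and the signal is not Lidar data:

```python
import numpy as np
from sklearn.svm import SVR
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
truth = np.exp(-0.3 * t) * np.sin(2 * np.pi * 0.6 * t)        # smooth underlying signal
noisy = truth + 0.15 * rng.normal(size=t.size)

# SVR as a non-parametric filter/regressor
svr = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=2.0)
svr_fit = svr.fit(t.reshape(-1, 1), noisy).predict(t.reshape(-1, 1))

# a classical smoothing filter for comparison
sg_fit = savgol_filter(noisy, window_length=31, polyorder=3)

rmse = lambda est: np.sqrt(np.mean((est - truth) ** 2))
print(f"RMSE noisy = {rmse(noisy):.3f}, SVR = {rmse(svr_fit):.3f}, Savitzky-Golay = {rmse(sg_fit):.3f}")
```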

  4. Genome-wide analysis of disease progression in age-related macular degeneration.

    Science.gov (United States)

    Yan, Qi; Ding, Ying; Liu, Yi; Sun, Tao; Fritsche, Lars G; Clemons, Traci; Ratnapriya, Rinki; Klein, Michael L; Cook, Richard J; Liu, Yu; Fan, Ruzong; Wei, Lai; Abecasis, Gonçalo R; Swaroop, Anand; Chew, Emily Y; Weeks, Daniel E; Chen, Wei

    2018-03-01

    Family- and population-based genetic studies have successfully identified multiple disease-susceptibility loci for Age-related macular degeneration (AMD), one of the first batch and most successful examples of genome-wide association study. However, most genetic studies to date have focused on case-control studies of late AMD (choroidal neovascularization or geographic atrophy). The genetic influences on disease progression are largely unexplored. We assembled unique resources to perform a genome-wide bivariate time-to-event analysis to test for association of time-to-late-AMD with ∼9 million variants on 2721 Caucasians from a large multi-center randomized clinical trial, the Age-Related Eye Disease Study. To our knowledge, this is the first genome-wide association study of disease progression (bivariate survival outcome) in AMD genetic studies, thus providing novel insights to AMD genetics. We used a robust Cox proportional hazards model to appropriately account for between-eye correlation when analyzing the progression time in the two eyes of each participant. We identified four previously reported susceptibility loci showing genome-wide significant association with AMD progression: ARMS2-HTRA1 (P = 8.1 × 10-43), CFH (P = 3.5 × 10-37), C2-CFB-SKIV2L (P = 8.1 × 10-10) and C3 (P = 1.2 × 10-9). Furthermore, we detected association of rs58978565 near TNR (P = 2.3 × 10-8), rs28368872 near ATF7IP2 (P = 2.9 × 10-8) and rs142450006 near MMP9 (P = 0.0006) with progression to choroidal neovascularization but not geographic atrophy. Secondary analysis limited to 34 reported risk variants revealed that LIPC and CTRB2-CTRB1 were also associated with AMD progression (P < 0.0015). Our genome-wide analysis thus expands the genetics in both development and progression of AMD and should assist in early identification of high risk individuals.
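
    The core modelling step, a Cox proportional hazards fit with a robust (sandwich) variance that accounts for the two eyes of each participant, can be sketched with the lifelines package on invented data; the genotype, eye-level progression times and the clustering column are simulated placeholders, not the AREDS data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n_subj = 500
genotype = rng.binomial(2, 0.3, n_subj)            # risk-allele count per participant
frailty = rng.normal(0, 0.4, n_subj)               # shared per-subject effect -> between-eye correlation

rows = []
for i in range(n_subj):
    for eye in ("OD", "OS"):
        hazard = np.exp(0.35 * genotype[i] + frailty[i])
        t = rng.exponential(10.0 / hazard)          # years to late AMD in this eye
        rows.append({"id": i, "time": min(t, 12.0), "event": int(t <= 12.0),
                     "genotype": genotype[i]})
df = pd.DataFrame(rows)

cph = CoxPHFitter()
# cluster_col requests a robust sandwich variance grouped by participant,
# so the two eyes of one person are not treated as independent observations
cph.fit(df, duration_col="time", event_col="event", cluster_col="id")
print(cph.summary[["coef", "exp(coef)", "se(coef)", "p"]])
```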

  5. Automatic Satellite Telemetry Analysis for SSA using Artificial Intelligence Techniques

    Science.gov (United States)

    Stottler, R.; Mao, J.

    In April 2016, General Hyten, commander of Air Force Space Command, announced the Space Enterprise Vision (SEV) (http://www.af.mil/News/Article-Display/Article/719941/hyten-announces-space-enterprise-vision/). The SEV addresses increasing threats to space-related systems. The vision includes an integrated approach across all mission areas (communications, positioning, navigation and timing, missile warning, and weather data) and emphasizes improved access to data across the entire enterprise and the ability to protect space-related assets and capabilities. "The future space enterprise will maintain our nation's ability to deliver critical space effects throughout all phases of conflict," Hyten said. Satellite telemetry is going to become available to a new audience. While that telemetry information should be valuable for achieving Space Situational Awareness (SSA), these new satellite telemetry data consumers will not know how to utilize it. We were tasked with applying AI techniques to build an infrastructure to process satellite telemetry into higher abstraction level symbolic space situational awareness and to initially populate that infrastructure with useful data analysis methods. We are working with two organizations, Montana State University (MSU) and the Air Force Academy, both of whom control satellites and therefore currently analyze satellite telemetry to assess the health and circumstances of their satellites. The design which has resulted from our knowledge elicitation and cognitive task analysis is a hybrid approach which combines symbolic processing techniques of Case-Based Reasoning (CBR) and Behavior Transition Networks (BTNs) with current Machine Learning approaches. BTNs are used to represent the process and associated formulas to check telemetry values against anticipated problems and issues. CBR is used to represent and retrieve BTNs that represent an investigative process that should be applied to the telemetry in certain circumstances

  6. Rates of progression in diabetic retinopathy during different time periods: a systematic review and meta-analysis

    DEFF Research Database (Denmark)

    Wong, Tien Y; Mwamburi, Mkaya; Klein, Ronald

    2009-01-01

    This meta-analysis reviews rates of progression of diabetic retinopathy to proliferative diabetic retinopathy (PDR) and/or severe visual loss (SVL) and temporal trends.

  7. Pregnancy and HIV disease progression: a systematic review and meta-analysis.

    Science.gov (United States)

    Calvert, Clara; Ronsmans, Carine

    2015-02-01

    To assess whether pregnancy accelerates HIV disease progression. Studies comparing progression to HIV-related illness, low CD4 count, AIDS-defining illness, HIV-related death, or any death in HIV-infected pregnant and non-pregnant women were included. Relative risks (RR) for each outcome were combined using random effects meta-analysis and were stratified by antiretroviral therapy (ART) availability. 15 studies met the inclusion criteria. Pregnancy was not associated with progression to HIV-related illness [summary RR: 1.32, 95% confidence interval (CI): 0.66-2.61], AIDS-defining illness (summary RR: 0.97, 95% CI: 0.74-1.25) or mortality (summary RR: 0.97, 95% CI: 0.62-1.53), but there was an association with low CD4 counts (summary RR: 1.41, 95% CI: 0.99-2.02) and HIV-related death (summary RR: 1.65, 95% CI: 1.06-2.57). In settings where ART was available, there was no evidence that pregnancy accelerated progress to HIV/AIDS-defining illnesses, death and drop in CD4 count. In settings without ART availability, effect estimates were consistent with pregnancy increasing the risk of progression to HIV/AIDS-defining illnesses and HIV-related or all-cause mortality, but there were too few studies to draw meaningful conclusions. In the absence of ART, pregnancy is associated with small but appreciable increases in the risk of several negative HIV outcomes, but the evidence is too weak to draw firm conclusions. When ART is available, the effects of pregnancy on HIV disease progression are attenuated and there is little reason to discourage healthy HIV-infected women who desire to become pregnant from doing so. © 2014 John Wiley & Sons Ltd.

  8. Nonlinear analysis of the progressive collapse of reinforced concrete plane frames using a multilayered beam formulation

    Directory of Open Access Journals (Sweden)

    C. E. M. Oliveira

    Full Text Available This work investigates the response of two reinforced concrete (RC) plane frames after the loss of a column and their potential resistance for progressive collapse. Nonlinear dynamic analysis is performed using a multilayered Euler/Bernoulli beam element, including elasto-viscoplastic effects. The material nonlinearity is represented using one-dimensional constitutive laws in the material layers, while geometrical nonlinearities are incorporated within a corotational beam formulation. The frames were designed in accordance with the minimum requirements proposed by the reinforced concrete design/building codes of Europe (fib [1-2], Eurocode 2 [3]) and Brazil (NBR 6118 [4]). The load combinations considered for PC analysis follow the prescriptions of DoD [5]. The work verifies if the minimum requirements of the considered codes are sufficient for enforcing structural safety and robustness, and also points out the major differences in terms of progressive collapse potential of the corresponding designed structures.

  9. Annual progress report 1981

    International Nuclear Information System (INIS)

    1982-01-01

    This annual progress report of the CEA Protection and Nuclear Safety Institute outlines a brief description of the progress made in each section of the Institute. Research activities of the Protection department include radiation effects on man, radioecology and the environment, and radioprotection techniques. Research activities of the Nuclear Safety department include reactor safety analysis, fuel cycle facilities safety analysis, and safety research programs. The third section deals with nuclear material security, including security of facilities, security of nuclear material transport and monitoring of nuclear material management [fr

  10. Clinical evaluation of a novel population-based regression analysis for detecting glaucomatous visual field progression.

    Science.gov (United States)

    Kovalska, M P; Bürki, E; Schoetzau, A; Orguel, S F; Orguel, S; Grieshaber, M C

    2011-04-01

    The distinction of real progression from test variability in visual field (VF) series may be based on clinical judgment, on trend analysis based on follow-up of test parameters over time, or on identification of a significant change related to the mean of baseline exams (event analysis). The aim of this study was to compare a new population-based method (Octopus field analysis, OFA) with classic regression analyses and clinical judgment for detecting glaucomatous VF changes. 240 VF series of 240 patients with at least 9 consecutive examinations available were included in this study. They were independently classified by two experienced investigators. The results of this classification served as the reference for comparing the following statistical tests: (a) t-test global, (b) r-test global, (c) regression analysis of 10 VF clusters and (d) point-wise linear regression analysis. 32.5 % of the VF series were classified as progressive by the investigators. The sensitivity and specificity were 89.7 % and 92.0 % for the r-test, and 73.1 % and 93.8 % for the t-test, respectively. In the point-wise linear regression analysis, the specificity was comparable (89.5 % versus 92 %), but the sensitivity was clearly lower than in the r-test (22.4 % versus 89.7 %) at a significance level of p = 0.01. A regression analysis for the 10 VF clusters showed a markedly higher sensitivity for the r-test (37.7 %) than the t-test (14.1 %) at a similar specificity (88.3 % versus 93.8 %) for a significant trend (p = 0.005). With regard to the cluster distribution, the paracentral clusters and the superior nasal hemifield progressed most frequently. The population-based regression analysis seems to be superior to the trend analysis in detecting VF progression in glaucoma, and may eliminate the drawbacks of the event analysis. Further, it may assist the clinician in the evaluation of VF series and may allow better visualization of the correlation between function and structure owing to VF
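
    As a rough sketch of the point-wise linear regression idea (not of the OFA method itself), the code below fits an ordinary least-squares trend to the sensitivity of each visual-field location over follow-up and flags locations whose slope is significantly negative at p = 0.01. The data, slope threshold and number of locations are hypothetical.

      import numpy as np
      from scipy import stats

      def pointwise_progression(times, sensitivities, alpha=0.01, slope_db_per_year=-1.0):
          """Flag visual-field locations with a significant worsening trend.

          times         : (n_visits,) follow-up times in years
          sensitivities : (n_visits, n_locations) sensitivities in dB
          Returns a boolean array, one entry per location.
          """
          flagged = []
          for loc in range(sensitivities.shape[1]):
              res = stats.linregress(times, sensitivities[:, loc])
              # one-sided test for a negative (worsening) slope
              flagged.append(res.slope <= slope_db_per_year and res.pvalue / 2 < alpha)
          return np.array(flagged)

      rng = np.random.default_rng(0)
      t = np.arange(9) * 0.5                                   # 9 visits over 4 years
      vf = 30 - 0.2 * t[:, None] + rng.normal(0, 1, (9, 52))   # mostly stable locations
      vf[:, :5] += -2.0 * t[:, None]                           # 5 progressing locations
      print(pointwise_progression(t, vf).sum(), "locations flagged")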

  11. Surrogate marker analysis in cancer clinical trials through time-to-event mediation techniques.

    Science.gov (United States)

    Vandenberghe, Sjouke; Duchateau, Luc; Slaets, Leen; Bogaerts, Jan; Vansteelandt, Stijn

    2017-01-01

    The meta-analytic approach is the gold standard for validation of surrogate markers, but has the drawback of requiring data from several trials. We refine modern mediation analysis techniques for time-to-event endpoints and apply them to investigate whether pathological complete response can be used as a surrogate marker for disease-free survival in the EORTC 10994/BIG 1-00 randomised phase 3 trial in which locally advanced breast cancer patients were randomised to either taxane or anthracycline based neoadjuvant chemotherapy. In the mediation analysis, the treatment effect is decomposed into an indirect effect via pathological complete response and the remaining direct effect. It shows that only 4.2% of the treatment effect on disease-free survival after five years is mediated by the treatment effect on pathological complete response. There is thus no evidence from our analysis that pathological complete response is a valuable surrogate marker to evaluate the effect of taxane versus anthracycline based chemotherapies on progression free survival of locally advanced breast cancer patients. The proposed analysis strategy is broadly applicable to mediation analyses of time-to-event endpoints, is easy to apply and outperforms existing strategies in terms of precision as well as robustness against model misspecification.
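
    The decomposition quoted above amounts to expressing the total treatment effect as the sum of an indirect effect transmitted through pathological complete response and a remaining direct effect; the "proportion mediated" is their ratio. A purely arithmetic sketch, with hypothetical effect sizes on the 5-year disease-free survival difference scale, is given below.

      # Hypothetical effects on the 5-year disease-free survival probability scale
      total_effect    = 0.048   # difference in 5-year DFS between treatment arms
      indirect_effect = 0.002   # part of the difference transmitted through pCR
      direct_effect   = total_effect - indirect_effect

      proportion_mediated = indirect_effect / total_effect
      print(f"proportion mediated: {proportion_mediated:.1%}")   # about 4.2%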

  12. Analysis of the progression of systolic blood pressure using imputation of missing phenotype values

    OpenAIRE

    Vaitsiakhovich, Tatsiana; Drichel, Dmitriy; Angisch, Marina; Becker, Tim; Herold, Christine; Lacour, André

    2014-01-01

    We present a genome-wide association study of a quantitative trait, "progression of systolic blood pressure in time," in which 142 unrelated individuals of the Genetic Analysis Workshop 18 real genotype data were analyzed. Information on systolic blood pressure and other phenotypic covariates was missing at certain time points for a considerable part of the sample. We observed that the dropout process causing missingness is not independent of the initial systolic blood pressure; that is, the ...

  13. Improved mesh based photon sampling techniques for neutron activation analysis

    International Nuclear Information System (INIS)

    Relson, E.; Wilson, P. P. H.; Biondo, E. D.

    2013-01-01

    The design of fusion power systems requires analysis of neutron activation of large, complex volumes, and the resulting particles emitted from these volumes. Structured mesh-based discretization of these problems allows for improved modeling in these activation analysis problems. Finer discretization of these problems results in large computational costs, which drives the investigation of more efficient methods. Within an ad hoc subroutine of the Monte Carlo transport code MCNP, we implement sampling of voxels and photon energies for volumetric sources using the alias method. The alias method enables efficient sampling of a discrete probability distribution, and operates in O(1) time, whereas the simpler direct discrete method requires O(log(n)) time. By using the alias method, voxel sampling becomes a viable alternative to sampling space with the O(1) approach of uniformly sampling the problem volume. Additionally, with voxel sampling it is straightforward to introduce biasing of volumetric sources, and we implement this biasing of voxels as an additional variance reduction technique that can be applied. We verify our implementation and compare the alias method, with and without biasing, to direct discrete sampling of voxels, and to uniform sampling. We study the behavior of source biasing in a second set of tests and find trends between improvements and source shape, material, and material density. Overall, however, the magnitude of improvements from source biasing appears to be limited. Future work will benefit from the implementation of efficient voxel sampling - particularly with conformal unstructured meshes where the uniform sampling approach cannot be applied. (authors)
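
    A minimal sketch of the alias method mentioned above (the Walker/Vose construction): building the probability and alias tables is O(n), and drawing a voxel index afterwards takes O(1) per sample. The voxel source strengths used here are arbitrary illustrative values, and the MCNP-specific plumbing is omitted.

      import random

      def build_alias(probs):
          """Vose's O(n) construction of the alias tables for a discrete distribution."""
          n = len(probs)
          scaled = [p * n for p in probs]
          prob, alias = [0.0] * n, [0] * n
          small = [i for i, p in enumerate(scaled) if p < 1.0]
          large = [i for i, p in enumerate(scaled) if p >= 1.0]
          while small and large:
              s, l = small.pop(), large.pop()
              prob[s], alias[s] = scaled[s], l
              scaled[l] -= 1.0 - scaled[s]
              (small if scaled[l] < 1.0 else large).append(l)
          for i in small + large:        # leftovers equal 1 up to rounding error
              prob[i] = 1.0
          return prob, alias

      def sample_alias(prob, alias):
          """Draw one index in O(1): pick a column, then accept it or take its alias."""
          i = random.randrange(len(prob))
          return i if random.random() < prob[i] else alias[i]

      # Illustrative voxel source strengths, normalised to a probability distribution
      strengths = [5.0, 1.0, 3.0, 1.0]
      probs = [s / sum(strengths) for s in strengths]
      tables = build_alias(probs)
      counts = [0] * len(probs)
      for _ in range(100_000):
          counts[sample_alias(*tables)] += 1
      print([c / 100_000 for c in counts])   # should approximate [0.5, 0.1, 0.3, 0.1]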

  14. Trends in grazing emission x-ray analysis techniques

    International Nuclear Information System (INIS)

    Grieken, R. van; Tsuji, K.; Injuk, J.

    2000-01-01

    then, the detection limits imposed by the semiconductor industry roadmap can probably not be obtained by tube-excited GE-XRF. The perspectives for tube-excited GE-XRF are thus rather poor. Future developments imply the combination of GE-XRF with synchrotron radiation excitation. Grazing-emission particle-induced X-ray emission (GE-PIXE) suffers from similar quantification problems for material deposited on a carrier, but it makes PIXE a surface-sensitive technique, while normally the protons penetrate some tens of μm into the sample. Similarly, grazing-emission electron probe micro-analysis (GE-EPMA) allows particles on a flat carrier to be analyzed selectively, offers surface sensitivities in the nm rather than the μm range, and yields, in principle, a spatial resolution for chemical analysis similar to the size of the impinging electron beam, rather than of the electron-excited volume. Both GE-PIXE and GE-EPMA need to be explored more fully in the near future. (author)

  15. Romanian medieval earring analysis by X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Therese, Laurent; Guillot, Philippe; Muja, Cristina

    2011-01-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the constituent material's origin, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with an 'S'-shaped end (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument, offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two large earrings

  16. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the constituent material's origin, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-ray fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site, located between the evangelical church and the parsonage, led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even if the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop-shaped earrings with an 'S'-shaped end (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument, offering detailed elemental images with a spatial resolution of 100 μm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  17. Elemental analysis of brazing alloy samples by neutron activation technique

    International Nuclear Information System (INIS)

    Eissa, E.A.; Rofail, N.B.; Hassan, A.M.; El-Shershaby, A.; Walley El-Dine, N.

    1996-01-01

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab
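
    In comparator-based NAA of this kind, an element's concentration is usually obtained by comparing the induced activity of the sample with that of a standard irradiated and counted under the same conditions. The sketch below shows that relative calculation with a simple decay correction; identical irradiation and counting conditions are assumed, and all counts, masses, times and the chosen nuclide are hypothetical.

      import math

      def concentration_relative(counts_sample, counts_std, m_sample, m_std,
                                 c_std, half_life_s, t_decay_sample_s, t_decay_std_s):
          """Relative (comparator) NAA: concentration in the sample from the ratio of
          decay-corrected count rates of sample and standard."""
          lam = math.log(2) / half_life_s
          a_sample = counts_sample * math.exp(lam * t_decay_sample_s)
          a_std = counts_std * math.exp(lam * t_decay_std_s)
          return c_std * (a_sample / m_sample) / (a_std / m_std)

      # Hypothetical numbers: Cu via 66Cu (half-life roughly 5.1 min)
      c = concentration_relative(counts_sample=12500, counts_std=48000,
                                 m_sample=0.105, m_std=0.100,   # grams
                                 c_std=2.0,                      # wt.% Cu in the standard
                                 half_life_s=306,
                                 t_decay_sample_s=600, t_decay_std_s=480)
      print(f"Cu concentration of about {c:.2f} wt.%")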

  18. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    Savage, J.M.; Fernandez, R.F.; Zhang, W.; Robertson, J.D.; Majidi, V.

    1995-01-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorous- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag⁺, Ba²⁺, and Cd²⁺ in the concentration range from 10 ng/g to 1 μg/g; for Cu²⁺ and Pb²⁺ from 10 ng/g to 5 μg/g; and for Hg²⁺ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag⁺, Ba²⁺, Cd²⁺, Cu²⁺, and Pb²⁺ share common binding sites with binding efficiencies varying in the sequence Pb²⁺ > Cu²⁺ > Ag²⁺ > Cd²⁺ > Ba²⁺. The binding of Hg²⁺ involved a different binding site with an increase in binding efficiency in the presence of Ag⁺. (orig.)

  19. A novel preconcentration technique for the PIXE analysis of water

    International Nuclear Information System (INIS)

    Savage, J.M.; Robertson, J.D.; Majidi, V.

    1994-01-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. 5 mg of dried algae powder were mixed with 5 mL of single- and multi-metal solutions. The algae cells were then collected by filtration on 0.6 μm polycarbonate membranes and analyzed by PIXE using a dual energy irradiation. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag⁺, Ba²⁺, and Cd²⁺ in the concentration range from 10 ng/g to 1 μg/g; for Cu²⁺ and Pb²⁺ from 10 ng/g to 5 μg/g; and for Hg²⁺ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium is also replaced

  20. Seismic margin analysis technique for nuclear power plant structures

    International Nuclear Information System (INIS)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin), sponsored by the NRC, and FA (Fragility Analysis), sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for seismic PRA. The CDFM approach is more amenable to use by experienced design engineers, including utility staff design engineers. In this study, a detailed review of the procedures of the CDFM and FA methodologies was performed

  1. Analysis of Program Obfuscation Schemes with Variable Encoding Technique

    Science.gov (United States)

    Fukushima, Kazuhide; Kiyomoto, Shinsaku; Tanaka, Toshiaki; Sakurai, Kouichi

    Program analysis techniques have improved steadily over the past several decades, and software obfuscation schemes have come to be used in many commercial programs. A software obfuscation scheme transforms an original program or a binary file into an obfuscated program that is more complicated and difficult to analyze, while preserving its functionality. However, the security of obfuscation schemes has not been properly evaluated. In this paper, we analyze obfuscation schemes in order to clarify the advantages of our scheme, the XOR-encoding scheme. First, we more clearly define five types of attack models that we defined previously, and define quantitative resistance to these attacks. Then, we compare the security, functionality and efficiency of three obfuscation schemes with encoding variables: (1) Sato et al.'s scheme with linear transformation, (2) our previous scheme with affine transformation, and (3) the XOR-encoding scheme. We show that the XOR-encoding scheme is superior with regard to the following two points: (1) the XOR-encoding scheme is more secure against a data-dependency attack and a brute force attack than our previous scheme, and is as secure against an information-collecting attack and an inverse transformation attack as our previous scheme, (2) the XOR-encoding scheme does not restrict the calculable ranges of programs and the loss of efficiency is less than in our previous scheme.
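
    To make the idea of variable encoding concrete, the toy example below keeps an integer variable in XOR-encoded form (x' = x XOR k) and rewrites the operations that touch it so the program's observable behaviour is unchanged. This is only an illustration of the general XOR-encoding idea, with a hypothetical key; it is not the authors' actual scheme or its security analysis.

      KEY = 0xA5  # hypothetical fixed encoding key

      def encode(x: int) -> int:
          return x ^ KEY

      def decode(x_enc: int) -> int:
          return x_enc ^ KEY

      # Original program: accumulate a running total of byte values.
      def total_plain(data: bytes) -> int:
          total = 0
          for b in data:
              total = total + b
          return total

      # Obfuscated program: 'total' is stored XOR-encoded and is only decoded
      # at the points where its real value is needed.
      def total_obfuscated(data: bytes) -> int:
          total_enc = encode(0)
          for b in data:
              total_enc = encode(decode(total_enc) + b)
          return decode(total_enc)

      assert total_plain(b"obfuscate") == total_obfuscated(b"obfuscate")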

  2. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    Directory of Open Access Journals (Sweden)

    Ricardo E. Izzo

    2011-06-01

    Full Text Available Basketball is a complex sport, which these days has become increasingly linked to its psychophysical aspects rather than to the technical ones. Therefore, it is important to make a thorough study of passing techniques from the point of view of the type of pass and its biomechanics. In terms of the types of passes used, the most used is the two-handed chest pass, with a frequency of 39.9 %. This is followed, in terms of frequency, by one-handed passes – the baseball pass, with 20.9 % – and by the two-handed over-the-head pass, with 18.2 %, and finally, one- or two-handed indirect passes (bounces), with 11.2 % and 9.8 %. Considering the most used pass in basketball, from the biomechanical point of view, the muscles involved in the correct movement include all the muscles of the upper extremity, plus the shoulder muscles as well as the body fixators (abdominals, hip flexors, knee extensors, and dorsal flexors of the foot). The technical and conditional analysis considers the throwing speed, the throw height and the air resistance. In conclusion, the aim of this study is to give some guidelines to improve the mechanical execution of the movements in training, without neglecting the importance of the harmony of the movements themselves.

  3. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis

    International Nuclear Information System (INIS)

    Mohamed, A.

    1998-01-01

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo ''Eigenvalue of the World'' problem. Argonne presented a paper at that session in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result compared with conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results
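
    A minimal sketch of what stratified source-sampling means in this context: rather than drawing all N fission source sites multinomially from the site weights (which can occasionally leave a loosely coupled unit with very few sites in a generation), each spatial stratum is first guaranteed its expected integer number of sites, and only the fractional remainders are sampled randomly. The strata, weights and counts below are hypothetical, and the example is schematic rather than the Argonne implementation.

      import math
      import random

      def stratified_source_counts(weights, n_total, rng=random):
          """Allocate n_total source sites among strata with probabilities
          proportional to 'weights': each stratum gets floor(n*p) sites for sure,
          and the few leftover sites are drawn randomly from the fractional parts."""
          total_w = sum(weights)
          expected = [n_total * w / total_w for w in weights]
          counts = [math.floor(e) for e in expected]
          leftover = n_total - sum(counts)
          fractions = [e - c for e, c in zip(expected, counts)]
          if leftover:
              for i in rng.choices(range(len(weights)), weights=fractions, k=leftover):
                  counts[i] += 1
          return counts

      # Three loosely coupled units with very unequal fission source weights
      print(stratified_source_counts([0.70, 0.25, 0.05], n_total=1000))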

  4. Elemental analysis of brazing alloy samples by neutron activation technique

    Energy Technology Data Exchange (ETDEWEB)

    Eissa, E A; Rofail, N B; Hassan, A M [Reactor and Neutron physics Department, Nuclear Research Centre, Atomic Energy Authority, Cairo (Egypt); El-Shershaby, A; Walley El-Dine, N [Physics Department, Faculty of Girls, Ain Shams Universty, Cairo (Egypt)

    1997-12-31

    Two brazing alloy samples (CP2 and CP3) have been investigated by the neutron activation analysis (NAA) technique in order to identify and estimate their constituent elements. The pneumatic irradiation rabbit system (PIRS), installed at the first Egyptian research reactor (ETRR-1), was used for short-time irradiation (30 s) with a thermal neutron flux of 1.6 × 10¹¹ n/cm²/s in the reactor reflector, where the thermal to epithermal neutron flux ratio is 106. Long-time irradiation (48 hours) was performed at the reactor core periphery with a thermal neutron flux of 3.34 × 10¹² n/cm²/s and a thermal to epithermal neutron flux ratio of 79. Activation by epithermal neutrons was taken into account for the (1/v) and resonance neutron absorption in both methods. A hyper-pure germanium detection system was used for gamma-ray acquisitions. The concentration values of Al, Cr, Fe, Co, Cu, Zn, Se, Ag and Sb were estimated as percentages of the sample weight and compared with reported values. 1 tab.

  5. EG-13GENOME-WIDE METHYLATION ANALYSIS IDENTIFIES GENOMIC DNA DEMETHYLATION DURING MALIGNANT PROGRESSION OF GLIOMAS

    Science.gov (United States)

    Saito, Kuniaki; Mukasa, Akitake; Nagae, Genta; Aihara, Koki; Otani, Ryohei; Takayanagi, Shunsaku; Omata, Mayu; Tanaka, Shota; Shibahara, Junji; Takahashi, Miwako; Momose, Toshimitsu; Shimamura, Teppei; Miyano, Satoru; Narita, Yoshitaka; Ueki, Keisuke; Nishikawa, Ryo; Nagane, Motoo; Aburatani, Hiroyuki; Saito, Nobuhito

    2014-01-01

    Low-grade gliomas often undergo malignant progression, and these transformations are a leading cause of death in patients with low-grade gliomas. However, the molecular mechanisms underlying malignant tumor progression are still not well understood. Recent evidence indicates that epigenetic deregulation is an important cause of gliomagenesis; therefore, we examined the impact of epigenetic changes during malignant progression of low-grade gliomas. Specifically, we used the Illumina Infinium Human Methylation 450K BeadChip to perform genome-wide DNA methylation analysis of 120 gliomas and four normal brains. The study sample included 25 matched pairs of initial low-grade gliomas and recurrent tumors (temporal heterogeneity), 20 of which recurred as malignant progressions, and one matched pair of a newly emerging malignant lesion and a pre-existing lesion (spatial heterogeneity). Analyses of methylation profiles demonstrated that most low-grade gliomas in our sample (43/51; 84%) had a CpG island methylator phenotype (G-CIMP). Remarkably, approximately 50% of secondary glioblastomas that had progressed from low-grade tumors with the G-CIMP status exhibited a characteristic partial demethylation of genomic DNA during malignant progression, but other recurrent gliomas showed no apparent change in DNA methylation pattern. Interestingly, we found that most loci that were demethylated during malignant progression were located outside of CpG islands. Information on histone modification patterns in normal human astrocytes and embryonic stem cells also showed that the ratio of active marks at the sites corresponding to demethylated loci in G-CIMP-demethylated tumors was significantly lower; this finding indicated that most demethylated loci in G-CIMP-demethylated tumors were likely transcriptionally inactive. A small number of the genes that were upregulated and had demethylated CpG islands were associated with cell cycle-related pathway. In

  6. Ultraviolet-Visible and Fluorescence Spectroscopy Techniques Are Important Diagnostic Tools during the Progression of Atherosclerosis: Diet Zinc Supplementation Retarded or Delayed Atherosclerosis

    Science.gov (United States)

    Abdelhalim, Mohamed Anwar K.; Moussa, Sherif A. Abdelmottaleb; AL-Mohy, Yanallah Hussain

    2013-01-01

    Background. In this study, we examined whether UV-visible and fluorescence spectroscopy techniques detect the progression of atherosclerosis in serum of rabbits fed on a high-cholesterol diet (HCD) and HCD supplemented with zinc (HCD + Zn) compared with the control. Methods. The control rabbit group was fed on 100 g/day of normal diet. The HCD group was fed on Purina Certified Rabbit Chow supplemented with 1.0% cholesterol plus 1.0% olive oil (100 g/day) for the same period. The HCD + Zn group was fed on normal Purina Certified Rabbit Chow plus 1.0% cholesterol and 1.0% olive oil supplemented with 470 ppm Zn for the same feeding period. UV-visible and fluorescence spectra and biochemical parameters were measured in rabbit blood serum, and hematology was measured in whole blood. Results. We found that the fluorescent peak of HCD shifted toward the UV-visible wavelength range compared with the control when serum was excited at 192 nm. In addition, the results showed that zinc supplementation (350 ppm) restored the fluorescent peak close to that of the control. Using the UV-visible spectroscopy approach, we found that the peak absorbance of HCD (at about 280 nm) was higher than that of the control and that zinc supplementation seemed to decrease the absorbance. Conclusions. This study demonstrates that ultraviolet-visible and fluorescence spectroscopy techniques can be applied as noninvasive techniques on a blood serum sample for diagnosing or detecting the progression of atherosclerosis. Zn supplementation in rabbits fed on HCD delays or retards the progression of atherosclerosis. Inducing anemia in rabbits fed on HCD delays the progression of atherosclerosis. PMID:24350281

  7. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
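
    For readers unfamiliar with these chemometric tools, the sketch below shows the generic pattern of fitting a PLS calibration model and a PCA decomposition to a spectra matrix with scikit-learn. The spectra, component counts and concentration targets are synthetic stand-ins, not the LIBS data from this paper.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.decomposition import PCA
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      X = rng.normal(size=(18, 500))          # 18 synthetic "spectra", 500 channels
      y = rng.uniform(0, 60, size=(18, 1))    # synthetic oxide concentration (wt.%)

      X_std = StandardScaler().fit_transform(X)

      # PLS calibration: predict composition from spectra
      pls = PLSRegression(n_components=5).fit(X_std, y)
      y_hat = pls.predict(X_std)

      # PCA: unsupervised decomposition of the spectra (scores can feed a classifier)
      pca = PCA(n_components=3)
      scores = pca.fit_transform(X_std)
      print("explained variance ratios:", pca.explained_variance_ratio_)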

  8. Improved X-ray diagnosis of stomach by progress in the development of contrast media and examination techniques

    International Nuclear Information System (INIS)

    Lotz, W.

    1982-01-01

    Three factors have been responsible for the advances during the past few years in X-ray examination of the stomach: improvement of the contrast media used; introduction of rare-earth screens; and examination techniques imaging all sections of the stomach and of the duodenal bulb under hypotonia in the double-contrast technique, in complete filling, and imaging the accessible sections by means of proper compression. An interesting technique employs a combination of two different barium sulphate suspensions used at the same time, e.g. Bubbly Barium or some other barium sulphate preparation with a small amount of high-density barium, yielding excellent images of the gastric mucosa (two-contrast-media technique). (orig.) [de

  9. Who pays for healthcare in Bangladesh? An analysis of progressivity in health systems financing.

    Science.gov (United States)

    Molla, Azaher Ali; Chi, Chunhuei

    2017-09-06

    The relationship between payments towards healthcare and ability to pay is a measure of financial fairness. Analysis of progressivity is important from an equity perspective as well as for macroeconomic and political analysis of healthcare systems. Bangladesh health systems financing is characterized by high out-of-pocket payments (63.3%), which is increasing. Hence, we aimed to see who pays what part of this high out-of-pocket expenditure. To our knowledge, this was the first progressivity analysis of health systems financing in Bangladesh. We used data from Bangladesh Household Income and Expenditure Survey, 2010. This was a cross sectional and nationally representative sample of 12,240 households consisting of 55,580 individuals. For quantification of progressivity, we adopted the 'ability-to-pay' principle developed by O'Donnell, van Doorslaer, Wagstaff, and Lindelow (2008). We used the Kakwani index to measure the magnitude of progressivity. Health systems financing in Bangladesh is regressive. Inequality increases due to healthcare payments. The differences between the Gini coefficient and the Kakwani index for all sources of finance are negative, which indicates regressivity, and that financing is more concentrated among the poor. Income inequality increases due to high out-of-pocket payments. The increase in income inequality caused by out-of-pocket payments is 89% due to negative vertical effect and 11% due to horizontal inequity. Our findings add substantial evidence of health systems financing impact on inequitable financial burden of healthcare and income. The heavy reliance on out-of-pocket payments may affect household living standards. If the government and people of Bangladesh are concerned about equitable financing burden, our study suggests that Bangladesh needs to reform the health systems financing scheme.
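
    The Kakwani index used in this study is the difference between the concentration index of health payments (with households ranked by ability to pay) and the Gini coefficient of that ability-to-pay measure; a negative value indicates regressive financing. A minimal sketch of the calculation on hypothetical household data, using the convenient covariance formula for the concentration index, is given below.

      import numpy as np

      def concentration_index(values, rank_by):
          """Concentration index of `values` when households are ranked by `rank_by`.
          With rank_by = values this reduces to the Gini coefficient."""
          order = np.argsort(rank_by)
          v = np.asarray(values, dtype=float)[order]
          n = len(v)
          ranks = (np.arange(1, n + 1) - 0.5) / n            # fractional ranks
          return 2.0 * np.cov(v, ranks, bias=True)[0, 1] / v.mean()

      rng = np.random.default_rng(0)
      income = rng.lognormal(mean=10, sigma=0.8, size=5000)
      # Hypothetical out-of-pocket payments: a roughly constant absolute amount,
      # which takes a larger share of income from poorer households (regressive).
      oop = 2000 + 0.01 * income + rng.normal(0, 300, size=5000).clip(min=0)

      gini_income = concentration_index(income, rank_by=income)
      ci_payments = concentration_index(oop, rank_by=income)
      kakwani = ci_payments - gini_income
      print(f"Gini {gini_income:.3f}, payment CI {ci_payments:.3f}, Kakwani {kakwani:.3f}")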

  10. Progress in element analysis on a high-voltage electron microscope

    International Nuclear Information System (INIS)

    Tivol, W.F.; Barnard, D.; Guha, T.

    1985-01-01

    X-Ray microprobe (XMA) and electron energy-loss (EELS) spectrometers have been installed on the high-voltage electron microscope (HVEM). The probe size has been measured and background reduction is in progress for XMA and EELS as are improvements in electron optics for EELS and sensitivity measurements. XMA is currently useful for qualitative analysis and has been used by several investigators from our laboratory and outside laboratories. However, EELS background levels are still too high for meaningful results to be obtained. Standards suitable for biological specimens are being measured, and a library for quantitative analysis is being compiled

  11. A genetic analysis of Adh1 regulation. Progress report, June 1991--February 1992

    Energy Technology Data Exchange (ETDEWEB)

    Freeling, M.

    1992-03-01

    The overall goal of our research proposal is to understand the meaning of the various cis-acting sites responsible for Adh1 expression in the entire maize plant. Progress is reported in the following areas: studies on the TATA box and analysis of revertants of the Adh1-3F1124 allele; screening for additional mutants that affect Adh1 expression differentially; studies on cis-acting sequences required for root-specific Adh1 expression; refinement of the use of the particle gun; and functional analysis of a non-glycolytic anaerobic protein.

  12. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two year research project entitled Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence, (AI), reliability and risk analysis and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  13. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two year research project entitled ``Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants.'' The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence, (AI), reliability and risk analysis and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  14. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    International Nuclear Information System (INIS)

    Okrent, D.

    1989-01-01

    This final report summarizes the accomplishments of a two year research project entitled ''Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants.'' The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence, (AI), reliability and risk analysis and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed

  15. Current landscape of protein glycosylation analysis and recent progress toward a novel paradigm of glycoscience research.

    Science.gov (United States)

    Yamamoto, Sachio; Kinoshita, Mitsuhiro; Suzuki, Shigeo

    2016-10-25

    This review covers the basics and some applications of methodologies for the analysis of glycoprotein glycans. Analytical techniques used for glycoprotein glycans, including liquid chromatography (LC), capillary electrophoresis (CE), mass spectrometry (MS), and high-throughput analytical methods based on microfluidics, are described to supply the essentials concerning biopharmaceutical and biomarker glycoproteins. We also describe the MS analysis of glycoproteins and glycopeptides, as well as the chemical and enzymatic methods for releasing glycans from glycoproteins and the chemical reactions used for the derivatization of glycans. We hope the techniques described accommodate most of the needs of glycoproteomics researchers. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Comparative Analysis of the Dark Ground Buffy Coat Technique (DG ...

    African Journals Online (AJOL)

    The prevalence of trypanosome infection in 65 cattle reared under an extensive system of management was determined using the dark ground buffy coat (DG) technique and the enzyme-linked immunosorbent assay (ELISA). The DG technique showed that there were 18 positive cases (27.69%) of the total number of animals, made ...

  17. Efficiency analysis of orthokeratological correction in the treatment of progressive myopia in childhood

    Directory of Open Access Journals (Sweden)

    Dolgova Е.А.

    2017-06-01

    Full Text Available Objective: to evaluate the effectiveness of orthokeratological correction in the treatment of progressive myopia in children. Material and methods. A retrospective study of myopia correction with orthokeratology (OK) lenses in 100 patients (178 eyes) with an established diagnosis of moderate myopia, undergoing treatment at the Clinic of Eye Diseases of SSMU. Group I consisted of 50 patients who received "Emerald" OK lenses by Euclid Systems Corporation (USA); the wear time was 2 years. Group II included 50 patients (87 eyes) using spectacle correction over the same period. Patients underwent visual acuity testing, biomicroscopy, refractometry, determination of accommodation reserves, ultrasonic biometry (IOL-Master, Carl Zeiss) measuring the anteroposterior (axial) length of the eyeball (PZO), and a follow-up examination after 2 years of using the selected correction. Results. Comparative analysis showed that the use of OK lenses in patients with moderate myopia resulted in a significant reduction in the rate of myopia progression. Over 2 years of OK lens wear the increase in PZO was 0.09±0.05 mm, whereas with spectacle correction of progressive myopia the changes were more significant, with a PZO increase of 0.36±0.11 mm. Conclusion. OK lenses showed an inhibitory effect on myopia progression, confirmed by refraction and ultrasound biometry over the 2 years of the study. The data allow us to recommend OK therapy as an effective tool for managing progressive myopia.

  18. Agreement among graders on Heidelberg retina tomograph (HRT) topographic change analysis (TCA) glaucoma progression interpretation.

    Science.gov (United States)

    Iester, Michele M; Wollstein, Gadi; Bilonick, Richard A; Xu, Juan; Ishikawa, Hiroshi; Kagemann, Larry; Schuman, Joel S

    2015-04-01

    To evaluate agreement among experts in interpreting Heidelberg retina tomograph (HRT) topographic change analysis (TCA) printouts for glaucoma progression, and to explore methods for improving agreement. 109 eyes of glaucoma, glaucoma suspect and healthy subjects with ≥5 visits and 2 good quality HRT scans acquired at each visit were enrolled. TCA printouts were graded as progression or non-progression. Each grader was presented with 2 sets of tests: a randomly selected single test from each visit and both tests from each visit. Furthermore, the TCA printouts were classified with each grader's individual criteria and with predefined criteria (reproducible changes within the optic nerve head, disregarding changes along blood vessels or at steep rim locations and signs of image distortion). Agreement among graders was modelled using common latent factor measurement error structural equation models for ordinal data. Assessment of two scans per visit without using the predefined criteria reduced overall agreement, as indicated by a reduction in the slope, reflecting the correlation with the common factor, for all graders, with no reduction in the range of the intercepts between graders. Using the predefined criteria improved grader agreement, as indicated by the narrower range of intercepts among the graders compared with assessment using individual graders' criteria. A simple set of predefined common criteria improves agreement between graders in assessing TCA progression. The inclusion of additional scans from each visit does not improve the agreement. We, therefore, recommend setting standardised criteria for TCA progression evaluation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  19. Retrospective analysis of factors affecting the progression of Chronic Renal Failure in Adult Polycystic Kidney Disease

    International Nuclear Information System (INIS)

    Ahmed, E.R.; Tashkandi, Muhammed A.; Nahrir, S.; Maulana, A.

    2006-01-01

    Autosomal dominant polycystic kidney disease (ADPKD) is the commonest congenital cystic renal disease. Factors such as hypertension, urinary tract infection, hematuria and proteinuria may affect the progression to chronic renal failure in ADPKD patients. Therapeutic interventions, such as the use of angiotensin converting enzyme inhibitors (ACEI) or diet modification, may impact the natural progression of the disease. We aim in this study to review a registry of ADPKD patients in order to compare the slow and fast progressors and identify possible predictors of progression and interventions that slow the progression of this disease. Sheffield Kidney Institute (SKI), one of the largest kidney institutes in Northern Europe, has registered a large number of ADPKD patients since 1981. SKI's computer network contains a wide range of information on these patients. We selected 94 adult polycystic kidney disease patients from the SKI for retrospective analysis of factors affecting progression to chronic renal failure. Patients who doubled their s. creatinine in ≤ 36 months were considered fast progressors (FP), while those who doubled their s. creatinine in > 36 months were regarded as slow progressors (SP). There were 70 patients in the FP group and 24 patients in the SP group. A third group of 137 patients consisted of non-progressors (NP) who had stable s. creatinine levels during the same period. We found that the incidence of hypertension, UTI, macroscopic and microscopic hematuria, and overt proteinuria in the FP group was higher than in the SP and NP groups. Modification of some factors, such as hypertension and UTI, may decrease the rate of the deterioration of renal function. (author)

  20. Comparative analysis of station blackout accident progression in typical PWR, BWR, and PHWR

    International Nuclear Information System (INIS)

    Park, Soo Young; Ahn, Kwang Il

    2012-01-01

    Since the crisis at the Fukushima plants, severe accident progression during a station blackout accident in nuclear power plants is recognized as a very important area for accident management and emergency planning. The purpose of this study is to investigate the comparative characteristics of anticipated severe accident progression among the three typical types of nuclear reactors. A station blackout scenario, where all off-site power is lost and the diesel generators fail, is simulated as an initiating event of a severe accident sequence. In this study, a comparative analysis was performed for a typical pressurized water reactor (PWR), boiling water reactor (BWR), and pressurized heavy water reactor (PHWR). The study includes a summary of the design differences that would impact severe accident progression; a thermal-hydraulic/severe accident phenomenological analysis during a station blackout-initiated severe accident; and an investigation of the core damage process, both within the reactor vessel before it fails and in the containment afterwards, and the resultant impact on the containment.

  1. A joint frailty-copula model between tumour progression and death for meta-analysis.

    Science.gov (United States)

    Emura, Takeshi; Nakatochi, Masahiro; Murotani, Kenta; Rondeau, Virginie

    2017-12-01

    Dependent censoring often arises in biomedical studies when time to tumour progression (e.g., relapse of cancer) is censored by an informative terminal event (e.g., death). For meta-analysis combining existing studies, a joint survival model between tumour progression and death has been considered under semicompeting risks, which induces dependence through the study-specific frailty. Our paper utilizes copulas to generalize the joint frailty model by introducing an additional source of dependence arising from the intra-subject association between tumour progression and death. The practical value of the new model is particularly evident for meta-analyses in which only a few covariates are consistently measured across studies and hence there exists residual dependence. The covariate effects are formulated through the Cox proportional hazards model, and the baseline hazards are nonparametrically modeled on a spline basis. The estimator is then obtained by maximizing a penalized log-likelihood function. We also show that the present methodologies are easily modified for competing risks or recurrent event data, and are generalized to accommodate left-truncation. Simulations are performed to examine the performance of the proposed estimator. The method is applied to a meta-analysis assessing the recently suggested biomarker CXCL12 for survival in ovarian cancer patients. We implement our proposed methods in the R joint.Cox package.

  2. A novel preconcentration technique for the PIXE analysis of water

    Energy Technology Data Exchange (ETDEWEB)

    Savage, J.M. [Element Analysis Corp., Lexington, KY (United States); Fernandez, R.F. [Element Analysis Corp., Lexington, KY (United States); Zhang, W. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States); Robertson, J.D. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States); Majidi, V. [Department of Chemistry, University of Kentucky, Lexington, KY 40506-0055 (United States)

    1995-05-01

    The potential of using dried algae as a novel preconcentration technique for the analysis of water samples by PIXE was examined. The algae cells were found to contain significant levels of P and S, indicative of phosphorous- and sulfur-containing groups on the cell wall or inside the algae cells which may serve as potential binding sites for metal ions. When C. vulgaris was used on mixed metal solutions, linear responses were observed for Ag⁺, Ba²⁺, and Cd²⁺ in the concentration range from 10 ng/g to 1 μg/g; for Cu²⁺ and Pb²⁺ from 10 ng/g to 5 μg/g; and for Hg²⁺ from 10 ng/g to 10 μg/g. When S. bacillaris was used, linear responses were observed from 10 ng/g up to 10 μg/g for all of the metal cations investigated. The PIXE results demonstrated that metal binding at low concentrations involves replacement of sodium on the cell wall and that at high concentrations magnesium was also replaced. Competitive binding studies indicate that the metal ions Ag⁺, Ba²⁺, Cd²⁺, Cu²⁺, and Pb²⁺ share common binding sites with binding efficiencies varying in the sequence Pb²⁺ > Cu²⁺ > Ag²⁺ > Cd²⁺ > Ba²⁺. The binding of Hg²⁺ involved a different binding site with an increase in binding efficiency in the presence of Ag⁺. (orig.).

  3. A study of atriphos (ATP) action on muscular circulation in progressive muscular dystrophy by the radioactive xenon clearance technique

    International Nuclear Information System (INIS)

    Chakyrov, B.; Samardzhiev, A.

    1977-01-01

    The effect of intramuscularly and intravenously administered atriphos on the muscular circulation was studied with radioactive xenon in 12 children with progressive muscular dystrophy. After combined local intramuscular injection of ATP (atriphos) with the radioactive marker, a 12-fold increase in muscular circulation ensues, lasting about 15 minutes. No vasodilating effect on the muscular flow was observed after intravenous injection of 20-40 mg of atriphos. It is believed that intramuscular administration of atriphos produces dilatation of the capillaries and of the venous part of the muscular circulation. (author)

  4. Progress in ETA-II magnetic field alignment using stretched wire and low energy electron beam techniques

    International Nuclear Information System (INIS)

    Griffith, L.V.; Deadrick, F.J.

    1991-01-01

    Flux line alignment of the solenoidal focus magnets used on the ETA-II linear induction accelerator is a key element leading to a reduction of beam corkscrew motion. Two techniques have been used on the ETA-II accelerator to measure and establish magnet alignment. A low energy electron beam has been used to directly map magnetic field lines, and recent work has utilized a pulsed stretched wire technique to measure magnet tilts and offsets with respect to a reference axis. This paper reports on the techniques used in the ETA-II accelerator alignment, and presents results from those measurements which show that the accelerator is magnetically aligned to within ∼ ±200 microns

  5. Conformational Analysis of Misfolded Protein Aggregation by FRET and Live-Cell Imaging Techniques

    Directory of Open Access Journals (Sweden)

    Akira Kitamura

    2015-03-01

    Full Text Available Cellular homeostasis is maintained by several types of protein machinery, including molecular chaperones and proteolysis systems. Dysregulation of the proteome disrupts homeostasis in cells, tissues, and the organism as a whole, and has been hypothesized to cause neurodegenerative disorders, including amyotrophic lateral sclerosis (ALS) and Huntington’s disease (HD). A hallmark of neurodegenerative disorders is formation of ubiquitin-positive inclusion bodies in neurons, suggesting that the aggregation process of misfolded proteins changes during disease progression. Hence, high-throughput determination of soluble oligomers during the aggregation process, as well as the conformation of sequestered proteins in inclusion bodies, is essential for elucidation of physiological regulation mechanisms and for drug discovery in this field. To elucidate the interaction, accumulation, and conformation of aggregation-prone proteins, in situ spectroscopic imaging techniques, such as Förster/fluorescence resonance energy transfer (FRET), fluorescence correlation spectroscopy (FCS), and bimolecular fluorescence complementation (BiFC), have been employed. Here, we summarize recent reports in which these techniques were applied to the analysis of aggregation-prone proteins (in particular their dimerization, interactions, and conformational changes), and describe several fluorescent indicators used for real-time observation of physiological states related to proteostasis.

  6. Short analysis of a progressive distorsion problem (tension and cyclic torsion)

    International Nuclear Information System (INIS)

    Roche, Roland.

    1978-06-01

    Tests on ratcheting (or progressive distortion) are in progress at Saclay. A thin tube is subjected to a constant tensile load and to a cyclic twist. The present paper is a short theoretical analysis of that case. A uniform strain and stress field is considered, with a constant tensile stress P (primary stress) and a cyclic shearing strain. The shearing strain is characterized by the corresponding elastic equivalent stress intensity (Tresca criterion). The cyclic range of this stress intensity is ΔQ (secondary stress range). The shakedown condition and the incremental elongations are examined with different constitutive equations of the material. Special attention is given to perfect plasticity, and bilinear kinematic hardening results are presented, but it is believed that these mathematical material models are simplistic, and special experimental tests are proposed [fr

  7. Application status of on-line nuclear techniques in analysis of coal quality

    International Nuclear Information System (INIS)

    Cai Shaohui

    1993-01-01

    Nuclear techniques are well suited for continuous on-line analysis because they are fast and non-intrusive, and they can be used in the adverse conditions of the coal industry. The paper reviews the application status of on-line nuclear techniques in the analysis of coal quality and the economic benefits derived from such techniques in developed countries

  8. Rapid analysis of steels using laser-based techniques

    International Nuclear Information System (INIS)

    Cremers, D.A.; Archuleta, F.L.; Dilworth, H.C.

    1985-01-01

    Based on the data obtained by this study, we conclude that laser-based techniques can be used to provide at least semi-quantitative information about the elemental composition of molten steel. Of the two techniques investigated here, the Sample-Only method appears preferable to the LIBS (laser-induced breakdown spectroscopy) method because of its superior analytical performance. In addition, the Sample-Only method would probably be easier to incorporate into a steel plant environment. However, before either technique can be applied to steel monitoring, additional research is needed

  9. Analysis of Piezoelectric Structural Sensors with Emergent Computing Techniques

    Science.gov (United States)

    Ramers, Douglas L.

    2005-01-01

    pressurizing the bottle on a test stand, and running sweeps of excitation frequencies for each of the piezo sensors and recording the resulting impedance. The sweeps were limited to 401 points by the available analyzer, and it was decided to perform individual sweeps at five different excitation frequency ranges. The frequency ranges used for the PZTs were different in two of the five ranges from the ranges used for the SCP. The bottles were pressurized to empty (no water), 0 psig, 77 psig, 155 psig, and 227 psig, in nearly uniform increments of about 77 psi. One of each of the two types of piezo sensors was fastened onto the bottle surface at two locations: about midway between the ends on the cylindrical portion of the bottle and at the very edge of one of the end domes. The data were collected in files by sensor type (2 cases), by location (2 cases), by frequency range (5 cases), and by pressure (5 cases) to produce 100 data sets of 401 impedances. After familiarization with the piezo sensing technology and obtaining the data, the team developed a set of questions to try to answer regarding the data and made assignments of responsibilities. The next section lists the questions, and the remainder of the report describes the data analysis work performed by Dr. Ramers. This includes a discussion of the data, the approach to answering the questions using statistical techniques, the use of an emergent system to investigate the data where statistical techniques were not usable, conclusions regarding the data, and recommendations.

  10. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and how frequently, and to spot trends in customer preferences that can lead to new product development. In this paper, we take a systematic approach to exploring several data mining techniques in business applications. The experimental results reveal that all of the data mining techniques accomplish their goals perfectly, but each technique has its own characteristics and specifications that determine its accuracy, proficiency and suitability.

  11. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

    Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting suitable specific judo exercises for a target motor ability, it is necessary first to study the structure of the specific judo techniques and the activity of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a starting point for selecting a particular complex of specific exercises to produce the greatest effect. In addition to developing particular muscle groups, the means of specific preparation also affect the development of those motor abilities regarded as indispensable for the qualities characteristic of judo. This paper analyses the relationship between judo techniques and specific motor abilities.

  12. A Comparative Analysis of Machine Learning Techniques for Credit Scoring

    OpenAIRE

    Nwulu, Nnamdi; Oroja, Shola; İlkan, Mustafa

    2012-01-01

    Credit scoring has become an oft-researched topic in light of the increasing volatility of the global economy and the recent world financial crisis. Amidst the many methods used for credit scoring, machine learning techniques are becoming increasingly popular due to their efficient and accurate nature and relative simplicity. Furthermore, machine learning techniques minimize the risk of human bias and error and maximize speed, as they are able to perform computation...

  13. Genome sequencing and analysis conferences. Progress report, August 15, 1993--August 15, 1994

    Energy Technology Data Exchange (ETDEWEB)

    Venter, J.C.

    1995-10-01

    The 14 plenary session presentations focused on nematode; yeast; fruit fly; plants; mycobacteria; and man. In addition there were presentations on a variety of technical innovations including database developments and refinements, bioelectronic genesensors, computer-assisted multiplex techniques, and hybridization analysis with DNA chip technology. This document includes only the session schedule.

  14. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Background: The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of obtained peptide-spectrum matches (PSMs) needs to be evaluated also by algorithms, as a manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem, where sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results: Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance to retrieve more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. Conclusion: Our approach not only enhances the computational performance, and
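
    As background to the target-decoy strategy discussed in this abstract, the sketch below shows a generic score-threshold selection based on a decoy-derived false discovery rate estimate. It is a minimal illustration with invented scores, assuming higher scores are better; it does not reproduce MUMAL, MUDE or PeptideProphet.

        # Minimal sketch of target-decoy FDR estimation for peptide-spectrum matches (PSMs).
        # All numbers are invented; higher scores are assumed to be better.

        def fdr_threshold(target_scores, decoy_scores, max_fdr=0.05):
            """Return the lowest score threshold whose estimated FDR stays within max_fdr."""
            best = None
            for t in sorted(set(target_scores), reverse=True):
                n_target = sum(s >= t for s in target_scores)
                n_decoy = sum(s >= t for s in decoy_scores)
                fdr = n_decoy / n_target if n_target else 0.0  # decoy hits estimate false matches
                if fdr <= max_fdr:
                    best = t      # keep lowering the threshold while the FDR stays acceptable
                else:
                    break
            return best

        target = [9.1, 8.7, 8.2, 7.9, 7.5, 6.8, 6.1, 5.9, 4.2, 3.3]
        decoy = [5.8, 4.9, 4.1, 3.8, 3.0, 2.7, 2.2, 1.9, 1.5, 1.1]
        print(fdr_threshold(target, decoy, max_fdr=0.10))  # -> 5.9 with these invented scores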

  15. Nucleic acid hybridization and radioimmunoassay techniques for studying the interrelationships among the progressive pneumonia viruses of sheep

    International Nuclear Information System (INIS)

    Weiss, M.J.

    1976-01-01

    In Section I of this thesis, experiments were performed to determine if three representative "slow" viruses of sheep (VV, MV and PPV) replicate by way of a DNA "provirus" in a manner similar to the RNA tumor viruses. The approach used was to determine if unique virus-specific DNA sequences not present in normal cells could be detected in the DNA of infected cell cultures. The results presented demonstrate that infection by VV, MV and PPV results in the synthesis of proviral DNA. Sections II and III examine the similarities among VV, MV and PPV. In Section II, the RNA genomes of these viruses were compared by nucleic acid hybridization. The homology among these viral RNAs was determined from the extensive competition of homologous viral RNA-cDNA hybrids by heterologous RNA and from the thermal stability of homologous and heterologous RNA-cDNA hybrids. The 70S RNAs of visna and maedi virus were indistinguishable but only partially homologous to that of progressive pneumonia virus. Section III describes the purification of the major internal protein component of VV, p27, the development of a radioimmunoassay to study its antigenic relatedness to the corresponding proteins of PPV and MV, and its use in the detection of cross-reacting proteins in progressive pneumonia virus infected sheep lung. The ability to detect unique virus-related DNA sequences and viral antigens in infected sheep tissues makes it now feasible to search for slow virus related DNA sequences and/or antigens in human diseases which bear resemblance to the slow diseases of sheep

  16. Recent progress in methods for non-invasive measurements of local strain in practical superconducting wires and conductors using quantum beam techniques

    International Nuclear Information System (INIS)

    Osamura, Kozo; Machiya, Shutaro; Tsuchiya, Yoshinori; Suzuki, Hiroshi; Awaji, Satoshi; Takahashi, Kohki; Oguro, Hidetoshi; Harjo, Stefanus; Hemmi, Tsutomu; Nakamoto, Tatsushi; Sugano, Michinaka; Jin, Xinzhe; Kajiwara, Kentaro

    2014-01-01

    Practical superconducting wires are designed with a composite structure to meet the desired engineering characteristics by expert selection of materials and design of the architecture. In practice, the local strain exerted on the superconducting component influences the electromagnetic properties. Here, recent progress in methods used to measure the local strain in practical superconducting wires and conductors using quantum beam techniques is introduced. Recent topics on the strain dependence of critical current are reviewed for three major practical wires: ITER-Nb3Sn strand, DI-BSCCO wires and REBCO tapes. (author)

  17. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described (including the reactor as a neutron source, sample activation in the reactor, methodology of neutron activation analysis, sample transport into the reactor and sample packaging after irradiation, instrumental activation analysis with radiochemical separation, data measurement and evaluation, sampling and sample preparation). Sources of environmental contamination with trace elements, sampling and sample analysis by neutron activation are described. The analysis is described of soils, waters and biological materials. Methods are shown of evaluating neutron activation analysis results and of their interpretation for purposes of environmental control. (J.B.)

  18. Progress of a Cross-Correlation Based Optical Strain Measurement Technique for Detecting Radial Growth on a Rotating Disk

    Science.gov (United States)

    Clem, Michelle M.; Woike, Mark R.; Abdul-Aziz, Ali

    2014-01-01

    The Aeronautical Sciences Project under NASA's Fundamental Aeronautics Program is interested in the development of novel measurement technologies, such as optical surface measurements, for the in situ health monitoring of critical constituents of the internal flow path. In situ health monitoring has the potential to detect flaws, i.e. cracks in key components such as engine turbine disks, before the flaws lead to catastrophic failure. The present study aims to further validate and develop an optical strain measurement technique to measure the radial growth and strain field of an already cracked disk, mimicking the geometry of a sub-scale turbine engine disk, under loaded conditions in the NASA Glenn Research Center's High Precision Rotordynamics Laboratory. The technique offers potential fault detection by imaging an applied high-contrast random speckle pattern under unloaded and loaded conditions with a CCD camera. Spinning the cracked disk at high speeds (loaded conditions) induces an external load, resulting in a radial growth of the disk of approximately 50.0 µm in the flawed region and hence a localized strain field. When imaging the cracked disk under static conditions, the disk will be undistorted; however, during rotation the cracked region will grow radially, thus causing the applied particle pattern to be 'shifted'. The resulting particle displacements between the two images are measured using the two-dimensional cross-correlation algorithms implemented in standard Particle Image Velocimetry (PIV) software to track the disk growth, which facilitates calculation of the localized strain field. A random particle distribution is adhered onto the surface of the cracked disk and two bench-top experiments are carried out to evaluate the technique's ability to measure the induced particle displacements. The disk is shifted manually using a translation stage equipped with a fine micrometer, and a hotplate is used to induce thermal growth of the disk, causing the
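
    The core operation described above, measuring the shift of a speckle pattern between an unloaded and a loaded image by two-dimensional cross-correlation, can be sketched generically as follows. The example uses synthetic random images and an FFT-based correlation; it is not the NASA test setup or the PIV software mentioned in the abstract.

        import numpy as np

        # Estimate an integer-pixel shift between two speckle images by locating the
        # peak of their cross-correlation (the idea behind PIV-style interrogation).
        rng = np.random.default_rng(1)
        reference = rng.random((128, 128))                        # synthetic speckle pattern
        shifted = np.roll(reference, shift=(3, -5), axis=(0, 1))  # known applied displacement

        # Circular cross-correlation via FFT (adequate for this synthetic example).
        corr = np.fft.ifft2(np.fft.fft2(reference).conj() * np.fft.fft2(shifted)).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)

        # Convert the peak location to a signed displacement in pixels.
        dy = peak[0] if peak[0] <= reference.shape[0] // 2 else peak[0] - reference.shape[0]
        dx = peak[1] if peak[1] <= reference.shape[1] // 2 else peak[1] - reference.shape[1]
        print(dy, dx)  # -> 3 -5, recovering the applied shift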

  19. Molecularly imprinted polymers for sample preparation and biosensing in food analysis: Progress and perspectives

    DEFF Research Database (Denmark)

    Ashley, Jon; Shahbazi, Mohammad-Ali; Kant, Krishna

    2017-01-01

    Molecularly imprinted polymers (MIPs) are biomimetics which can selectively bind to analytes of interest. One of the most interesting areas where MIPs have shown the biggest potential is food analysis. MIPs have found use as sorbents in sample preparation, attributed to their high selectivity and high ... the imprinting methods which are applicable for imprinting food templates, summarize the recent progress in using MIPs for preparing and analysing food samples, and discuss the current limitations in the commercialisation of MIP technology. Finally, future perspectives will be given ...

  20. International research progress of CFD application in analysis of nuclear power system

    International Nuclear Information System (INIS)

    Li Linsen; Wang Kan; Song Xiaoming

    2009-01-01

    This paper introduces the latest international research progress on the application of CFD to nuclear reactor system analysis. CFD methods have been applied to a few 3-D single-phase transient simulations, including flow field modeling of reactor cores, assemblies, and vessel plenums. On the other hand, CFD methods applied to reactor systems still need further validation and benchmarking; meanwhile, the practice of applying CFD also needs to be studied, including the establishment of Best Practice Guidelines (BPG). Furthermore, CFD codes are being coupled with thermal-hydraulic system codes and neutronics codes. Finally, CFD codes for two-phase flow and turbulence modeling are still being developed. (authors)

  1. Long Term Results of Visual Field Progression Analysis in Open Angle Glaucoma Patients Under Treatment.

    Science.gov (United States)

    Kocatürk, Tolga; Bekmez, Sinan; Katrancı, Merve; Çakmak, Harun; Dayanır, Volkan

    2015-01-01

    To evaluate visual field progression with trend and event analysis in open angle glaucoma patients under treatment. Fifteen-year follow-up results of 408 eyes of 217 glaucoma patients who were followed at Adnan Menderes University, Department of Ophthalmology, between 1998 and 2013 were analyzed retrospectively. Visual field data were collected for Mean Deviation (MD), Visual Field Index (VFI), and event occurrence. There were 146 primary open-angle glaucoma (POAG), 123 pseudoexfoliative glaucoma (XFG) and 139 normal tension glaucoma (NTG) eyes. MD showed a significant change in all diagnostic groups, as did the other visual field indices. We herein report our fifteen-year follow-up results in open angle glaucoma.

  2. Evaluation of explicit finite-difference techniques for LMFBR safety analysis

    International Nuclear Information System (INIS)

    Bernstein, D.; Golden, R.D.; Gross, M.B.; Hofmann, R.

    1976-01-01

    In the past few years, the use of explicit finite-difference (EFD) and finite-element computer programs for reactor safety calculations has steadily increased. One of the major areas of application has been for the analysis of hypothetical core disruptive accidents in liquid metal fast breeder reactors. Most of these EFD codes were derived to varying degrees from the same roots, but the codes are large and have progressed rapidly, so there may be substantial differences among them in spite of a common ancestry. When this fact is coupled with the complexity of HCDA calculations, it is not possible to assure that independent calculations of an HCDA will produce substantially the same results. Given the extreme importance of nuclear safety, it is essential to be sure that HCDA analyses are correct, and additional code validation is therefore desirable. A comparative evaluation of HCDA computational techniques is being performed under an ERDA-sponsored program called APRICOT (Analysis of PRImary COntainment Transients). The philosophy, calculations, and preliminary results from this program are described in this paper

  3. Radon remedial techniques in buildings - analysis of French actual cases

    International Nuclear Information System (INIS)

    Dupuis, M.

    2004-01-01

    The IRSN has compiled a collection of solutions from data provided by the various decentralised government services in 31 French departments. Contributors were asked to provide a description of the building, as well as details of measured radon levels, the type of reduction technique adopted and the cost. Illustrative layouts, technical drawings and photographs were also requested, when available. Of the cases recorded, 85% are establishments open to the public (schools (70%), city halls (4%) and combined city halls and school houses (26%)), 11% are houses and 4% industrial buildings. IRSN obtained 27 real cases of remedial techniques used. The data were presented in the form of fact sheets. The primary aim of this exercise was to illustrate each of the radon reduction techniques that can be used in the different building types (with basement, ground bearing slab, crawl space). This investigation not only enabled us to show that combining passive and active techniques reduces the operating cost of the installation, but above all that it considerably improves the efficiency. The passive technique reduces the amount of radon in the building and thus reduces the necessary ventilation rate, which directly affects the cost of operating the installation. For the 27 cases recorded, we noted: (a) the application of 7 passive techniques: sealing of floors and semi-buried walls, together with improved aeration by installing ventilation openings or ventilation strips in the windows. Radon concentrations were reduced on average by a factor of 4.7. No measurement in excess of 400 Bq.m-3 (the limit recommended by the French public authorities) was obtained following completion of the works; (b) the application of 15 active techniques: depressurization of the underlying ground, crawl space or basement and/or pressurization of the building. Radon concentrations were reduced on average by a factor of 13.8. Radon concentrations of over 400 Bq.m-3 were measured in only 4 cases

  4. Glaucoma progression detection by retinal nerve fiber layer measurement using scanning laser polarimetry: event and trend analysis.

    Science.gov (United States)

    Moon, Byung Gil; Sung, Kyung Rim; Cho, Jung Woo; Kang, Sung Yong; Yun, Sung-Cheol; Na, Jung Hwa; Lee, Youngrok; Kook, Michael S

    2012-06-01

    To evaluate the use of scanning laser polarimetry (SLP, GDx VCC) measurements of retinal nerve fiber layer (RNFL) thickness for detecting the progression of glaucoma. Test-retest measurement variability was determined in 47 glaucomatous eyes. One eye each from 152 glaucomatous patients with at least 4 years of follow-up was enrolled. Visual field (VF) loss progression was determined by both event analysis (EA, Humphrey guided progression analysis) and trend analysis (TA, linear regression analysis of the visual field index). SLP progression was defined as a reduction of RNFL thickness exceeding the predetermined repeatability coefficient in three consecutive exams, as compared to the baseline measure (EA). The slope of RNFL thickness change over time was determined by linear regression analysis (TA). Twenty-two eyes (14.5%) progressed according to VF EA, 16 (10.5%) by VF TA, 37 (24.3%) by SLP EA and 19 (12.5%) by SLP TA. Agreement between VF and SLP progression was poor in both EA and TA (VF EA vs. SLP EA, k = 0.110; VF TA vs. SLP TA, k = 0.129). The mean (±standard deviation) progression rate of RNFL thickness as measured by SLP TA did not significantly differ between VF EA progressors and non-progressors (-0.224 ± 0.148 µm/yr vs. -0.218 ± 0.151 µm/yr, p = 0.874). SLP TA and EA showed similar levels of sensitivity when VF progression was considered as the reference standard. RNFL thickness as measured by SLP was shown to be capable of detecting glaucoma progression. Both EA and TA of SLP showed poor agreement with VF outcomes in detecting glaucoma progression.
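
    The two progression criteria summarized above (an event analysis against a repeatability coefficient and a trend analysis by linear regression) can be expressed compactly; the sketch below applies them to an invented series of RNFL thickness values. The visit spacing, repeatability coefficient and measurements are assumptions, not data from the study.

        import numpy as np

        # Hypothetical RNFL thickness measurements (in µm) at yearly visits for one eye.
        years = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
        rnfl = np.array([58.0, 57.1, 55.6, 55.2, 54.8, 54.1])
        repeatability = 2.0  # assumed test-retest repeatability coefficient (µm)

        # Event analysis: progression if the loss from baseline exceeds the repeatability
        # coefficient on three consecutive follow-up exams.
        loss = rnfl[0] - rnfl[1:]
        exceeds = loss > repeatability
        ea_progression = any(all(exceeds[i:i + 3]) for i in range(len(exceeds) - 2))

        # Trend analysis: slope of a linear regression of thickness over time.
        slope, intercept = np.polyfit(years, rnfl, 1)

        print(f"EA progression: {ea_progression}, TA slope: {slope:.3f} µm/yr")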

  5. Improved streaming analysis technique: spherical harmonics expansion of albedo data

    International Nuclear Information System (INIS)

    Albert, T.E.; Simmons, G.L.

    1979-01-01

    An improved albedo scattering technique was implemented with a three-dimensional Monte Carlo transport code for use in analyzing radiation streaming problems. The improvement was based on a shifted spherical harmonics expansion of the doubly differential albedo data base. The result of the improvement was a factor of 3 to 10 reduction in data storage requirements and approximately a factor of 3 to 6 increase in computational speed. Comparisons of results obtained using the technique with measurements are shown for neutron streaming in one- and two-legged square concrete ducts

  6. Undesirable effects of covariance matrix techniques for error analysis

    International Nuclear Information System (INIS)

    Seibert, D.

    1994-01-01

    Regression with χ² constructed from covariance matrices should not be used for some combinations of covariance matrices and fitting functions. Using the technique for unsuitable combinations can amplify systematic errors. This amplification is uncontrolled, and can produce arbitrarily inaccurate results that might not be ruled out by a χ² test. In addition, this technique can give incorrect (artificially small) errors for fit parameters. I give a test for this instability and a more robust (but computationally more intensive) method for fitting correlated data
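
    For readers unfamiliar with the construction being criticized, the sketch below shows the usual correlated-data chi-squared, chi2 = r^T C^-1 r, in a generalized least-squares fit of a straight line. The data and covariance matrix are invented for illustration; the sketch does not reproduce the instability test or the robust method proposed in the paper.

        import numpy as np

        # Fit y = a + b*x to correlated data by minimizing chi2 = r^T C^-1 r,
        # where r is the residual vector and C the covariance matrix of the data.
        x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
        y = np.array([1.1, 1.9, 3.2, 3.9, 5.1])

        # Invented covariance: independent variances plus a common correlated term.
        C = 0.04 * np.eye(5) + 0.01 * np.ones((5, 5))
        Cinv = np.linalg.inv(C)

        A = np.column_stack([np.ones_like(x), x])                 # design matrix for (a, b)
        params = np.linalg.solve(A.T @ Cinv @ A, A.T @ Cinv @ y)  # generalized least squares
        resid = y - A @ params
        chi2 = resid @ Cinv @ resid

        print("fitted (a, b):", params, " chi2:", chi2)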

  7. Current trends in nuclear borehole logging techniques for elemental analysis

    International Nuclear Information System (INIS)

    1988-06-01

    This report is the result of a consultants' meeting organized by the IAEA and held in Ottawa, Canada, 2-6 November 1987 in order to assess the present technical status of nuclear borehole logging techniques, to find out the well established applications and the development trends. It contains a summary report giving a comprehensive overview of the techniques and applications and a collection of research papers describing work done in industrial institutes. A separate abstract was prepared for each of these 9 papers. Refs, figs and tabs

  8. A review on applications of the wavelet transform techniques in spectral analysis

    International Nuclear Information System (INIS)

    Medhat, M.E.; Albdel-hafiez, A.; Hassan, M.F.; Ali, M.A.; Awaad, Z.

    2004-01-01

    Starting from 1989, a new technique known as wavelet transforms (WT) has been applied successfully for analysis of different types of spectra. WT offers certain advantages over Fourier transforms for analysis of signals. A review of using this technique through different fields of elemental analysis is presented

  9. Determining the Number of Factors in P-Technique Factor Analysis

    Science.gov (United States)

    Lo, Lawrence L.; Molenaar, Peter C. M.; Rovine, Michael

    2017-01-01

    Determining the number of factors is a critical first step in exploratory factor analysis. Although various criteria and methods for determining the number of factors have been evaluated in the usual between-subjects R-technique factor analysis, there is still the question of how these methods perform in within-subjects P-technique factor analysis. A…

  10. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    The article presents a technique for the statistical analysis of the investment appeal of a region for foreign direct investment. A definition of the technique of statistical analysis is given, the stages of the analysis are described, and the mathematical-statistical tools are considered.

  11. A quantitative analysis of rotary, ultrasonic and manual techniques to treat proximally flattened root canals

    Directory of Open Access Journals (Sweden)

    Fabiana Soares Grecca

    2007-04-01

    OBJECTIVE: The efficiency of rotary, manual and ultrasonic root canal instrumentation techniques was investigated in proximally flattened root canals. MATERIAL AND METHODS: Forty human mandibular left and right central incisors, lateral incisors and premolars were used. The pulp tissue was removed and the root canals were filled with red dye. Teeth were instrumented using three techniques: (i) K3 and ProTaper rotary systems; (ii) ultrasonic crown-down technique; and (iii) progressive manual technique. Roots were bisected longitudinally in a buccolingual direction. The instrumented canal walls were digitally captured and the images obtained were analyzed using the Sigma Scan software. Canal walls were evaluated for total canal wall area versus non-instrumented area on which dye remained. RESULTS: No statistically significant difference was found between the instrumentation techniques studied (p<0.05). CONCLUSION: The findings of this study showed that no instrumentation technique was 100% efficient in removing the dye.

  12. Potential use of transmission tomographic techniques for the quality checking of cemented waste drums. Progress report to 31 March 1985

    Energy Technology Data Exchange (ETDEWEB)

    Huddleston, J; Hutchinson, I G

    1986-01-01

    In support of the programme for the quality checking of encapsulated intermediate level waste, the possibilities of using transmission tomographic techniques for the determination of the physical properties of the drum are being considered. A literature survey has been undertaken and the possibilities of extracting data from video recordings of real time radiographs are considered. This work was carried out with financial support from British Nuclear Fuels plc and the UK Department of the Environment. In the DoE context, the results will be used in the formulation of Government Policy, but at this stage they do not necessarily represent Government Policy.

  13. Research review and development trends of human reliability analysis techniques

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Dai Licao

    2011-01-01

    Human reliability analysis (HRA) methods are reviewed. The theoretical basis of human reliability analysis, human error mechanisms, the key elements of HRA methods and the existing HRA methods are respectively introduced and assessed. Their shortcomings, the current research hotspots and the difficult problems are identified. Finally, the trends of human reliability analysis methods are examined. (authors)

  14. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  15. Experimental Analysis of Temperature Differences During Implant Site Preparation: Continuous Drilling Technique Versus Intermittent Drilling Technique.

    Science.gov (United States)

    Di Fiore, Adolfo; Sivolella, Stefano; Stocco, Elena; Favero, Vittorio; Stellini, Edoardo

    2018-02-01

    Implant site preparation through drilling procedures may cause bone thermonecrosis. The aim of this in vitro study was to evaluate, using a thermal probe, overheating at implant sites during osteotomies through 2 different drilling methods (continuous drilling technique versus intermittent drilling technique) using irrigation at different temperatures. Five implant sites 13 mm in length were performed on 16 blocks (fresh bovine ribs), for a total of 80 implant sites. The PT-100 thermal probe was positioned 5 mm from each site. Two physiological refrigerant solutions were used: one at 23.7°C and one at 6.0°C. Four experimental groups were considered: group A (continuous drilling with physiological solution at 23.7°C), group B (intermittent drilling with physiological solution at 23.7°C), group C (continuous drilling with physiological solution at 6.0°C), and group D (intermittent drilling with physiological solution at 6.0°C). The Wilcoxon rank-sum test (2-tailed) was used to compare groups. While there was no difference between group A and group B (W = 86; P = .45), statistically significant differences were observed between experimental groups A and C (W = 0; P = .0001), B and D (W = 45; P = .0005), and C and D (W = 41; P = .003). Implant site preparation did not affect the overheating of the bone. Statistically significant differences were found with the refrigerant solutions. Using both irrigating solutions, bone temperature did not exceed 47°C.
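
    The group comparisons above use the two-tailed Wilcoxon rank-sum test. The sketch below shows how such a comparison could be run on two hypothetical temperature samples; the numbers are invented and the scipy-based call is simply one convenient implementation of the test.

        from scipy.stats import ranksums

        # Hypothetical peak temperatures (°C) recorded near the implant site for two
        # irrigation conditions; the values are invented for illustration only.
        room_temp_irrigation = [36.8, 37.1, 36.5, 37.4, 36.9, 37.2]
        cold_irrigation = [33.2, 33.9, 32.8, 33.5, 34.1, 33.0]

        stat, p_value = ranksums(room_temp_irrigation, cold_irrigation)
        print(f"W = {stat:.2f}, two-sided P = {p_value:.4f}")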

  16. The use of crypto-analysis techniques for securing internet ...

    African Journals Online (AJOL)

    ... recommended to be combined with other techniques, such as client-side software, data transaction protocols, web server software, and the network server operating system involved in handling e-commerce, for securing internet transactions. This recommendation will invariably ensure that internet transactions are secured.

  17. Protease analysis by zymography: a review on techniques and patents.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2009-01-01

    Zymography, the detection of enzymatic activity on gel electrophoresis, has been a technique described in the literature for at least the past 50 years. Although a diverse range of enzymes, especially proteases, have been detected, advances and improvements have been slower in comparison with other molecular biology, biotechnology and chromatography techniques. Most of the reviews and patents published focus on the technique as an element of enzymatic testing, but detailed analytical studies are scarce. Patents referring to zymography per se are few and the technique itself is hardly an important issue in titles or keywords in many scientific publications. This review covers a brief condensation of the works published so far dealing with the identification of proteolytic enzymes in electrophoretic gel supports and the technique's variations, such as 2-D zymography, real-time zymography, and in-situ zymography. Moreover, new trends of this method regarding the substrates used and activity visualization are outlined. What to expect from zymography in the near future is also discussed.

  18. Dynamic Analysis Techniques for the Reconstruction of Architectural Views

    NARCIS (Netherlands)

    Cornelissen, B.

    2007-01-01

    Gaining an understanding of software systems is an important discipline in many software engineering contexts. It is essential that software engineers are assisted as much as possible during this task, e.g., by using tools and techniques that provide architectural views on the software at hand. This

  19. Analysis of ISO 26262 Compliant Techniques for the Automotive Domain

    NARCIS (Netherlands)

    M. S. Kannan; Y. Dajsuren (Yanjindulam); Y. Luo; I. Barosan

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying to the ISO 26262 have been developed. However, it is not clear which parts and (sub-) phases of the

  20. Analysis of ISO 26262 compliant techniques for the automotive domain

    NARCIS (Netherlands)

    S., Manoj Kannan; Dajsuren, Y.; Luo, Y.; Barosan, I.; Antkiewicz, M.; Atlee, J.; Dingel, J.; S, R.

    2015-01-01

    The ISO 26262 standard defines functional safety for automotive E/E systems. Since the publication of the first edition of this standard in 2011, many different safety techniques complying to the ISO 26262 have been developed. However, it is not clear which parts and (sub-) phases of the standard

  1. Progress in study of Prespa Lake using nuclear and related techniques (IAEA Regional Project RER/8/008)

    International Nuclear Information System (INIS)

    Anovski, Todor

    2001-09-01

    One of the main objectives of the IAEA Regional project RER/8/008, entitled Study of Prespa Lake Using Nuclear and Related Techniques, was to provide a scientific basis for the sustainable and environmental management of Lake Prespa (the three lakes Ohrid, Big Prespa and Small Prespa lie on the borders between Albania, the Republic of Macedonia and Greece, and are separated by the Mali i Thate and Galichica mountains, which are mostly karstified), see Fig. 1. To this end, investigations connected with the hydrogeology, the water quality (physico-chemical, biological and radiological characteristics) and the determination of the water balance, based on the distribution of environmental isotopes (i.e. H, D, T, O-18, etc.), artificial water tracers and other relevant analytical techniques such as AAS, HPLC, total α- and β-activity, α- and γ-spectrometry, as well as ultrasonic measurements (defining the lake bottom profile), were initiated through regional cooperation (scientists from Albania, Greece and the Republic of Macedonia participated in the implementation of the Project) over one hydrological year, and valuable results were obtained, a part of which are presented in this report. This cooperation was the only way to provide the data necessary for a better understanding of, among other things, the water quality of Lake Prespa and its hydrological relationship to Lake Ohrid, which together represent a unique regional hydro system in the world. (Author)

  2. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined here as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results
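
    The uncertainty interval described above is commonly obtained by propagating the component uncertainties; the sketch below shows a basic root-sum-square propagation for a hypothetical PV power measurement P = V * I. The voltage, current and uncertainty values are invented and this is not the worked example from the presentation.

        import math

        # Combine independent standard uncertainties by root-sum-square to obtain an
        # uncertainty interval for a derived quantity, here PV power P = V * I.
        voltage, u_voltage = 17.2, 0.05   # measured module voltage (V) and its uncertainty
        current, u_current = 5.40, 0.03   # measured module current (A) and its uncertainty

        power = voltage * current
        rel_u = math.sqrt((u_voltage / voltage) ** 2 + (u_current / current) ** 2)
        u_power = power * rel_u
        k = 2  # coverage factor for an approximately 95 % interval

        print(f"P = {power:.2f} W +/- {k * u_power:.2f} W (k = {k})")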

  3. VLBI FOR GRAVITY PROBE B. IV. A NEW ASTROMETRIC ANALYSIS TECHNIQUE AND A COMPARISON WITH RESULTS FROM OTHER TECHNIQUES

    International Nuclear Information System (INIS)

    Lebach, D. E.; Ratner, M. I.; Shapiro, I. I.; Bartel, N.; Bietenholz, M. F.; Lederman, J. I.; Ransom, R. R.; Campbell, R. M.; Gordon, D.; Lestrade, J.-F.

    2012-01-01

    When very long baseline interferometry (VLBI) observations are used to determine the position or motion of a radio source relative to reference sources nearby on the sky, the astrometric information is usually obtained via (1) phase-referenced maps or (2) parametric model fits to measured fringe phases or multiband delays. In this paper, we describe a 'merged' analysis technique which combines some of the most important advantages of these other two approaches. In particular, our merged technique combines the superior model-correction capabilities of parametric model fits with the ability of phase-referenced maps to yield astrometric measurements of sources that are too weak to be used in parametric model fits. We compare the results from this merged technique with the results from phase-referenced maps and from parametric model fits in the analysis of astrometric VLBI observations of the radio-bright star IM Pegasi (HR 8703) and the radio source B2252+172 nearby on the sky. In these studies we use central-core components of radio sources 3C 454.3 and B2250+194 as our positional references. We obtain astrometric results for IM Peg with our merged technique even when the source is too weak to be used in parametric model fits, and we find that our merged technique yields astrometric results superior to the phase-referenced mapping technique. We used our merged technique to estimate the proper motion and other astrometric parameters of IM Peg in support of the NASA/Stanford Gravity Probe B mission.

  4. Potential application of microfocus X-ray techniques for quantitative analysis of bone structure

    International Nuclear Information System (INIS)

    Takahashi, Kenta

    2006-01-01

    With the progress of micro-focused X-ray computed tomography (micro-CT), it has become possible to evaluate bone structure quantitatively and three-dimensionally. The advantages of micro-CT are that sample preparations are not required and that it provides not only two-dimensional parameters but also three-dimensional stereological indices. This study was carried out to evaluate the potential application of micro-focus X-ray techniques for quantitative analysis of the new bone produced inside the hollow chamber of an experimental titanium miniature implant. Twenty-five male Wistar rats (9 weeks of age) received an experimental titanium miniature implant, containing an internal hollow chamber, in the left femur. The rats were sacrificed and the femurs excised at 4 weeks or 8 weeks after implantation. Micro-CT analysis was performed on the femur samples and the volume of the new bone induced in the hollow chamber of the implant was calculated. Percentages of new bone area on the undecalcified histological slides were also measured, and linear regression analysis was carried out in order to evaluate the correlation between the pixel numbers of the undecalcified slide specimens and the pixel numbers of the micro-CT images. New bone formation occurred in the experimental titanium miniature implant with a hollow chamber. The volume of new bone was measured by micro-CT and the percentage of new bone area relative to the hollow chamber was calculated on the undecalcified slides. Linear regression analysis showed a high correlation between the pixel numbers of the undecalcified slide specimens and the pixel numbers of the micro-CT images. Consequently, the new bone produced inside the hollow chamber of the experimental titanium miniature implant could be quantified three-dimensionally and stereologically by micro-CT, and its precision was supported by the high correlation between the micro-CT measurements and the conventional two-dimensional measurements of the histological slides. (author)

  5. [Research progress and application prospect of near infrared spectroscopy in soil nutrition analysis].

    Science.gov (United States)

    Ding, Hai-quan; Lu, Qi-peng

    2012-01-01

    "Digital agriculture" or "precision agriculture" is an important direction of modern agriculture technique. It is the combination of the modern information technique and traditional agriculture and becomes a hotspot field in international agriculture research in recent years. As a nondestructive, real-time, effective and exact analysis technique, near infrared spectroscopy, by which precision agriculture could be carried out, has vast prospect in agrology and gradually gained the recognition. The present paper intends to review the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should based on portable NIR spectrograph in order to acquire qualitative or quantitative information from real-time measuring in field. In addition, NIRS could be combined with space remote sensing to macroscopically control the way crop is growing and the nutrition crops need, to change the current state of our country's agriculture radically.

  6. DEVELOPMENT OF TECHNIQUES FOR QUANTITATIVE ANALYSIS OF LIME FLOWERS

    Directory of Open Access Journals (Sweden)

    Demyanenko DV

    2016-03-01

    Introduction. The article is devoted to the development of techniques for the quantitative analysis of lime flowers in order to make amendments to the existing pharmacopoeial monographs for this herbal drug. Lime inflorescences contain lipophilic biologically active substances (BAS) causing notable antimicrobial and anti-inflammatory effects, and also more polar phenolic compounds with antiulcer activity. Considering this, it is necessary to regulate all these groups of BAS quantitatively. Materials and methods. For this study, six batches of lime flowers harvested in 2008-2009 in the Kharkiv, Rivno and Zhitomir regions were used as the crude herbal drug. Loss on drying was determined by routine pharmacopoeial procedures. The total content of lipophilic substances was determined gravimetrically after Soxhlet extraction of samples of 1, 5, 7 and 10 g in weight with methylene chloride, considering that by its extracting ability this solvent is close to the liquefied difluorochloromethane (freon R22) used by us for obtaining lipophilic complexes. The duration of complete analytical extraction was determined by infusion of six 10 g assays of lime flowers for 1, 2, 3, 4, 5 and 6 hours, after which the quantity of lipophilic extractives was determined gravimetrically. The quantity of essential oil in lime flowers was evaluated by the procedure of EP 7, 2.8.12. The weight of the herbal drug sample was 200 g, the distillation rate 2.5-3.5 ml/min, the volume of distillation liquid (water) 500 ml, and the volume of xylene in the graduated tube 0.50 ml. The total flavonoid content, recalculated to quercetin, was determined after hydrolysis with acidified acetone, extraction of the flavonoid aglycones with ethyl acetate, and subsequent spectrophotometry of their complexes with aluminium chloride. All quantitative determinations were replicated five times for each assay. All chemicals and reagents were of analytical grade. Results and discussion. It was found that adequate accuracy of the analysis of lipophilic

  7. Development of Surface Modification Techniques for Enhanced Safety of Light Water Reactors: Recent Progress and Future Direction at THLAB

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Gwang Hyeok; Jeong, Ui Ju; Son, Hong Hyun; Jeun, Gyoo Dong; Kim, Sung Joong [Hanyang University, Daejeon (Korea, Republic of)

    2016-05-15

    They concluded that the CHF enhancement in nanofluid boiling was mainly affected by the surface characteristics of the developed layer. Furthermore, surface modification can be utilized to secure the safety of nuclear reactor systems. At many components of the reactor systems, intense boiling heat transfer occurs, and potential thermal attack on the systems is expected under normal or accident conditions. In particular, during reactor operation, fission energy is deposited in the fuel assemblies in the core. Also, under severe conditions, failure of the reactor vessel may be caused by high-temperature molten materials. In this article, we introduce the surface modification techniques and recent achievements. After a brief description of each deposition mechanism, an assessment of the thermal margin for both technologies is discussed based on pool boiling experiments conducted at THLAB. Moreover, in the latter part of each chapter, experimental facilities for applied heat transfer tests that consider reactor environments are presented.

  8. Progressive damage analysis of carbon/epoxy laminates under couple laser and mechanical loading

    Directory of Open Access Journals (Sweden)

    Wanlei Liu

    A multiscale model based on bridging theory is proposed for the progressive damage analysis of carbon/epoxy laminates under coupled laser and mechanical loading. An ablation model is adopted to calculate the change in ablation temperature and the degradation of the ablated surface. A polynomial strengthening model of the matrix is used to improve the bridging model and reduce the number of input parameters. The stiffness degradation methods of the bridging model are also improved in order to analyze the stress redistribution more accurately when damage occurs. Thermal-mechanical analyses of the composite plate are performed using the ABAQUS/Explicit program, with the developed model implemented in a VUMAT subroutine. The simulation results show that this model can be used to reveal the mesoscale damage mechanisms of composite laminates under coupled loading. Keywords: Laser irradiation, Multiscale analysis, Bridge model, Thermal-mechanical

  9. Aquatic Toxic Analysis by Monitoring Fish Behavior Using Computer Vision: A Recent Progress

    Directory of Open Access Journals (Sweden)

    Chunlei Xia

    2018-01-01

    Video-tracking-based biological early warning systems have made great progress with advanced computer vision and machine learning methods. The ability to video-track multiple biological organisms has improved considerably in recent years, and video-based behavioral monitoring has become a common tool for acquiring quantified behavioral data for aquatic risk assessment. The investigation of behavioral responses under chemical and environmental stress has been boosted by rapidly developing machine learning and artificial intelligence. In this paper, we introduce the fundamentals of video tracking and present pioneering work on the precise tracking of groups of individuals in 2D and 3D space. Technical and practical issues encountered in video tracking are explained. Subsequently, toxicity analysis based on fish behavioral data is summarized. Frequently used computational methods and machine learning approaches are explained together with their applications in aquatic toxicity detection and abnormal pattern analysis. Finally, the advantages of recently developed deep learning approaches for toxicity prediction are presented.

  10. Damage analysis and fundamental studies. Quarterly progress report, July--September 1978

    Energy Technology Data Exchange (ETDEWEB)

    Zwilsky, Klaus M.

    1979-05-01

    This report is the third in a series of Quarterly Technical Progress Reports on Damage Analysis and Fundamental Studies (DAFS) which is one element of the Fusion Reactor Materials Program, conducted in support of the Magnetic Fusion Energy Program. This report is organized along topical lines in parallel to Section II, Damage Analysis and Fundamental Studies (DOE/ET-0032/2), of the Fusion Reactor Materials Program Plan so that activities and accomplishments may be followed readily relative to that Program Plan. Thus, the work of a given laboratory may appear throughout the report. Chapters 1 and 2 report topics which are generic to all of the DAFS Program: DAFS Task Group Activities and Irradiation Test Facilities, respectively. Chapters 3, 4, and 5 report the work that is specific to each of the subtasks around which the program is structured: A) Environmental Characterization, B) Damage Production, and C) Damage Microstructure Evolution and Mechanical Behavior.

  11. Development of synchrotron x-ray micro-spectroscopic techniques and application to problems in low temperature geochemistry. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-10-01

    The focus of the technical development effort has been the development of apparatus and techniques for the utilization of X-ray Fluorescence (XRF), Extended X-ray Absorption Fine Structure (EXAFS) and X-ray Absorption Near Edge Structure (XANES) spectroscopies in a microprobe mode. The present XRM uses white synchrotron radiation (3 to 30 keV) from a bending magnet for trace element analyses using the x-ray fluorescence technique. Two significant improvements to this device have been recently implemented. Focusing Mirror: An 8:1 ellipsoidal mirror was installed in the X26A beamline to focus the incident synchrotron radiation and thereby increase the flux on the sample by about a factor of 30. Incident Beam Monochromator: The monochromator has been successfully installed and commissioned in the X26A beamline upstream of the mirror to permit analyses with focused monochromatic radiation. The monochromator consists of a channel-cut silicon (111) crystal driven by a Klinger stepping motor translator. We have demonstrated that the operating range of this instrument is 4 to 20 keV with 0.01 eV steps and that it produces a beam with an approximately 10^-4 energy bandwidth. The primary purpose of the monochromator is for x-ray absorption spectroscopy (XAS) measurements but it is also used for selective excitation in trace element microanalysis. To date, we have conducted XANES studies on Ti, Cr, Fe, Ce and U, spanning the entire accessible energy range and including both K and L edge spectra. Practical detection limits for microXANES are 10-100 ppm for 100 µm spots.
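
    The Si(111) channel-cut monochromator described above selects photon energy through Bragg's law, lambda = 2 d sin(theta). The short sketch below converts Bragg angle to energy using the standard Si(111) d-spacing; this is general crystallography for illustration, not a calculation taken from the report.

        import math

        # Bragg's law for a Si(111) monochromator: lambda = 2 * d * sin(theta),
        # with E [keV] = 12.398 / lambda [Å] and d_111 = a / sqrt(3) for silicon.
        A_SI = 5.4309                 # silicon lattice constant (Å)
        D_111 = A_SI / math.sqrt(3)   # about 3.136 Å

        def energy_kev(bragg_angle_deg):
            wavelength = 2.0 * D_111 * math.sin(math.radians(bragg_angle_deg))
            return 12.398 / wavelength

        for angle in (5.7, 10.0, 20.0, 30.0):
            print(f"theta = {angle:5.1f} deg -> E = {energy_kev(angle):6.2f} keV")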

  12. Fault Tree Analysis with Temporal Gates and Model Checking Technique for Qualitative System Safety Analysis

    International Nuclear Information System (INIS)

    Koh, Kwang Yong; Seong, Poong Hyun

    2010-01-01

    Although fault tree analysis (FTA) has been one of the most widely used safety analysis techniques in the nuclear industry, it suffers from several drawbacks: it uses only static gates and hence cannot precisely capture the dynamic behaviors of complex systems; it lacks rigorous semantics; and the reasoning process, which checks whether basic events really cause top events, is done manually and is therefore very labor-intensive and time-consuming for complex systems. Although several attempts have been made to overcome these problems, they still cannot model absolute (actual) time, because they adopt a relative time concept and can capture only sequential behaviors of the system. In this work, to resolve these problems, FTA and model checking are integrated to provide formal, automated and qualitative assistance to informal and/or quantitative safety analysis. Our approach proposes to build a formal model of the system together with fault trees. We introduce several temporal gates based on timed computation tree logic (TCTL) to capture absolute-time behaviors of the system and to give concrete semantics to fault tree gates, reducing errors during the analysis, and we use the model checking technique to automate the reasoning process of FTA
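
    As a rough illustration of why purely static gates cannot express ordering, the sketch below contrasts a static AND with a simple sequence-dependent (priority-AND style) gate evaluated over hypothetical event occurrence times. It is only a toy example; it is not the TCTL-based temporal gates or the model-checking workflow proposed in the paper.

        # Static AND versus a sequence-dependent gate over event occurrence times.
        # Occurrence times are hypothetical; None means the basic event never occurred.

        def static_and(*times):
            """Top event occurs if all basic events occur, regardless of order."""
            return all(t is not None for t in times)

        def sequence_and(*times):
            """Top event occurs only if all basic events occur in the listed order."""
            if any(t is None for t in times):
                return False
            return all(a <= b for a, b in zip(times, times[1:]))

        pump_fails, valve_sticks = 12.0, 4.0   # hypothetical occurrence times (hours)

        print(static_and(pump_fails, valve_sticks))    # True: both events occurred
        print(sequence_and(pump_fails, valve_sticks))  # False: they occurred in the wrong order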

  13. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Arsenic is a non-metallic constituent, present naturally in groundwater due to certain minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may originate from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water by using high-tech instruments like the atomic absorption spectrometer (hydride generation). Because arsenic concentrations down to the low limit of 1 ppb cannot be determined easily with the simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to an arsenic concentration of 1 ppb.

  14. HPAT: A nondestructive analysis technique for plutonium and uranium solutions

    International Nuclear Information System (INIS)

    Aparo, M.; Mattia, B.; Zeppa, P.; Pagliai, V.; Frazzoli, F.V.

    1989-03-01

    Two experimental approaches for the nondestructive characterization of mixed solutions of plutonium and uranium, developed at BNEA - C.R.E. Casaccia with the goal of measuring low plutonium concentrations (<50 g/l) even in the presence of a high uranium content, are described in the following. Both methods are referred to as HPAT (Hybrid Passive-Active Technique) since they rely on the measurement of the spontaneous plutonium emission in the L X-ray energy region as well as the transmission of K X-ray photons from the fluorescence induced by a radioisotopic source on a suitable target. Experimental campaigns for the characterization of both techniques have been carried out at the EUREX Plant Laboratories (C.R.E. Saluggia) and at the Plutonium Plant Laboratories (C.R.E. Casaccia). Experimental results and theoretical values of the errors are reported. (author)

  15. Time-series-analysis techniques applied to nuclear-material accounting

    International Nuclear Information System (INIS)

    Pike, D.H.; Morrison, G.W.; Downing, D.J.

    1982-05-01

    This document is designed to introduce the reader to the applications of Time Series Analysis techniques to Nuclear Material Accountability data. Time series analysis techniques are designed to extract information from a collection of random variables ordered by time by seeking to identify any trends, patterns, or other structure in the series. Since nuclear material accountability data is a time series, one can extract more information using time series analysis techniques than by using other statistical techniques. Specifically, the objective of this document is to examine the applicability of time series analysis techniques to enhance loss detection of special nuclear materials. An introductory section examines the current industry approach which utilizes inventory differences. The error structure of inventory differences is presented. Time series analysis techniques discussed include the Shewhart Control Chart, the Cumulative Summation of Inventory Differences Statistics (CUSUM) and the Kalman Filter and Linear Smoother
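
    Two of the techniques named above, the Shewhart control chart and the CUSUM, can be sketched on a hypothetical sequence of inventory differences. The data, the assumed measurement standard deviation and the CUSUM design constants below are invented for illustration and are not taken from the report.

        import numpy as np

        # Hypothetical inventory differences (kg) for successive material balance periods.
        inventory_diff = np.array([0.1, -0.2, 0.3, 0.0, 0.4, 0.5, 0.6, 0.7, 0.5, 0.8])
        sigma = 0.3        # assumed standard deviation of an inventory difference
        k = 0.5 * sigma    # CUSUM reference value (allowance)
        h = 4.0 * sigma    # CUSUM decision threshold

        # Shewhart chart: flag any single inventory difference outside +/- 3 sigma.
        shewhart_alarms = np.abs(inventory_diff) > 3 * sigma

        # One-sided (upper) CUSUM: accumulates small positive shifts that a Shewhart
        # chart would miss, signalling a possible protracted loss.
        cusum, cusum_alarm_at = 0.0, None
        for i, d in enumerate(inventory_diff):
            cusum = max(0.0, cusum + d - k)
            if cusum > h and cusum_alarm_at is None:
                cusum_alarm_at = i

        print("Shewhart alarms at periods:", np.flatnonzero(shewhart_alarms).tolist())
        print("CUSUM alarm at period:", cusum_alarm_at)

    With these invented numbers no single inventory difference breaches the Shewhart limits, while the CUSUM eventually signals the sustained positive shift, which is the usual argument for combining the two charts.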

  16. Meta-analysis of gene expression signatures defining the epithelial to mesenchymal transition during cancer progression.

    Directory of Open Access Journals (Sweden)

    Christian J Gröger

    The epithelial to mesenchymal transition (EMT) represents a crucial event during cancer progression and dissemination. EMT is the conversion of carcinoma cells from an epithelial to a mesenchymal phenotype that associates with a higher cell motility as well as enhanced chemoresistance and cancer stemness. Notably, EMT has been increasingly recognized as an early event of metastasis. Numerous gene expression studies (GES) have been conducted to obtain transcriptome signatures and marker genes to understand the regulatory mechanisms underlying EMT. Yet, no meta-analysis considering the multitude of GES of EMT has been performed to comprehensively elaborate the core genes in this process. Here we report the meta-analysis of 18 independent and published GES of EMT which focused on different cell types and treatment modalities. Computational analysis revealed clustering of GES according to the type of treatment rather than to cell type. GES of EMT induced via transforming growth factor-β and tumor necrosis factor-α treatment yielded uniformly defined clusters while GES of models with alternative EMT induction clustered in a more complex fashion. In addition, we identified those up- and downregulated genes which were shared between the multitude of GES. This core gene list includes well known EMT markers as well as novel genes so far not described in this process. Furthermore, several genes of the EMT-core gene list significantly correlated with impaired pathological complete response in breast cancer patients. In conclusion, this meta-analysis provides a comprehensive survey of available EMT expression signatures and shows fundamental insights into the mechanisms that are governing carcinoma progression.
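
    The core-gene step described above, identifying genes shared across the individual signatures, amounts to a set intersection; the sketch below shows that operation on invented gene lists. It is not the meta-analysis pipeline of the paper, and the gene symbols are placeholders.

        # Derive a "core" list of up-regulated genes shared across several gene
        # expression signatures (GES). The gene symbols below are illustrative only.
        signatures_up = {
            "GES_tgfb_1": {"SNAI1", "ZEB1", "VIM", "FN1", "CDH2"},
            "GES_tnfa_1": {"SNAI1", "ZEB1", "VIM", "MMP9"},
            "GES_other_1": {"SNAI1", "ZEB1", "VIM", "FN1", "TWIST1"},
        }

        core_up = set.intersection(*signatures_up.values())
        print(sorted(core_up))  # genes up-regulated in every signature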

  17. A Manual for Basic Techniques of Data Analysis and Distribution

    OpenAIRE

    Alvi, Mohsin

    2014-01-01

    A manual is designed to support and explain the basic concepts of statistics and their implications in econometrics; besides this, the interpretation of further statistical techniques is shown as well, through illustrations and graphical methods. It comprises several instances of tests, obtained from statistical software such as SPSS, E-views, Stata and the R language, together with their research models and the essentials for running the tests. The basis of the manual consists of two elements, fi...

  18. Analysis of kidney stones by PIXE and RBS techniques

    International Nuclear Information System (INIS)

    Alkofai, M.M.; Hallak, A.B.

    1995-01-01

    Human kidney stones were analyzed by PIXE and RBS techniques using a 2 MeV He++ beam. The stones were found to contain the elements: C, N, O, F, Na, Mg, Si, P, S, Cl, K, Ca, Fe and Br. Results obtained by PIXE agree with the results obtained by RBS within experimental errors. A mechanism for the formation of the kidney stones is suggested. 3 figs., 1 tab

  19. Analysis of kidney stones by PIXE and RBS techniques

    Energy Technology Data Exchange (ETDEWEB)

    Alkofai, M M [Physics Dept., Yarmouk University, Irbid, (Jordan); Hallak, A B [Research Institute, King Fahd University of Petroleum and Minerals, Dhahran 31261, (Saudi Arabia)

    1995-10-01

    Human kidney stones were analyzed by PIXE and RBS techniques using a 2 MeV He{sup ++} beam. The stones were found to contain the elements: C, N, O, F, Na, Mg, Si, P, S, Cl, K, Ca, Fe and Br. Results obtained by PIXE agree with the results obtained by RBS within experimental errors. A mechanism for the formation of the kidney stones is suggested. 3 figs., 1 tab.

  20. Investigation of neutron guide systems: Analysis techniques and an experiment

    International Nuclear Information System (INIS)

    Kudryashev, V.A.

    1991-01-01

    This paper discusses an in-depth study of the specific characteristics of the physical processes associated with the total reflection of neutrons from actual reflective coatings; the study of the process whereby neutrons transit a nonideal image channel, with allowance for the aforementioned characteristics; and the development of physical criteria and techniques for calculating the optimum geometry of a neutron guide source system based on the laws found to govern this transit process.

  1. Analysis of deployment techniques for web-based applications in SMEs

    OpenAIRE

    Browne, Cathal

    2011-01-01

    The Internet is no longer just a source for accessing information; it has become a valuable medium for social networking and software services. Web browsers can now access entire software systems available online to provide the user with a range of services. The concept of software as a service (SaaS) was born out of this. The number of development techniques and frameworks for such web applications has grown rapidly, and much research and development has been carried out on adva...

  2. Diagnostic analysis of vibration signals using adaptive digital filtering techniques

    Science.gov (United States)

    Jewell, R. E.; Jones, J. H.; Paul, J. E.

    1983-01-01

    Signal enhancement techniques are described using recently developed digital adaptive filtering equipment. Adaptive filtering concepts are not new; however, as a result of recent advances in microprocessor-based electronics, hardware has been developed that has stable characteristics and supports filter orders exceeding 1000. Selected data processing examples are presented illustrating spectral line enhancement, adaptive noise cancellation, and transfer function estimation in the presence of corrupting noise.
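
    The adaptive noise cancellation mentioned above can be illustrated with a software least-mean-squares (LMS) filter: a reference input correlated with the corrupting noise is adaptively filtered and subtracted from the primary input, and the error output is the enhanced signal. The sketch below is a generic LMS illustration with made-up signals and an illustrative step size, not the hardware described in this record.

        import numpy as np

        def lms_noise_canceller(primary, reference, order=32, mu=0.01):
            # Adaptive noise cancellation: the FIR filter learns to map the noise
            # reference onto the noise component of the primary input; the error
            # output is the enhanced (noise-cancelled) signal. mu is the step size.
            w = np.zeros(order)
            enhanced = np.zeros(len(primary))
            for n in range(order, len(primary)):
                x = reference[n - order:n][::-1]   # most recent reference samples first
                y = w @ x                          # current estimate of the noise
                e = primary[n] - y                 # error = primary minus estimated noise
                w = w + mu * e * x                 # LMS weight update
                enhanced[n] = e
            return enhanced

        # Hypothetical test: a 50 Hz tone corrupted by noise correlated with a reference
        fs = 1000.0
        t = np.arange(0, 1, 1 / fs)
        rng = np.random.default_rng(1)
        noise_ref = rng.normal(0.0, 1.0, t.size)
        primary = np.sin(2 * np.pi * 50 * t) + np.convolve(noise_ref, [0.6, 0.3], mode="same")
        enhanced = lms_noise_canceller(primary, noise_ref)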

  3. Application of radioisotope techniques in analysis of environmental pollutants

    International Nuclear Information System (INIS)

    Kyrs, M.; Moravec, A.

    1984-01-01

    A tabulated survey of the use of radioisotope techniques is given, listing the detected pollutant and the sensitivity and accuracy of each method. The most frequently used principle is the substoichiometric variant of isotope dilution, which may be divided into the isotope dilution method and the radio-reagent method. Both methods are described and examples of the determination of pollutants are given. (J.P.)

  4. Analysis of photoisomerizable dyes using laser absorption and fluorescence techniques

    International Nuclear Information System (INIS)

    Duchowicz, R.; Di Paolo, R.E.; Scaffardi, L.; Tocho, J.O.

    1992-01-01

    This report is devoted mainly to the description of laser-based techniques developed to obtain kinetic and spectroscopic properties of polymethine cyanine dyes in solution. Special attention was dedicated to photoisomerizable molecules whose isomers have strongly overlapping absorption spectra. As an example, measurements of two dyes of interest for laser technology, DTCI and DODCI, were performed. The developed methods provide a complete quantitative description of the photophysical processes. (author). 14 refs, 6 figs

  5. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.

  6. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; and demonstrates that quality assurance and quality control measures have been accomplished. Valid data are defined as data having known and documented paths of origin (including theory), measurements, traceability to measurement standards, computations, and uncertainty analysis of results.
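
    As a simple illustration of the propagation step in such an uncertainty analysis, the sketch below combines the relative standard uncertainties of hypothetical power, irradiance and area measurements into an uncertainty interval for a PV module efficiency; all numbers and the root-sum-square treatment of independent components are illustrative assumptions, not values from this work.

        import numpy as np

        # Hypothetical PV module efficiency: eta = P_max / (G * A)
        P, u_P = 240.0, 1.2      # measured maximum power [W] and its standard uncertainty
        G, u_G = 1000.0, 15.0    # irradiance [W/m^2] and its standard uncertainty
        A, u_A = 1.6, 0.005      # module area [m^2] and its standard uncertainty

        eta = P / (G * A)

        # First-order (Taylor series) propagation for a product/quotient:
        # relative standard uncertainties add in quadrature.
        u_eta_rel = np.sqrt((u_P / P) ** 2 + (u_G / G) ** 2 + (u_A / A) ** 2)
        u_eta = eta * u_eta_rel

        print(f"eta = {eta:.4f} +/- {1.96 * u_eta:.4f} (approx. 95% interval)")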

  7. Acoustic Emission Analysis of Damage Progression in Thermal Barrier Coatings Under Thermal Cyclic Conditions

    Science.gov (United States)

    Appleby, Matthew; Zhu, Dongming; Morscher, Gregory

    2015-01-01

    Damage evolution of electron beam-physical vapor deposited (EB-PVD) ZrO2-7 wt.% Y2O3 thermal barrier coatings (TBCs) under thermal cyclic conditions was monitored using an acoustic emission (AE) technique. The coatings were heated using a laser heat flux technique that yields high reproducibility in thermal loading. Along with AE, real-time thermal conductivity measurements were also taken using infrared thermography. Tests were performed on samples with induced stress concentrations, as well as with calcium-magnesium-alumino-silicate (CMAS) exposure, for comparison of damage mechanisms and AE response with the baseline (as-produced) coating. Analysis of acoustic waveforms was used to investigate damage development by comparing when events occurred, AE event frequency, energy content and location. The test results have shown that AE accumulation correlates well with thermal conductivity changes and that AE waveform analysis could be a valuable tool for monitoring coating degradation and providing insight on specific damage mechanisms.

  8. Improvements in analysis techniques for segmented mirror arrays

    Science.gov (United States)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment levels. This differentiation, allowing surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking has the advantage of being comparable to the measurements used in assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  9. New trends and techniques in chromosome aberration analysis

    International Nuclear Information System (INIS)

    Bender, M.A.

    1978-01-01

    The following topics are discussed: automation of chromosome analysis; storage of fixed cells from cultures of lymphocytes obtained routinely during periodic employee medical examinations; analysis of banded chromosomes; identification of first division metaphases; sister chromatid exchange; and patterns of aberration induction

  10. Efficacy of the Principal Components Analysis Techniques Using ...

    African Journals Online (AJOL)

    Second, the paper reports results of principal components analysis after the artificial data were submitted to three commonly used procedures: the scree plot, the Kaiser rule, and modified Horn's parallel analysis, and demonstrates the pedagogical utility of using artificial data in teaching advanced quantitative concepts. The results ...
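
    Of the three retention procedures named above, Horn's parallel analysis is the most algorithmic: observed eigenvalues are retained only while they exceed the corresponding eigenvalues obtained from random data of the same dimensions. A minimal sketch on artificial data is given below; the sample sizes and factor structure are illustrative and are not the paper's data.

        import numpy as np

        def parallel_analysis(data, n_sims=200, quantile=95, seed=0):
            # Horn's parallel analysis: compare eigenvalues of the observed
            # correlation matrix with eigenvalues from random normal data of
            # the same shape; retain components above the chosen random quantile.
            rng = np.random.default_rng(seed)
            n, p = data.shape
            obs_eig = np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]
            rand_eig = np.empty((n_sims, p))
            for i in range(n_sims):
                r = rng.normal(size=(n, p))
                rand_eig[i] = np.sort(np.linalg.eigvalsh(np.corrcoef(r, rowvar=False)))[::-1]
            threshold = np.percentile(rand_eig, quantile, axis=0)
            return int(np.sum(obs_eig > threshold)), obs_eig, threshold

        # Artificial data generated from two underlying factors (illustrative)
        rng = np.random.default_rng(1)
        f = rng.normal(size=(300, 2))
        x = f @ rng.normal(size=(2, 6)) + 0.5 * rng.normal(size=(300, 6))
        k, _, _ = parallel_analysis(x)
        print("components to retain:", k)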

  11. The Analysis of Surrounding Structure Effect on the Core Degradation Progress with COMPASS Code

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Jun Ho; Son, Dong Gun; Kim, Jong Tae; Park, Rae Jun; Kim, Dong Ha [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    In line with the importance of severe accident analysis after the Fukushima accident, the development of an integrated severe accident code has been launched through the collaboration of three institutes in Korea. KAERI is responsible for developing the modules related to in-vessel phenomena, while the other institutes cover the containment and the severe accident mitigation facilities, respectively. In the first phase, the individual severe accident modules have been developed, and the construction of the integrated analysis code is planned for the second phase. The basic strategy is to extend the design basis analysis codes SPACE and CAP, which are being validated in Korea, to severe accident analysis. In the first phase, KAERI has aimed to develop the framework of the severe accident code COMPASS (COre Meltdown Progression Accident Simulation Software), covering severe accident progression in the vessel from core heat-up to vessel failure in a stand-alone fashion. In order to analyze the effect of the surrounding structure, the melt progression has been compared between the central zone and the outermost zone under the condition of a constant radial power peaking factor. Figures 2 and 3 show the fuel element temperature and the clad mass at the central zone, respectively. Due to the axial power peaking factor, axial node No. 3 has the highest temperature, while the top and bottom nodes have the lowest temperature. When the clad temperature reaches the Zr melting temperature (2129.15 K), the Zr starts to melt. Axial node No. 2 reaches the fuel melting temperature at about 5000 s, and the molten fuel relocates to node No. 1, which results in the blockage of the flow area in node No. 1. The blocked flow area opens again at about 6100 s due to relocation of the molten ZrO{sub 2} mass to the core support plate. Figures 4 and 5 show the fuel element temperature and the clad mass at the outermost zone, respectively. It is shown that the fuel temperature increases more slowly

  12. Comparative Performance Analysis of Machine Learning Techniques for Software Bug Detection

    OpenAIRE

    Saiqa Aleem; Luiz Fernando Capretz; Faheem Ahmed

    2015-01-01

    Machine learning techniques can be used to analyse data from different perspectives and enable developers to retrieve useful information. Machine learning techniques have proven to be useful for software bug prediction. In this paper, a comparative performance analysis of different machine learning techniques is explored for software bug prediction on publicly available data sets. Results showed most of the mac ...
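
    A comparative analysis of this kind typically cross-validates several classifiers on the same defect data set and compares a common performance metric. The sketch below is a generic illustration with synthetic module metrics and an arbitrary choice of scikit-learn models; it does not reproduce the paper's data sets, models or results.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import cross_val_score

        # Hypothetical data: rows are software modules, columns are static-code
        # metrics (e.g. size, complexity); y marks modules with reported bugs.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 10))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0.8).astype(int)

        models = {
            "naive_bayes": GaussianNB(),
            "logistic_regression": LogisticRegression(max_iter=1000),
            "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
        }
        for name, model in models.items():
            scores = cross_val_score(model, X, y, cv=5, scoring="f1")
            print(f"{name}: mean F1 = {scores.mean():.3f}")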

  13. Air pollution studies in Tianjing city using neutron activation analysis techniques

    International Nuclear Information System (INIS)

    Ni Bangfa; Tian Weizhi; Nie Nuiling; Wang Pingsheng

    1999-01-01

    Airborne particulate samples were collected at two sites, one industrial and one residential, in Tianjing city during February and June using a PM-10 sampler and analyzed by NAA techniques. A comparison of air pollution between urban and rural areas of Tianjing city was made using neutron activation analysis and other data analysis techniques. (author)

  14. Uranium solution mining cost estimating technique: means for rapid comparative analysis of deposits

    International Nuclear Information System (INIS)

    Anon.

    1978-01-01

    Twelve graphs provide a technique for determining relative cost ranges for uranium solution mining projects. The use of the technique can provide a consistent framework for rapid comparative analysis of various properties of mining situations. The technique is also useful to determine the sensitivities of cost figures to incremental changes in mining factors or deposit characteristics

  15. Progress report on neutron activation analysis at Dalat Nuclear Research Reactor

    International Nuclear Information System (INIS)

    Tuan, Nguyen Ngoc

    2003-01-01

    Neutron Activation Analysis (NAA) is one of the most powerful techniques for simultaneous multi-element analysis. This technique has been studied and applied to analyze major, minor and trace elements in geological, biological and environmental samples at the Dalat Nuclear Research Reactor. At the sixth Workshop, February 8-11, 1999, in Yogyakarta, Indonesia, we reported on the current status of neutron activation analysis using the Dalat Nuclear Research Reactor. Another report on neutron activation analysis at the Dalat Nuclear Research Reactor was presented at the seventh Workshop in Taejon, Korea, from November 20-24, 2000. In this report, we present the results obtained from the application of NAA at NRI over one year, as follows: (1) determination of the concentrations of noble, rare earth, uranium, thorium and other elements in geological samples according to the requirements of clients, particularly geologists who want to locate mineral resources; (2) analysis of the concentrations of radionuclides and nutrient elements in foodstuffs for the program on the Asian Reference Man; (3) evaluation of the contents of trace elements in crude oil and basement rock samples to determine the original source of the oil; (4) determination of the elemental composition of airborne particles in Ho Chi Minh City for studying air pollution. Analytical data for standard reference materials and for toxic elements and natural radionuclides in seawater are also presented. (author)

  16. Progress report on neutron activation analysis at Dalat Nuclear Research Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Tuan, Nguyen Ngoc [Nuclear Research Institute, Dalat (Viet Nam)

    2003-03-01

    Neutron Activation Analysis (NAA) is one of the most powerful techniques for simultaneous multi-element analysis. This technique has been studied and applied to analyze major, minor and trace elements in geological, biological and environmental samples at the Dalat Nuclear Research Reactor. At the sixth Workshop, February 8-11, 1999, in Yogyakarta, Indonesia, we reported on the current status of neutron activation analysis using the Dalat Nuclear Research Reactor. Another report on neutron activation analysis at the Dalat Nuclear Research Reactor was presented at the seventh Workshop in Taejon, Korea, from November 20-24, 2000. In this report, we present the results obtained from the application of NAA at NRI over one year, as follows: (1) determination of the concentrations of noble, rare earth, uranium, thorium and other elements in geological samples according to the requirements of clients, particularly geologists who want to locate mineral resources; (2) analysis of the concentrations of radionuclides and nutrient elements in foodstuffs for the program on the Asian Reference Man; (3) evaluation of the contents of trace elements in crude oil and basement rock samples to determine the original source of the oil; (4) determination of the elemental composition of airborne particles in Ho Chi Minh City for studying air pollution. Analytical data for standard reference materials and for toxic elements and natural radionuclides in seawater are also presented. (author)

  17. Assessing Progress and Pitfalls of the Millennium Development Goals in Zimbabwe: A Critical Analysis

    Directory of Open Access Journals (Sweden)

    Shepherd Mutangabende

    2016-12-01

    Full Text Available Zimbabwe adopted the Millennium Development Goals (MDGs) at their inception in 2000, and its progress in attempting to attain these MDGs is indicated in progress reports from 2004, 2010, 2012 and 2015. In these reports, optimistic trends are chiefly found in MDG2 on universal primary education, which is Zimbabwe's pride in Africa, MDG3 regarding gender parity in schools, and MDG6 on HIV and AIDS. The country continued to face its biggest challenges in attaining MDG1, eliminating extreme poverty and hunger, and MDG5, reducing maternal mortality, and it was doubtful that the objectives under these goals would be attained by the cut-off date. Unfortunately, the inception of the MDGs coincided with a deepening socioeconomic, political and environmental crisis in the country, which made it very difficult for Zimbabwe to accomplish all of its MDGs. The main aim of this study was to assess the progress, policies, programmes and strategies that were in place to promote the attainment of the MDGs from 2000-2015, and the strategies and policies now in place to attain the SDGs from 2016-2030. This paper recommends the institutionalisation of the SDGs, that is, aligning them with the Zimbabwe Agenda for Sustainable Socioeconomic Transformation (Zim-Asset) clusters, for instance value addition and beneficiation, food security, poverty eradication, social services, and strengthening partnerships with all stakeholders. The research uses intensive secondary data analysis from various sources including government gazettes, journal articles, e-books, government websites, reports, and published and unpublished books.

  18. Antioxidant agents for delaying diabetic kidney disease progression: A systematic review and meta-analysis.

    Directory of Open Access Journals (Sweden)

    Davide Bolignano

    Full Text Available Oxidative stress is a key player in the genesis and worsening of diabetic kidney disease (DKD. We aimed at collecting all available information on possible benefits of chronic antioxidant supplementations on DKD progression.Systematic review and meta-analysis.Adults with DKD (either secondary to type 1 or 2 diabetes mellitus.Cochrane CENTRAL, Ovid-MEDLINE and PubMed were searched for randomized controlled trials (RCTs or quasi-RCTs without language or follow-up restriction.Any antioxidant supplementation (including but not limited to vitamin A, vitamin C, vitamin E, selenium, zinc, methionine or ubiquinone alone or in combination.Primary outcome was progression to end-stage kidney disease (ESKD. Secondary outcomes were change in albuminuria, proteinuria, serum creatinine and renal function.From 13519 potentially relevant citations retrieved, 15 articles referring to 14 full studies (4345 participants met the inclusion criteria. Antioxidant treatment significantly decreased albuminuria as compared to control (8 studies, 327 participants; SMD: -0.47; 95% CI -0.78, -0.16 but had apparently no tangible effects on renal function (GFR (3 studies, 85 participants; MD -0.12 ml/min/1.73m2; 95% CI -0.06, 0.01. Evidence of benefits on the other outcomes of interest was inconclusive or lacking.Small sample size and limited number of studies. Scarce information available on hard endpoints (ESKD. High heterogeneity among studies with respect to DKD severity, type and duration of antioxidant therapy.In DKD patients, antioxidants may improve early renal damage. Future studies targeting hard endpoints and with longer follow-up and larger sample size are needed to confirm the usefulness of these agents for retarding DKD progression.

  19. NMR-based lipidomic analysis of blood lipoproteins differentiates the progression of coronary heart disease.

    Science.gov (United States)

    Kostara, Christina E; Papathanasiou, Athanasios; Psychogios, Nikolaos; Cung, Manh Thong; Elisaf, Moses S; Goudevenos, John; Bairaktari, Eleni T

    2014-05-02

    Abnormal lipid composition and metabolism of plasma lipoproteins play a crucial role in the pathogenesis of coronary heart disease (CHD). A (1)H NMR-based lipidomic approach was used to investigate the correlation of coronary artery stenosis with the atherogenic (non-HDL) and atheroprotective (HDL) lipid profiles in 99 patients with CHD of various stages of disease and compared with 60 patients with normal coronary arteries (NCA), all documented in coronary angiography. The pattern recognition models created from lipid profiles predicted the presence of CHD with a sensitivity of 87% and a specificity of 88% in the HDL model and with 90% and 89% in the non-HDL model, respectively. Patients with mild, moderate, and severe coronary artery stenosis were progressively differentiated from those with NCA in the non-HDL model with a statistically significant separation of severe stage from both mild and moderate. In the HDL model, the progressive differentiation of the disease stages was statistically significant only between patients with mild and severe coronary artery stenosis. The lipid constituents of lipoproteins that mainly characterized the initial stages and then the progression of the disease were the high levels of saturated fatty acids in lipids in both HDL and non-HDL particles, the low levels of HDL-phosphatidylcholine, HDL-sphingomyelin, and omega-3 fatty acids and linoleic acid in lipids in non-HDL particles. The conventional lipid marker, total cholesterol, found in low levels in HDL and in high levels in non-HDL, also contributed to the onset of the disease but with a much lower coefficient of significance. (1)H NMR-based lipidomic analysis of atherogenic and atheroprotective lipoproteins could contribute to the early evaluation of the onset of coronary artery disease and possibly to the establishment of an appropriate therapeutic option.

  20. Improved Tandem Measurement Techniques for Aerosol Particle Analysis

    Science.gov (United States)

    Rawat, Vivek Kumar

    Non-spherical, chemically inhomogeneous (complex) nanoparticles are encountered in a number of natural and engineered environments, including combustion systems (which produces highly non-spherical aggregates), reactors used in gas-phase materials synthesis of doped or multicomponent materials, and in ambient air. These nanoparticles are often highly diverse in size, composition and shape, and hence require determination of property distribution functions for accurate characterization. This thesis focuses on development of tandem mobility-mass measurement techniques coupled with appropriate data inversion routines to facilitate measurement of two dimensional size-mass distribution functions while correcting for the non-idealities of the instruments. Chapter 1 provides the detailed background and motivation for the studies performed in this thesis. In chapter 2, the development of an inversion routine is described which is employed to determine two dimensional size-mass distribution functions from Differential Mobility Analyzer-Aerosol Particle Mass analyzer tandem measurements. Chapter 3 demonstrates the application of the two dimensional distribution function to compute cumulative mass distribution function and also evaluates the validity of this technique by comparing the calculated total mass concentrations to measured values for a variety of aerosols. In Chapter 4, this tandem measurement technique with the inversion routine is employed to analyze colloidal suspensions. Chapter 5 focuses on application of a transverse modulation ion mobility spectrometer coupled with a mass spectrometer to study the effect of vapor dopants on the mobility shifts of sub 2 nm peptide ion clusters. These mobility shifts are then compared to models based on vapor uptake theories. Finally, in Chapter 6, a conclusion of all the studies performed in this thesis is provided and future avenues of research are discussed.

  1. Techniques for the thermal/hydraulic analysis of LMFBR check valves

    International Nuclear Information System (INIS)

    Cho, S.M.; Kane, R.S.

    1979-01-01

    A thermal/hydraulic analysis of the check valves in liquid sodium service for LMFBR plants is required to provide temperature data for thermal stress analysis of the valves for specified transient conditions. Because of the complex three-dimensional flow pattern within the valve, the heat transfer analysis techniques for less complicated shapes could not be used. This paper discusses the thermal analysis techniques used to assure that the valve stress analysis is conservative. These techniques include a method for evaluating the recirculating flow patterns and for selecting appropriately conservative heat transfer correlations in various regions of the valve

  2. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    Science.gov (United States)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history results of an aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated by coupling the estimated unsteady aerodynamic model with the known linear structural model. The critical dynamic pressure is computed and used in the subsequent simulation until it converges. The proposed method is applied to a benchmark cantilevered rectangular wing.

  3. Analysis of Biomechanical Structure and Passing Techniques in Basketball

    OpenAIRE

    Ricardo E. Izzo; Luca Russo

    2011-01-01

    Basketball is a complex sport which these days has become increasingly linked to its psychophysical aspects rather than to the technical ones. Therefore, it is important to make a thorough study of passing techniques from the point of view of the type of pass and its biomechanics. In terms of the types of passes used, the most frequent is the two-handed chest pass with a frequency of 39.9%. This is followed, in terms of frequency, by one-handed passes – the baseball...

  4. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  5. Demonstration of innovative techniques for work zone safety data analysis

    Science.gov (United States)

    2009-07-15

    Based upon the results of the simulator data analysis, additional future research can be identified to validate the driving simulator in terms of similarities with Ohio work zones. For instance, the speeds observed in the simulator were greater f...

  6. 48 CFR 815.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... CONTRACTING METHODS AND CONTRACT TYPES CONTRACTING BY NEGOTIATION Contract Pricing 815.404-1 Proposal analysis... necessary for initial and revised pricing of all negotiated prime contracts, including subcontract pricing...

  7. Risk analysis of geothermal power plants using Failure Modes and Effects Analysis (FMEA) technique

    International Nuclear Information System (INIS)

    Feili, Hamid Reza; Akar, Navid; Lotfizadeh, Hossein; Bairampour, Mohammad; Nasiri, Sina

    2013-01-01

    Highlights: • Using Failure Modes and Effects Analysis (FMEA) to find potential failures in geothermal power plants. • We considered 5 major parts of geothermal power plants for risk analysis. • Risk Priority Number (RPN) is calculated for all failure modes. • Corrective actions are recommended to eliminate or decrease the risk of failure modes. - Abstract: Renewable energy plays a key role in the transition toward a low carbon economy and the provision of a secure supply of energy. Geothermal energy is a versatile form of renewable energy that meets popular demand. Since geothermal power plants (GPPs) face various failures, a team engineering technique for eliminating or decreasing potential failures is needed. Because no published record of an FMEA applied to GPPs with common failure modes has been found, this paper considers the utilization of Failure Modes and Effects Analysis (FMEA) as a convenient technique for determining, classifying and analyzing common failures in typical GPPs. As a result, an appropriate risk scoring of the occurrence, detection and severity of failure modes and the computation of the Risk Priority Number (RPN) for detecting high-potential failures are achieved. In order to improve the accuracy and efficiency of the analysis, the XFMEA software is utilized. Moreover, 5 major parts of a GPP are studied to propose a suitable approach for developing GPPs and increasing reliability by recommending corrective actions for each failure mode
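
    The Risk Priority Number at the core of the FMEA ranking is simply the product of the severity, occurrence and detection scores assigned to each failure mode. The sketch below ranks a few hypothetical geothermal plant failure modes by RPN; the items and 1-10 scores are illustrative only and are not taken from the study.

        # Risk Priority Number = severity x occurrence x detection (each scored 1-10).
        # Hypothetical geothermal power plant failure modes, for illustration only.
        failure_modes = [
            {"item": "well pump",        "mode": "impeller erosion", "S": 7, "O": 5, "D": 4},
            {"item": "heat exchanger",   "mode": "scaling/fouling",  "S": 6, "O": 7, "D": 3},
            {"item": "steam turbine",    "mode": "blade corrosion",  "S": 9, "O": 3, "D": 5},
            {"item": "reinjection line", "mode": "pipe leakage",     "S": 5, "O": 4, "D": 6},
        ]
        for fm in failure_modes:
            fm["RPN"] = fm["S"] * fm["O"] * fm["D"]

        # Rank failure modes by RPN so corrective actions target the highest risks first
        for fm in sorted(failure_modes, key=lambda f: f["RPN"], reverse=True):
            print(f'{fm["item"]:15s} {fm["mode"]:18s} RPN = {fm["RPN"]}')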

  8. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made in recent years in the field of electrophysiological data analysis. Most of the work has been done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) ...

  9. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente

    2005-01-01

    aqueous selenium standards were separated within 1.2 min on a 1.00 id x 50 mm reversed phase column in an ion-pair chromatographic system using a flow rate of 200 mu L min(-1). Hence, analysis times could be reduced to 1/10 compared with ordinary HPLC for aqueous standards. The precision and detection...... the use of short columns. Hence, analysis times could be halved without loss of separation efficiency in this biological sample...

  10. First characterization of the expiratory flow increase technique: method development and results analysis

    International Nuclear Information System (INIS)

    Maréchal, L; Barthod, C; Jeulin, J C

    2009-01-01

    This study provides an important contribution to the definition of the expiratory flow increase technique (EFIT). So far, no measuring means were suited to assess the manual EFIT performed on infants. The proposed method aims at objectively defining the EFIT based on the quantification of pertinent cognitive parameters used by physiotherapists when practicing. We designed and realized customized instrumented gloves endowed with pressure and displacement sensors, and the associated electronics and software. This new system is specific to the manoeuvre, to the user and innocuous for the patient. Data were collected and analysed on infants with bronchiolitis managed by an expert physiotherapist. The analysis presented is realized on a group of seven subjects (mean age: 6.1 months, SD: 1.1; mean chest circumference: 44.8 cm, SD: 1.9). The results are consistent with the physiotherapist's tactility. In spite of inevitable variability due to measurements on infants, repeatable quantitative data could be reported regarding the manoeuvre characteristics: the magnitudes of displacements do not exceed 10 mm on both hands; the movement of the thoracic hand is more vertical than the movement of the abdominal hand; the maximum applied pressure with the thoracic hand is about twice higher than with the abdominal hand; the thrust of the manual compression lasts (590 ± 62) ms. Inter-operators measurements are in progress in order to generalize these results

  11. Second year progress report on the co-ordinated research programme on signal processing techniques for sodium boiling noise detection

    International Nuclear Information System (INIS)

    Singh, O.P.; Harish, R.; Prabhakar, R.; Reddy, C.P.; Srinivasan, G.S.; Vyjayanthi, R.K.

    1989-01-01

    The present paper deals with the second stage of investigations of acoustic signals from a boiling experiment performed on the KNS I loop at KfK Karlsruhe and with the first results of the analysis of data from a series of boiling experiments carried out in the BOR 60 reactor in the USSR. Signals have been analysed in the frequency domain as well as in the time domain. Signal characteristics that can be used to successfully detect the boiling process have been found in the time domain. A proposal for in-service boiling monitoring by acoustic means is briefly described. (author). 9 refs, 22 figs, 19 tabs

  12. Software failure events derivation and analysis by frame-based technique

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih, C.; Yih, Swu; Chen, M.-H.

    2007-01-01

    A frame-based technique, comprising a physical frame, a logical frame, and a cognitive frame, was adopted to perform digital I and C failure event derivation and analysis for the generic ABWR. The physical frame was structured with a modified PCTran-ABWR plant simulation code, which was extended and enhanced for the feedwater system, recirculation system, and steam line system. The logical frame is structured with MATLAB, which was incorporated into PCTran-ABWR to improve the pressure control system, feedwater control system, recirculation control system, and automated power regulation control system. As a result, software failures of these digital control systems can be properly simulated and analyzed. The cognitive frame was simulated by the operator awareness status in the scenarios. Moreover, via an internal characteristics tuning technique, the modified PCTran-ABWR can precisely reflect the power-core flow characteristics. Hence, in addition to the transient plots, the analysis results can be demonstrated on the power-core flow map. A number of postulated I and C system software failure events were derived for the dynamic analyses. The basis for event derivation includes the published classification of software anomalies, the digital I and C design data for the ABWR, the chapter 15 accident analysis of the generic SAR, and reported NPP I and C software failure events. The case study of this research includes: (1) software CMF analysis for the major digital control systems; and (2) derivation of postulated ABWR digital I and C software failure events from actual non-ABWR digital I and C software failure events, which were reported to the LER of the USNRC or the IRS of the IAEA. These events were analyzed with PCTran-ABWR. Conflicts among plant status, computer status, and human cognitive status are successfully identified. The operator might not easily recognize the abnormal condition, because the computer status seems to progress normally. However, a well

  13. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, the European Organisation for Nuclear Research, is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring, and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume large amounts of electrical power, so the hazards involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  14. A Comparison of seismic instrument noise coherence analysis techniques

    Science.gov (United States)

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
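
    A common two-sensor version of the coherence approach attributes the portion of one sensor's power spectrum that is incoherent with a co-located sensor to instrument self-noise, i.e. (1 - coherence) times the power spectral density. The sketch below illustrates that estimate on synthetic records using scipy.signal; the signal model, sampling rate and segment length are assumptions, and the paper's actual comparison involves more careful test configurations and corrections.

        import numpy as np
        from scipy import signal

        def incoherent_noise(x, y, fs, nperseg=4096):
            # Two-sensor estimate: the portion of sensor x's power spectrum that is
            # not coherent with co-located sensor y is attributed to self-noise.
            f, pxx = signal.welch(x, fs=fs, nperseg=nperseg)
            _, cxy = signal.coherence(x, y, fs=fs, nperseg=nperseg)
            return f, (1.0 - cxy) * pxx

        # Hypothetical records: a common ground signal plus independent instrument noise
        fs, n = 100.0, 2 ** 16
        rng = np.random.default_rng(0)
        ground = np.cumsum(rng.normal(size=n))      # stand-in for a common input signal
        x = ground + 0.05 * rng.normal(size=n)      # sensor 1 = signal + self-noise
        y = ground + 0.05 * rng.normal(size=n)      # sensor 2 = signal + self-noise
        f, nxx = incoherent_noise(x, y, fs)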

  15. Advances in dynamic relaxation techniques for nonlinear finite element analysis

    International Nuclear Information System (INIS)

    Sauve, R.G.; Metzger, D.R.

    1995-01-01

    Traditionally, the finite element technique has been applied to static and steady-state problems using implicit methods. When nonlinearities exist, equilibrium iterations must be performed using Newton-Raphson or quasi-Newton techniques at each load level. In the presence of complex geometry, nonlinear material behavior, and large relative sliding of material interfaces, solutions using implicit methods often become intractable. A dynamic relaxation algorithm is developed for inclusion in finite element codes. The explicit nature of the method avoids large computer memory requirements and makes possible the solution of large-scale problems. The method described approaches the steady-state solution with no overshoot, a problem which has plagued researchers in the past. The method is included in a general nonlinear finite element code. A description of the method along with a number of new applications involving geometric and material nonlinearities are presented. They include: (1) nonlinear geometric cantilever plate; (2) moment-loaded nonlinear beam; and (3) creep of nuclear fuel channel assemblies
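
    The essence of dynamic relaxation is to reach the static solution of a nonlinear system by explicit pseudo-dynamic time stepping with fictitious mass and damping until the residual decays. A single-degree-of-freedom sketch with a hardening spring is given below; the fictitious parameters and the simple viscous damping used here (rather than the overshoot-free scheme described in the paper) are illustrative choices.

        import numpy as np

        def dynamic_relaxation(internal_force, f_ext, n_dof,
                               mass=1.0, damping=1.0, dt=0.01,
                               tol=1e-6, max_steps=100000):
            # Explicit pseudo-dynamic iteration toward the static equilibrium
            # internal_force(u) = f_ext. Mass, damping and time step are fictitious
            # numerical parameters chosen for convergence, not physical quantities.
            u = np.zeros(n_dof)
            v = np.zeros(n_dof)
            for _ in range(max_steps):
                residual = f_ext - internal_force(u)
                if np.linalg.norm(residual) < tol:
                    break
                a = (residual - damping * v) / mass   # fictitious equation of motion
                v += dt * a
                u += dt * v
            return u

        # Hypothetical single-DOF hardening spring: f_int(u) = k*u + k3*u**3
        k, k3 = 100.0, 500.0
        u_eq = dynamic_relaxation(lambda u: k * u + k3 * u ** 3,
                                  f_ext=np.array([150.0]), n_dof=1, damping=40.0)
        print("converged displacement:", u_eq)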

  16. Nigerian coal analysis by PIXE and HEBS techniques

    International Nuclear Information System (INIS)

    Olabanji, S.O.

    1989-05-01

    PIXE and HEBS techniques were employed for the measurement of the concentrations of the major, minor and trace elements in Nigerian coal samples from a major deposit. The samples were irradiated with 2.55 MeV protons from the 3 MeV tandem accelerator (NEC 3 UDH) in Lund. The PIXE results are reported and compared with an earlier work on Nigerian coal using FNAA and INAA analytical techniques while the HEBS results are compared with ASTM previous results. The results corroborate the assertion that Nigerian coals are of weak and noncoking grades with low sulphur (0.82-0.99%) and relatively high hydrogen (4.49-5.16%) contents. The motivation for this work is partly due to the projected usage of coal as metallurgical feedstocks and as fuel, and partly because of the genuine concern about the concomitant environmental effects of the increased burning of coal. The knowledge of the concentration of all elements is important for the characterization of coal and the determination and control of its products. Economic parameters such as the ash contents and calorific values are associated with the concentrations of coal's constituents. (author). 11 refs, 1 fig., 4 tabs

  17. Improved analysis techniques for cylindrical and spherical double probes

    Energy Technology Data Exchange (ETDEWEB)

    Beal, Brian; Brown, Daniel; Bromaghim, Daron [Air Force Research Laboratory, 1 Ara Rd., Edwards Air Force Base, California 93524 (United States); Johnson, Lee [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Dr., Pasadena, California 91109 (United States); Blakely, Joseph [ERC Inc., 1 Ara Rd., Edwards Air Force Base, California 93524 (United States)

    2012-07-15

    A versatile double Langmuir probe technique has been developed by incorporating analytical fits to Laframboise's numerical results for ion current collection by biased electrodes of various sizes relative to the local electron Debye length. Application of these fits to the double probe circuit has produced a set of coupled equations that express the potential of each electrode relative to the plasma potential as well as the resulting probe current as a function of applied probe voltage. These equations can be readily solved via standard numerical techniques in order to determine electron temperature and plasma density from probe current and voltage measurements. Because this method self-consistently accounts for the effects of sheath expansion, it can be readily applied to plasmas with a wide range of densities and low ion temperature (T{sub i}/T{sub e} << 1) without requiring probe dimensions to be asymptotically large or small with respect to the electron Debye length. The presented approach has been successfully applied to experimental measurements obtained in the plume of a low-power Hall thruster, which produced a quasineutral, flowing xenon plasma during operation at 200 W on xenon. The measured plasma densities and electron temperatures were in the range of 1 x 10{sup 12}-1 x 10{sup 17} m{sup -3} and 0.5-5.0 eV, respectively. The estimated measurement uncertainty is +6%/-34% in density and +/-30% in electron temperature.
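
    The record above relies on analytical fits to Laframboise's results; as a simpler illustration of how electron temperature and density are extracted from a double-probe characteristic, the sketch below fits the classical symmetric double-probe relation I(V) = I_sat tanh(V/(2Te)), with Te in eV, plus a linear sheath-expansion term to synthetic data, and converts I_sat to density with the Bohm relation. The probe area, ion mass and all data values are illustrative assumptions, and the Laframboise corrections are omitted.

        import numpy as np
        from scipy.optimize import curve_fit

        E_CHARGE = 1.602e-19  # elementary charge [C]

        def double_probe_current(v, i_sat, te_ev, slope):
            # Classical symmetric double-probe characteristic with a linear term
            # approximating sheath-expansion growth of the ion saturation current.
            return i_sat * np.tanh(v / (2.0 * te_ev)) + slope * v

        # Hypothetical measured characteristic (volts, amps)
        v = np.linspace(-30, 30, 121)
        rng = np.random.default_rng(0)
        i_meas = 2e-6 * np.tanh(v / (2 * 2.5)) + 5e-9 * v + rng.normal(0, 5e-8, v.size)

        popt, _ = curve_fit(double_probe_current, v, i_meas, p0=[1e-6, 2.0, 1e-9])
        i_sat, te_ev, _ = popt
        print(f"Te ~ {te_ev:.2f} eV, ion saturation current ~ {i_sat:.2e} A")

        # Density estimate from the Bohm relation, I_sat = 0.61 n e A_p sqrt(e Te / M)
        A_p, M_xe = 1e-5, 2.18e-25   # probe area [m^2] and xenon ion mass [kg], illustrative
        n_e = i_sat / (0.61 * E_CHARGE * A_p * np.sqrt(E_CHARGE * te_ev / M_xe))
        print(f"n_e ~ {n_e:.2e} m^-3")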

  18. Characteristic vector analysis of inflection ratio spectra: New technique for analysis of ocean color data

    Science.gov (United States)

    Grew, G. W.

    1985-01-01

    Characteristic vector analysis applied to inflection ratio spectra is a new approach to analyzing spectral data. The technique, applied to remote data collected with the multichannel ocean color sensor (MOCS), a passive sensor, simultaneously maps the distribution of two different phytopigments, chlorophyll a and phycoerythrin, in the ocean. The data set presented is from a series of warm core ring missions conducted during 1982. The data compare favorably with a theoretical model and with data collected on the same mission by an active sensor, the airborne oceanographic lidar (AOL).

  19. Identifying Indicators of Progress in Thermal Spray Research Using Bibliometrics Analysis

    Science.gov (United States)

    Li, R.-T.; Khor, K. A.; Yu, L.-G.

    2016-12-01

    We investigated the research publications on thermal spray in the period 1985-2015 using data from Web of Science, Scopus and SciVal®. Bibliometric analysis was employed to elucidate the country and institution distribution in various thermal spray research areas and to characterize the trends of topic change and technology progress. Results show that China, USA, Japan, Germany, India and France were the top countries in thermal spray research, and Xi'an Jiaotong University, Universite de Technologie Belfort-Montbeliard, Shanghai Institute of Ceramics, ETH Zurich, the National Research Council of Canada, and the University of Limoges were among the top institutions with high scholarly research output during 2005-2015. The terms of the titles, keywords and abstracts of the publications were analyzed by the Latent Dirichlet Allocation model and visually mapped using the VOSviewer software to reveal the progress of thermal spray technology. It is found that thermal barrier coatings were consistently the main research area in thermal spray, and that high-velocity oxy-fuel spray and cold spray developed rapidly in the last 10 years.
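
    Topic extraction of the kind described (Latent Dirichlet Allocation over titles, keywords and abstracts) can be sketched with scikit-learn as below; the toy corpus, number of topics and vectorizer settings are illustrative assumptions, and the paper additionally visualized term maps with VOSviewer.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Toy stand-in for thermal spray titles/abstracts (illustrative only)
        docs = [
            "thermal barrier coating degradation under thermal cycling",
            "cold spray deposition of copper coatings on aluminium substrates",
            "high velocity oxy-fuel sprayed WC-Co wear resistant coatings",
            "plasma sprayed yttria stabilized zirconia thermal barrier coatings",
            "bonding mechanism of particles in cold spray at high impact velocity",
        ]

        vec = CountVectorizer(stop_words="english")
        X = vec.fit_transform(docs)

        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        lda.fit(X)

        # Print the top terms of each extracted topic
        terms = vec.get_feature_names_out()
        for topic_idx, weights in enumerate(lda.components_):
            top = [terms[i] for i in weights.argsort()[-5:][::-1]]
            print(f"topic {topic_idx}: {', '.join(top)}")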

  20. Does colon cancer ever metastasize to bone first? a temporal analysis of colorectal cancer progression

    International Nuclear Information System (INIS)

    Roth, Eira S; Fetzer, David T; Barron, Bruce J; Joseph, Usha A; Gayed, Isis W; Wan, David Q

    2009-01-01

    It is well recognized that colorectal cancer does not frequently metastasize to bone. The aim of this retrospective study was to establish whether colorectal cancer ever bypasses other organs and metastasizes directly to bone, and whether the presence of lung lesions is a better predictor than liver lesions of the likelihood and timing of bone metastasis. We performed a retrospective analysis on patients with a clinical diagnosis of colon cancer referred for staging using whole-body 18F-FDG PET and CT or PET/CT. We combined PET and CT reports from 252 individuals with information concerning patient history, other imaging modalities, and treatments to analyze disease progression. No patient had isolated osseous metastasis at the time of diagnosis, and none developed isolated bone metastasis without other organ involvement during our survey period. It took significantly longer for colorectal cancer patients to develop metastasis to the lungs (23.3 months) or to bone (21.2 months) than to the liver (9.8 months). Conclusion: metastasis only to bone without other organ involvement in colorectal cancer patients is extremely rare, perhaps more rare than we previously thought. Our findings suggest that resistant metastasis to the lungs predicts potential disease progression to bone in the colorectal cancer population better than liver metastasis does